Sample records for agri-food-based methods

  1. RFID Application Strategy in Agri-Food Supply Chain Based on Safety and Benefit Analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Min; Li, Peichong

    Agri-food supply chain management (SCM), a management method to optimize internal costs and productivity, has evolved as an application of e-business technologies. RFID is now widely used in many fields. In this paper, we analyze the characteristics of the agri-food supply chain and then discuss the disadvantages of RFID. Finally, we study application strategies for RFID based on benefit and safety degree.

  2. The application of knowledge synthesis methods in agri-food public health: recent advancements, challenges and opportunities.

    PubMed

    Young, Ian; Waddell, Lisa; Sanchez, Javier; Wilhelm, Barbara; McEwen, Scott A; Rajić, Andrijana

    2014-03-01

    Knowledge synthesis refers to the integration of findings from individual research studies on a given topic or question into the global knowledge base. The application of knowledge synthesis methods, particularly systematic reviews and meta-analysis, has increased considerably in the agri-food public health sector over the past decade and this trend is expected to continue. The objectives of our review were: (1) to describe the most promising knowledge synthesis methods and their applicability in agri-food public health, and (2) to summarize the recent advancements, challenges, and opportunities in the use of systematic review and meta-analysis methods in this sector. We performed a structured review of knowledge synthesis literature from various disciplines to address the first objective, and used comprehensive insights and experiences in applying these methods in the agri-food public health sector to inform the second objective. We describe five knowledge synthesis methods that can be used to address various agri-food public health questions or topics under different conditions and contexts. Scoping reviews describe the main characteristics and knowledge gaps in a broad research field and can be used to evaluate opportunities for prioritizing focused questions for related systematic reviews. Structured rapid reviews are streamlined systematic reviews conducted within a short timeframe to inform urgent decision-making. Mixed-method and qualitative reviews synthesize diverse sources of contextual knowledge (e.g. socio-cognitive, economic, and feasibility considerations). Systematic reviews are a structured and transparent method used to summarize and synthesize literature on a clearly-defined question, and meta-analysis is the statistical combination of data from multiple individual studies. We briefly describe and discuss key advancements in the use of systematic reviews and meta-analysis, including: risk-of-bias assessments; an overall quality
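
    The abstract describes meta-analysis as the statistical combination of data from multiple individual studies. The sketch below is a minimal illustration of fixed-effect, inverse-variance pooling of study estimates; the effect sizes and standard errors are hypothetical and are not taken from the review.

    ```python
    # Illustrative fixed-effect inverse-variance meta-analysis (hypothetical data):
    # each study contributes an effect estimate and its standard error; the pooled
    # estimate weights studies by 1 / SE^2.
    import math

    studies = [  # (effect estimate, standard error) -- hypothetical values
        (0.42, 0.10),
        (0.31, 0.15),
        (0.55, 0.08),
    ]

    weights = [1.0 / se ** 2 for _, se in studies]
    pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))

    print(f"pooled effect = {pooled:.3f}, 95% CI = "
          f"({pooled - 1.96 * pooled_se:.3f}, {pooled + 1.96 * pooled_se:.3f})")
    ```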

  3. Mixed biodiversity benefits of agri-environment schemes in five European countries.

    PubMed

    Kleijn, D; Baquero, R A; Clough, Y; Díaz, M; De Esteban, J; Fernández, F; Gabriel, D; Herzog, F; Holzschuh, A; Jöhl, R; Knop, E; Kruess, A; Marshall, E J P; Steffan-Dewenter, I; Tscharntke, T; Verhulst, J; West, T M; Yela, J L

    2006-03-01

    Agri-environment schemes are an increasingly important tool for the maintenance and restoration of farmland biodiversity in Europe but their ecological effects are poorly known. Scheme design is partly based on non-ecological considerations and poses important restrictions on evaluation studies. We describe a robust approach to evaluate agri-environment schemes and use it to evaluate the biodiversity effects of agri-environment schemes in five European countries. We compared species density of vascular plants, birds, bees, grasshoppers and crickets, and spiders on 202 paired fields, one with an agri-environment scheme, the other conventionally managed. In all countries, agri-environment schemes had marginal to moderately positive effects on biodiversity. However, uncommon species benefited in only two of five countries and species listed in Red Data Books rarely benefited from agri-environment schemes. Scheme objectives may need to differentiate between biodiversity of common species that can be enhanced with relatively simple modifications in farming practices and diversity or abundance of endangered species which require more elaborate conservation measures.

  4. Sen2-Agri country level demonstration for Ukraine

    NASA Astrophysics Data System (ADS)

    Kussul, N.; Kolotii, A.; Shelestov, A.; Lavreniuk, M. S.

    2016-12-01

    Following the launch of the Sentinel-2 mission, the European Space Agency (ESA) started the Sentinel-2 for Agriculture (Sen2-Agri) project, coordinated by the Universite catholique de Louvain (UCL). Ukraine was selected as one of three country-level demonstration sites for benchmarking Sentinel-2 data because of its wide range of main crops (both winter and summer), large fields and sufficiently high climatic variability over the territory [1-2]. The main objectives of this country-level demonstration are: i) quality assessment of Sentinel products and estimation of their suitability for the territory of Ukraine [2]; ii) demonstration aimed at convincing decision makers and state authorities; iii) assessment of the personnel and facilities required to run the Sen2-Agri system and creation of Sen2-Agri products (crop type maps and such an essential climate variable as Leaf Area Index, LAI [3]). During the project, ground data were collected for cropland mapping and crop type classification along roads within the main agro-climatic zones of Ukraine. For LAI estimation we used an indirect, non-destructive method based on DHP images and the VALERI protocol. Products created with the Sen2-Agri system deployed during project execution will be compared with the results of a neural-network approach. References: Kussul, N., Lemoine, G., Gallego, F. J., Skakun, S. V., Lavreniuk, M., & Shelestov, A. Y. Parcel-Based Crop Classification in Ukraine Using Landsat-8 Data and Sentinel-1A Data. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 9(6), 2500-2508. Kussul, N., Skakun, S., Shelestov, A., Lavreniuk, M., Yailymov, B., & Kussul, O. (2015). Regional scale crop mapping using multi-temporal satellite imagery. The International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, 40(7), 45-52. Shelestov, A., Kolotii, A., Camacho, F., Skakun, S., Kussul, O., Lavreniuk, M., & Kostetsky, O. (2015, July). Mapping of biophysical parameters based on high
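
    The abstract mentions crop type classification with a neural-network approach. The sketch below is a generic illustration of that idea only: a small multilayer perceptron trained on synthetic multi-temporal vegetation-index features. It is not the Sen2-Agri processing chain, and all data, class labels and parameters are hypothetical.

    ```python
    # Generic crop-type classification sketch (synthetic data, hypothetical classes);
    # each row is a parcel's multi-temporal vegetation-index profile.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    n_parcels, n_dates = 300, 10           # hypothetical parcels x acquisition dates
    X = rng.random((n_parcels, n_dates))   # stand-in for NDVI time series per parcel
    y = rng.integers(0, 3, n_parcels)      # 3 hypothetical crop classes

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    clf.fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    ```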

  5. Researches in agri-food supply chain: A bibliometric study

    NASA Astrophysics Data System (ADS)

    Hisjam, Muhammad; Sutopo, Wahyudi

    2017-11-01

    Agri-food is vital to human well-being, yet managing agri-food systems is complicated: many entities are involved, and conflicts of interest between them make the problems even harder. Applying supply chain approaches to agri-food helps address these problems. The purpose of this paper is to show that publication in the agri-food supply chain research area is still promising and to identify the research trends in this field. The study was a bibliometric analysis using queries on the website with the largest database of peer-reviewed literature, applying various categories and refinements. The study first explored all publications in this research area across several categories and then divided the period into two intervals; a final query determined how many publications are of the review type. The results show that the number of publications on agri-food supply chain topics is still limited but tends to increase, which means research in this area is still promising. The results also show which source titles, countries, and affiliations contribute the most publications, as well as the research trends in this area. Review-type publications on the agri-food supply chain are still few, which indicates the need for more reviews in this area.

  6. Fort Benton Agri-Industry Curriculum Outline.

    ERIC Educational Resources Information Center

    Fort Benton Public Schools, MT.

    The agri-industry curriculum for the Fort Benton school system was developed with funds under Title III of the Elementary and Secondary Education Act as part of the vocational technology curricula to develop skills and attitudes that will permit students to find satisfaction and success in their careers. The curriculum consists of agri-industry…

  7. The Seven Challenges for Transitioning into a Bio-based Circular Economy in the Agri-food Sector.

    PubMed

    Borrello, Massimiliano; Lombardi, Alessia; Pascucci, Stefano; Cembalo, Luigi

    2016-01-01

    Closed-loop agri-food supply chains have a high potential to reduce environmental and economic costs resulting from food waste disposal. This paper illustrates an alternative to the traditional supply chain of bread based on the principles of a circular economy. Six circular interactions among seven actors (grain farmers, bread producers, retailers, compostable packaging manufacturers, insect breeders, livestock farmers, consumers) of the circular filière are created in order to achieve the goal of "zero waste". In the model, two radical technological innovations are considered: insects used as animal feed and polylactic acid compostable packaging. The main challenges for the implementation of the new supply chain are identified. Finally, some recent patents related to sustainable bread production, investigated in the current paper, are considered. Recommendations are given to academics and practitioners interested in the bio-based circular economy model approach for transforming agri-food supply chains.

  8. Building Alternative Agri-Food Networks: Certification, Embeddedness and Agri-Environmental Governance

    ERIC Educational Resources Information Center

    Higgins, Vaughan; Dibden, Jacqui; Cocklin, Chris

    2008-01-01

    This paper examines the role of certification in alternative agri-food networks (AAFNs), which are "in the process" of building markets for their produce outside conventional supply chains. Drawing upon recent writing on "embeddedness", we argue that certification provides an important focus for exploring the relationship and…

  9. Spatial targeting of agri-environmental policy using bilevel evolutionary optimization

    USDA-ARS?s Scientific Manuscript database

    In this study we describe the optimal designation of agri-environmental policy as a bilevel optimization problem and propose an integrated solution method using a hybrid genetic algorithm. The problem is characterized by a single leader, the agency, that establishes a policy with the goal of optimiz...

  10. Improving the utilization of research knowledge in agri-food public health: a mixed-method review of knowledge translation and transfer.

    PubMed

    Rajić, Andrijana; Young, Ian; McEwen, Scott A

    2013-05-01

    Knowledge translation and transfer (KTT) aims to increase research utilization and ensure that the best available knowledge is used to inform policy and practice. Many frameworks, methods, and terms are used to describe KTT, and the field has largely developed in the health sector over the past decade. There is a need to review key KTT principles and methods in different sectors and evaluate their potential application in agri-food public health. We conducted a structured mixed-method review of the KTT literature. From 827 citations identified in a comprehensive search, we characterized 160 relevant review articles, case studies, and reports. A thematic analysis was conducted on a prioritized and representative subset of 33 articles to identify key principles and characteristics for ensuring effective KTT. The review steps were conducted by two or more independent reviewers using structured and pretested forms. We identified five key principles for effective KTT that were described within two contexts: to improve research utilization in general and to inform policy-making. To ensure general research uptake, there is a need for the following: (1) relevant and credible research; (2) ongoing interactions between researchers and end-users; (3) organizational support and culture; and (4) monitoring and evaluation. To inform policy-making, (5) researchers must also address the multiple and competing contextual factors of the policy-making process. We also describe 23 recommended and promising KTT methods, including six synthesis (e.g., systematic reviews, mixed-method reviews, and rapid reviews); nine dissemination (e.g., evidence summaries, social media, and policy briefs); and eight exchange methods (e.g., communities of practice, knowledge brokering, and policy dialogues). A brief description, contextual example, and key references are provided for each method. We recommend a wider endorsement of KTT principles and methods in agri-food public health, but there are

  11. Creating Culturally Sustainable Agri-Environmental Schemes

    ERIC Educational Resources Information Center

    Burton, Rob J. F.; Paragahawewa, Upananda Herath

    2011-01-01

    Evidence is emerging from across Europe that contemporary agri-environmental schemes are having only limited, if any, influence on farmers' long-term attitudes towards the environment. In this theoretical paper we argue that these approaches are not "culturally sustainable," i.e. the actions are not becoming embedded within farming…

  12. Studies on Agri-environmental Measures: A Survey of the Literature

    NASA Astrophysics Data System (ADS)

    Uthes, Sandra; Matzdorf, Bettina

    2013-01-01

    Agri-environmental measures (AEM) are incentive-based instruments in the European Union (EU) that provide payments to farmers for voluntary environmental commitments related to preserving and enhancing the environment and maintaining the cultural landscape. We review the AEM literature and provide an overview of important research topics, major research results and future challenges as discussed in the available literature concerning these measures. This review contributes to the existing literature by attempting to equally consider ecological and economic perspectives. The reviewed articles are analyzed regarding their regional focus, topics and methods. The analytical section of the article seeks to discuss commonly asked questions about AEM on the basis of results from reviewed studies. The vast amount of available literature provides valuable insights into specific cases and reveals a complex picture with few general conclusions. The existing research is usually either biased toward ecological or economic perspectives and fails to provide a holistic picture of the problems and challenges within agri-environmental programming (e.g., multiple measures, multiple target areas, legal aspects, financial constraints, transaction costs). Most empirical studies provide detailed insights into selected individual measures but are incapable of providing results at a level relevant to decision-making, as they neglect the role of farmers and the available AEM budget. Predominantly economic approaches often only consider rough assumptions of ecological and economic processes and are also not suitable for decision-making. Decision-support tools that build on these disciplinary results and simultaneously consider scheme factors and environmental conditions at high spatial resolution for application by the responsible authorities are rare and require further research.

  13. Blurring the Boundaries between Vocational Education, Business and Research in the Agri-Food Domain

    ERIC Educational Resources Information Center

    Wals, Arjen E. J.; Lans, Thomas; Kupper, Hendrik

    2012-01-01

    This article discusses the emergence and significance of new knowledge configurations within the Dutch agri-food context. Knowledge configurations can be characterised as arrangements between VET and (often regional) partners in business and research aimed at improving knowledge transfer, circulation or co-creation. Based on a literature review…

  14. Nanotechnology in agri-food production: an overview

    PubMed Central

    Sekhon, Bhupinder Singh

    2014-01-01

    Nanotechnology is one of the most important tools in modern agriculture, and agri-food nanotechnology is anticipated to become a driving economic force in the near future. Agri-food themes focus on sustainability and protection of agriculturally produced foods, including crops for human consumption and animal feeding. Nanotechnology provides new agrochemical agents and new delivery mechanisms to improve crop productivity, and it promises to reduce pesticide use. Nanotechnology can boost agricultural production, and its applications include: 1) nanoformulations of agrochemicals for applying pesticides and fertilizers for crop improvement; 2) the application of nanosensors/nanobiosensors in crop protection for the identification of diseases and residues of agrochemicals; 3) nanodevices for the genetic manipulation of plants; 4) plant disease diagnostics; 5) animal health, animal breeding, poultry production; and 6) postharvest management. Precision farming techniques could be used to further improve crop yields without damaging soil and water, to reduce nitrogen loss due to leaching and emissions, and to enhance the long-term incorporation of nutrients by soil microorganisms. Nanotechnology uses include nanoparticle-mediated gene or DNA transfer in plants for the development of insect-resistant varieties, food processing and storage, nanofeed additives, and increased product shelf life. Nanotechnology promises to accelerate the development of biomass-to-fuels production technologies. Experts feel that the potential benefits of nanotechnology for agriculture, food, fisheries, and aquaculture need to be balanced against concerns for the soil, water, and environment and the occupational health of workers. Raising awareness of nanotechnology in the agri-food sector, including feed and food ingredients, intelligent packaging and quick-detection systems, is one of the keys to influencing consumer acceptance. On the basis of only a handful of toxicological studies, concerns have

  15. Identifying research advancements in supply chain risk management for Agri-food Industries: Literature review

    NASA Astrophysics Data System (ADS)

    Septiani, W.; Astuti, P.

    2017-12-01

    The agri-food supply chain has distinctive characteristics related to the raw materials it uses. Food supply chains carry a high risk of product damage and have therefore drawn considerable attention from supply chain management researchers. This research investigates the development of supply chain risk management (SCRM) research in agri-food industries. The review was carried out in systematic steps, from searching for relevant SCRM papers to reviewing the general SCRM framework and the agri-food SCRM framework. Literature published in the period 2005-2017 was selected, yielding 45 papers. The results of the identification were illustrated in a supply chain risk management framework model, providing insight into future research directions and needs.

  16. agriGO v2.0: a GO analysis toolkit for the agricultural community, 2017 update.

    PubMed

    Tian, Tian; Liu, Yue; Yan, Hengyu; You, Qi; Yi, Xin; Du, Zhou; Xu, Wenying; Su, Zhen

    2017-07-03

    The agriGO platform, which has been serving the scientific community for >10 years, specifically focuses on gene ontology (GO) enrichment analyses of plant and agricultural species. We continuously maintain and update the databases and accommodate the various requests of our global users. Here, we present our updated agriGO that has a largely expanded number of supporting species (394) and datatypes (865). In addition, a larger number of species have been classified into groups covering crops, vegetables, fish, birds and insects closely related to the agricultural community. We further improved the computational efficiency, including the batch analysis and P-value distribution (PVD), and the user-friendliness of the web pages. More visualization features were added to the platform, including SEACOMPARE (cross comparison of singular enrichment analysis), direct acyclic graph (DAG) and Scatter Plots, which can be merged by choosing any significant GO term. The updated platform agriGO v2.0 is now publicly accessible at http://systemsbiology.cau.edu.cn/agriGOv2/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
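
    agriGO's core operation is singular enrichment analysis (SEA) of GO terms. The sketch below shows only the generic statistic behind such a test (a one-sided Fisher's exact test for a single GO term, with hypothetical gene counts); it is not agriGO's own implementation or API.

    ```python
    # Illustrative singular enrichment test for one GO term (hypothetical counts;
    # the generic statistic behind SEA, not agriGO's own code).
    from scipy.stats import fisher_exact

    k = 30      # query genes annotated to the GO term
    n = 200     # query genes in total
    K = 500     # background genes annotated to the GO term
    N = 20000   # background genes in total

    table = [[k, n - k],
             [K - k, (N - n) - (K - k)]]
    odds_ratio, p_value = fisher_exact(table, alternative="greater")
    print(f"odds ratio = {odds_ratio:.2f}, enrichment P = {p_value:.2e}")
    ```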

  17. agriGO v2.0: a GO analysis toolkit for the agricultural community, 2017 update

    PubMed Central

    Tian, Tian; Liu, Yue; Yan, Hengyu; You, Qi; Yi, Xin; Du, Zhou

    2017-01-01

    The agriGO platform, which has been serving the scientific community for >10 years, specifically focuses on gene ontology (GO) enrichment analyses of plant and agricultural species. We continuously maintain and update the databases and accommodate the various requests of our global users. Here, we present our updated agriGO that has a largely expanded number of supporting species (394) and datatypes (865). In addition, a larger number of species have been classified into groups covering crops, vegetables, fish, birds and insects closely related to the agricultural community. We further improved the computational efficiency, including the batch analysis and P-value distribution (PVD), and the user-friendliness of the web pages. More visualization features were added to the platform, including SEACOMPARE (cross comparison of singular enrichment analysis), direct acyclic graph (DAG) and Scatter Plots, which can be merged by choosing any significant GO term. The updated platform agriGO v2.0 is now publicly accessible at http://systemsbiology.cau.edu.cn/agriGOv2/. PMID:28472432

  18. Connecting Payments for Ecosystem Services and Agri-Environment Regulation: An Analysis of the Welsh Glastir Scheme

    ERIC Educational Resources Information Center

    Wynne-Jones, Sophie

    2013-01-01

    Policy debates in the European Union have increasingly emphasised "Payments for Ecosystem Services" (PES) as a model for delivering agri-environmental objectives. This paper examines the Glastir scheme, introduced in Wales in 2009, as a notable attempt to move between long standing models of European agri-environment regulation and…

  19. Agri-Environmental Policy Measures in Israel: The Potential of Using Market-Oriented Instruments

    NASA Astrophysics Data System (ADS)

    Amdur, Liron; Bertke, Elke; Freese, Jan; Marggraf, Rainer

    2011-05-01

    This paper examines the possibilities of developing agri-environmental policy measures in Israel, focusing on market-oriented instruments. A conceptual framework for developing agri-environmental policy measures is presented, first in very broad lines (mandatory regulations, economic instruments and advisory measures) and subsequently focusing on economic instruments, and specifically, on market-oriented ones. Two criteria of choice between the measures are suggested: their contribution to improving the effectiveness of the policy; and the feasibility of their implementation. This is the framework used for analyzing agri-environmental measures in Israel. Israel currently implements a mix of mandatory regulations, economic instruments and advisory measures to promote the agri-environment. The use of additional economic instruments may improve the effectiveness of the policy. When comparing the effectiveness of various economic measures, we found that the feasibility of implementation of market-oriented instruments is greater, due to the Israeli public's preference for strengthening market orientation in the agricultural sector. Four market-oriented instruments were practiced in a pilot project conducted in an Israeli rural area. We found that in this case study, the institutional feasibility and acceptance by stakeholders were the major parameters influencing the implementation of the market-oriented instruments, whereas the instruments' contribution to enhancing the ecological or economic effectiveness were hardly considered by the stakeholders as arguments in favor of their use.

  20. Agri-environmental policy measures in Israel: the potential of using market-oriented instruments.

    PubMed

    Amdur, Liron; Bertke, Elke; Freese, Jan; Marggraf, Rainer

    2011-05-01

    This paper examines the possibilities of developing agri-environmental policy measures in Israel, focusing on market-oriented instruments. A conceptual framework for developing agri-environmental policy measures is presented, first in very broad lines (mandatory regulations, economic instruments and advisory measures) and subsequently focusing on economic instruments, and specifically, on market-oriented ones. Two criteria of choice between the measures are suggested: their contribution to improving the effectiveness of the policy; and the feasibility of their implementation. This is the framework used for analyzing agri-environmental measures in Israel. Israel currently implements a mix of mandatory regulations, economic instruments and advisory measures to promote the agri-environment. The use of additional economic instruments may improve the effectiveness of the policy. When comparing the effectiveness of various economic measures, we found that the feasibility of implementation of market-oriented instruments is greater, due to the Israeli public's preference for strengthening market orientation in the agricultural sector. Four market-oriented instruments were practiced in a pilot project conducted in an Israeli rural area. We found that in this case study, the institutional feasibility and acceptance by stakeholders were the major parameters influencing the implementation of the market-oriented instruments, whereas the instruments' contribution to enhancing the ecological or economic effectiveness were hardly considered by the stakeholders as arguments in favor of their use.

  1. Regulatory aspects of nanotechnology in the agri/feed/food sector in EU and non-EU countries.

    PubMed

    Amenta, Valeria; Aschberger, Karin; Arena, Maria; Bouwmeester, Hans; Botelho Moniz, Filipa; Brandhoff, Puck; Gottardo, Stefania; Marvin, Hans J P; Mech, Agnieszka; Quiros Pesudo, Laia; Rauscher, Hubert; Schoonjans, Reinhilde; Vettori, Maria Vittoria; Weigel, Stefan; Peters, Ruud J

    2015-10-01

    Nanotechnology has the potential to innovate the agricultural, feed and food sectors (hereinafter referred to as agri/feed/food). Applications already on the market include nano-encapsulated agrochemicals or nutrients, antimicrobial nanoparticles and active and intelligent food packaging. Many nano-enabled products are currently under research and development, and may enter the market in the near future. As for any other regulated product, applicants applying for market approval have to demonstrate the safe use of such new products without posing undue safety risks to the consumer and the environment. Several countries all over the world have been active in examining the appropriateness of their regulatory frameworks for dealing with nanotechnologies. As a consequence, different approaches have been taken in regulating nano-based products in agri/feed/food. The EU, together with Switzerland, was identified as the only world region where nano-specific provisions have been incorporated into existing legislation, while in other regions nanomaterials are regulated more implicitly, mainly by building on guidance for industry. This paper presents an overview and discusses the state of the art of different regulatory measures for nanomaterials in agri/feed/food, including legislation and guidance for safety assessment in EU and non-EU countries. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Co-Construction of Agency and Environmental Management. The Case of Agri-Environmental Policy Implementation at Finnish Farms

    ERIC Educational Resources Information Center

    Kaljonen, Minna

    2006-01-01

    One of the main challenges of European environmental policies is to recruit local-level actors to fulfill set targets. This article explores how targets of European agri-environmental policy have been achieved in Finland. It also analyses how implementation practices produce conditions for agri-environmental management and how policy success-or…

  3. Stakeholder involvement in agri-environmental policy making--learning from a local- and a state-level approach in Germany.

    PubMed

    Prager, Katrin; Freese, Jan

    2009-02-01

    Recent European regulations for rural development emphasise the requirement to involve stakeholder groups and other appropriate bodies in the policy-making process. This paper presents two cases involving stakeholder participation in agri-environmental development and policy making, targeted at different policy-making levels. One study was undertaken in Lower Saxony where a local partnership developed and tested an agri-environmental prescription, which was later included in the state's menu of agri-environmental schemes. In Sachsen-Anhalt, state-facilitated stakeholder workshops including a mathematical model were used to optimise the programme planning and budget allocation at the state level. Both studies aimed at improving the acceptance of agri-environmental schemes. The authors gauge the effectiveness of the two approaches and discuss what lessons can be learned. The experience suggests that the approaches can complement one another and could also be applied to rural policy making.

  4. Integrated cost-effectiveness analysis of agri-environmental measures for water quality.

    PubMed

    Balana, Bedru B; Jackson-Blake, Leah; Martin-Ortega, Julia; Dunn, Sarah

    2015-09-15

    This paper presents an application of an integrated methodological approach for identifying cost-effective combinations of agri-environmental measures to achieve water quality targets. The approach links hydro-chemical modelling with the economic costs of mitigation measures. Its utility was explored for the River Dee catchment in North East Scotland, examining the cost-effectiveness of mitigation measures for nitrogen (N) and phosphorus (P) pollutants. In-stream nitrate concentration was modelled using the STREAM-N model and phosphorus using the INCA-P model. Both models were first run for baseline conditions and then the effectiveness of changes in land management was simulated. Costs were based on farm income foregone and capital and operational expenditures. The cost and effectiveness data were integrated using 'Risk Solver Platform' optimization in Excel to produce the combination of measures by which target nutrient reductions could be attained at minimum economic cost. The analysis identified different combinations of measures as most cost-effective for the two pollutants. An important aspect of this paper is the integration of model-based effectiveness estimates with the economic costs of measures for cost-effectiveness analysis of land and water management options. The methodological approach developed is not limited to the two pollutants and the selected agri-environmental measures considered in the paper; it can be adapted to the cost-effectiveness analysis of any catchment-scale environmental management options. Copyright © 2015 Elsevier Ltd. All rights reserved.
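
    The approach couples model-based effectiveness estimates with measure costs and searches for the least-cost combination meeting the nutrient-reduction targets. The sketch below illustrates that generic optimisation step as a small linear program with hypothetical measures, costs, effectiveness values and targets, solved with scipy rather than the 'Risk Solver Platform' used by the authors.

    ```python
    # Generic cost-effectiveness sketch (hypothetical numbers, not the Dee catchment
    # data): choose the implementation fraction (0..1) of each mitigation measure to
    # meet N and P reduction targets at minimum cost.
    from scipy.optimize import linprog

    costs = [120.0, 80.0, 200.0]    # cost of each measure if fully implemented (hypothetical)
    n_red = [10.0, 4.0, 15.0]       # N reduction per fully implemented measure (t/yr)
    p_red = [0.2, 0.5, 0.3]         # P reduction per fully implemented measure (t/yr)
    n_target, p_target = 18.0, 0.6  # required catchment reductions

    # linprog minimises costs @ x subject to A_ub @ x <= b_ub, so each
    # ">= target" constraint is written with negated coefficients.
    res = linprog(c=costs,
                  A_ub=[[-v for v in n_red], [-v for v in p_red]],
                  b_ub=[-n_target, -p_target],
                  bounds=[(0, 1)] * 3)
    print("implementation fractions:", res.x)
    print("minimum total cost:", res.fun)
    ```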

  5. Recent Advances in the Fabrication and Application of Screen-Printed Electrochemical (Bio)Sensors Based on Carbon Materials for Biomedical, Agri-Food and Environmental Analyses

    PubMed Central

    Hughes, Gareth; Westmacott, Kelly; Honeychurch, Kevin C.; Crew, Adrian; Pemberton, Roy M.; Hart, John P.

    2016-01-01

    This review describes recent advances in the fabrication of electrochemical (bio)sensors based on screen-printing technology involving carbon materials and their application in biomedical, agri-food and environmental analyses. It will focus on the various strategies employed in the fabrication of screen-printed (bio)sensors, together with their performance characteristics; the application of these devices for the measurement of selected naturally occurring biomolecules, environmental pollutants and toxins will be discussed. PMID:27690118

  6. Virtual water trade of agri-food products: Evidence from italian-chinese relations.

    PubMed

    Lamastra, Lucrezia; Miglietta, Pier Paolo; Toma, Pierluigi; De Leo, Federica; Massari, Stefania

    2017-12-01

    At the global scale, the majority of world water withdrawal is for the agricultural sector, with differences among countries depending on the relevance of the agri-food sector in the economy. Virtual water and the water footprint can express the impact of each production process and good on water resources, with the objective of leading to a sustainable use of water at a global level. International trade is connected to virtual water flows: through commodity imports, water-poor countries can save their own water resources. The present paper focuses on the bilateral virtual water flows connected to the top ten agri-food products traded between Italy and China. Comparing the virtual water flows related to these top 10 agri-food products, the flow from Italy to China is larger than the flow in the opposite direction. Moreover, the composition of the flows is different; Italy imports significant amounts of grey water from China, reflecting the different environmental strategies adopted by the two countries. This difference is also related to the fact that the traded commodities are very different: 91% of the virtual water imported by Italy is connected to crop products, while 95% of the virtual water imported by China is related to animal products. Considering national and global water saving, it appears that Italy imports virtual water from China while China exerts pressure on its own water resources to supply exports to Italy. At the global scale this implies a global water loss of 129.29 million m3 because, in general, these agri-food products are traded from the area with lower water productivity to the area with higher water productivity. Copyright © 2017 Elsevier B.V. All rights reserved.
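
    Virtual water accounting of this kind rests on simple arithmetic: the virtual water flow of a traded product is the traded volume times the exporter's virtual water content (VWC), and the global water saving is the traded volume times the difference between the importer's and exporter's VWC (negative values are a global loss). A worked sketch with hypothetical numbers, not the paper's data, follows.

    ```python
    # Worked virtual-water sketch (hypothetical trade volumes and VWC values).
    trades = [
        # (product, tonnes traded, exporter VWC m3/t, importer VWC m3/t)
        ("wine", 50_000, 800.0, 1_100.0),
        ("pork", 20_000, 6_000.0, 4_800.0),
    ]

    for product, tonnes, vwc_exporter, vwc_importer in trades:
        flow = tonnes * vwc_exporter                     # virtual water embedded in the trade
        saving = tonnes * (vwc_importer - vwc_exporter)  # negative => global water loss
        print(f"{product}: flow = {flow / 1e6:.1f} Mm3, global saving = {saving / 1e6:.1f} Mm3")
    ```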

  7. Elementary student and prospective teachers' agri-food system literacy: Understandings of agricultural and science education's goals for learning

    NASA Astrophysics Data System (ADS)

    Trexler, Cary Jay

    1999-09-01

    Although rhetoric abounds in the agricultural education literature regarding the public's dearth of agri-food system literacy, problems arise when establishing educational interventions to help ameliorate illiteracy. Researchers do not fully know what individuals understand about the complex agri-food system. Hence, educational programs and curricula may focus on areas where students already possess well developed and scientifically accurate schemata, while ignoring other areas where incompatible or naive understandings persist. Democratic decisions about complex societal and environmental issues, such as trade-offs of our industrial agri-food system, require individuals to possess understandings of complex interrelationships. This exploratory qualitative study determines what two groups---elementary students and prospective elementary school teachers---understand about selected concepts foundational to agri-food system literacy. To ground the study in current national education curricular standards, a synthesis of both agricultural and science education benchmarks was developed. This helped structure interviews with the study's informants: nine elementary students and nine prospective elementary teachers. Analysis of discourse was based upon a conceptual change methodology. Findings showed that informant background and non-school experiences were linked to agri-food system literacy, while formal, in-school learning was not. For elementary students, high socio-economic status, gardening and not living in urban areas were correlates with literacy; the prospective teacher group exhibited similar trends. Informants understood that food came from farms where plants and animals were raised. For the majority, however, farms were described as large gardens. Additionally, informants lacked a clear understanding of the roles soil and fertilizers play in crop production. Further, few spoke of weeds as competitors with crops for growth requirements. Informants understood that

  8. Exploring the Social Benefits of Agri-Environment Schemes in England

    ERIC Educational Resources Information Center

    Mills, Jane

    2012-01-01

    Recent decades have seen sustainable development emerging as a core concern of European Union (EU) policy. In order to consider how policies can contribute more positively to the goals of sustainable development, major EU policies must undergo an assessment of their potential economic, environmental and social impacts. Within the agri-environment…

  9. Agri-Business, Natural Resources, Marine Science; Grade 7. Cluster V.

    ERIC Educational Resources Information Center

    Calhoun, Olivia H.

    A curriculum guide for grade 7, the document is devoted to the occupational clusters "Agri-business, Natural Resources, and Marine Science." It is divided into five units: natural resources, ecology, landscaping, conservation, oceanography. Each unit is introduced by a statement of the topic, the unit's purpose, main ideas, quests, and a…

  10. The Difficulties of the Bilingual Social Sciences Teacher Candidates in Educational Activities: A Case of Agri Ibrahim Cecen University

    ERIC Educational Resources Information Center

    Keçeci Kurt, Songül

    2014-01-01

    This study aims at determining the difficulties experienced by bilingual teacher candidates in education. The study has a qualitative design; semi-structured interviews were used, and the data were analysed with the content analysis method. The study group of the research includes 117 students studying in different classes at Agri Ibrahim Cecen University,…

  11. Work-Related Lifelong Learning for Entrepreneurs in the Agri-Food Sector

    ERIC Educational Resources Information Center

    Lans, Thomas; Wesselink, Renate; Biemans, Harm J. A.; Mulder, Martin

    2004-01-01

    This article presents a study on work-related lifelong learning for entrepreneurs in the agri-food sector. Accordingly, learning needs, learning preferences, learning motivation and conditions in the context of lifelong learning were identified. The results indicate that technology, IT and entrepreneurial competencies will become of increasing…

  12. Efficacy and efficiency of Agri-environmental payments in impacts of crops' management

    NASA Astrophysics Data System (ADS)

    Blasi, Emanuele; Martella, Angelo; Passeri, Nicolo; Ghini, Paolo

    2015-04-01

    Since the 1990s, the Common Agricultural Policy (CAP) in Europe has activated measures to improve the sustainability of European agriculture; these measures were systematized in 2000 within the rural development tools, pursuing a synergistic environmental action through agri-environmental payments. Since their definition, these payments have been designed to ensure the protection, maintenance and enhancement of natural resources (water, soil, forests), biodiversity (species and habitats), and landscape. In particular, initiatives such as set-aside, afforestation, organic agriculture, integrated pest management, and low-input and precision agriculture have enriched agricultural management practices. The aim of this work is to examine the relationship between agro-environmental subsidies and environmental performance (based on ecological indicators and CO2 evaluation) at country level in the EU, in order to study how the regulatory framework steers the European cropping system towards sustainability. Soils and their land use can store CO2 as a pool and thus provide environmental services; on the other hand, agricultural practices can stimulate emissions and increase the environmental footprint. Impacts (emissions/footprints and storage/environmental services) are compared with the agri-environmental payments to calculate the performance of environmental management practices supported by policy initiatives. Such analysis supports European policy makers in designing more suitable agricultural policies and, in particular, can address national sustainability through agricultural practices.

  13. Curriculum Development for the Achievement of Multiple Goals in the Agri-Food Industry.

    ERIC Educational Resources Information Center

    Stonehouse, D. P.

    1994-01-01

    The agri-food industry is concerned with maximizing global food output while preventing environmental damage. Agricultural education focuses on multidisciplinary, holistic, and integrative approaches that enhance student capabilities to address this complex issue. (SK)

  14. Design rules for successful governmental payments for ecosystem services: Taking agri-environmental measures in Germany as an example.

    PubMed

    Meyer, Claas; Reutter, Michaela; Matzdorf, Bettina; Sattler, Claudia; Schomers, Sarah

    2015-07-01

    In recent years, increasing attention has been paid to financial environmental policy instruments that have played important roles in solving agri-environmental problems throughout the world, particularly in the European Union and the United States. The ample and increasing literature on Payments for Ecosystem Services (PES) and agri-environmental measures (AEMs), generally understood as governmental PES, shows that certain single design rules may have an impact on the success of a particular measure. Based on this research, we focused on the interplay of several design rules and conducted a comparative analysis of AEMs' institutional arrangements by examining 49 German cases. We analyzed the effects of the design rules and certain rule combinations on the success of AEMs. Compliance and noncompliance with the hypothesized design rules and the success of the AEMs were surveyed by questioning the responsible agricultural administration and the AEMs' mid-term evaluators. The different rules were evaluated in regard to their necessity and sufficiency for success using Qualitative Comparative Analysis (QCA). Our results show that combinations of certain design rules such as environmental goal targeting and area targeting conditioned the success of the AEMs. Hence, we generalize design principles for AEMs and discuss implications for the general advancement of ecosystem services and the PES approach in agri-environmental policies. Moreover, we highlight the relevance of the results for governmental PES program research and design worldwide. Copyright © 2015 Elsevier Ltd. All rights reserved.
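
    The study evaluates the necessity and sufficiency of design rules with Qualitative Comparative Analysis (QCA). The sketch below computes the standard crisp-set consistency and coverage of one condition for an outcome on hypothetical cases; it is not the study's data or its QCA software.

    ```python
    # Crisp-set QCA sketch (hypothetical cases): consistency of condition X for
    # outcome Y = |X & Y| / |X|; coverage = |X & Y| / |Y|.
    cases = [  # (condition present, outcome present) for each hypothetical case
        (1, 1), (1, 1), (1, 0), (0, 1), (0, 0), (1, 1), (0, 0),
    ]

    x_and_y = sum(1 for x, y in cases if x and y)
    x_total = sum(x for x, _ in cases)
    y_total = sum(y for _, y in cases)

    print("consistency:", x_and_y / x_total)  # how reliably the condition yields the outcome
    print("coverage:", x_and_y / y_total)     # how much of the outcome the condition explains
    ```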

  15. Understanding the complexities of private standards in global agri-food chains as they impact developing countries.

    PubMed

    Henson, Spencer; Humphrey, John

    2010-01-01

    The increasing prevalence of private standards governing food safety, food quality and environmental and social impacts of agri-food systems has raised concerns about the effects on developing countries, as well as the governance of agri-food value chains more broadly. It is argued that current debates have been 'clouded' by a failure to recognise the diversity of private standards in terms of their institutional form, who develops and adopts these standards and why. In particular, there is a need to appreciate the close inter-relationships between public regulations and private standards and the continuing ways in which private standards evolve.

  16. The role of multi-target policy instruments in agri-environmental policy mixes.

    PubMed

    Schader, Christian; Lampkin, Nicholas; Muller, Adrian; Stolze, Matthias

    2014-12-01

    The Tinbergen Rule has been used to criticise multi-target policy instruments for being inefficient. The aim of this paper is to clarify the role of multi-target policy instruments using the case of agri-environmental policy. Employing an analytical linear optimisation model, this paper demonstrates that there is no general contradiction between multi-target policy instruments and the Tinbergen Rule, if multi-target policy instruments are embedded in a policy-mix with a sufficient number of targeted instruments. We show that the relation between cost-effectiveness of the instruments, related to all policy targets, is the key determinant for an economically sound choice of policy instruments. If economies of scope with respect to achieving policy targets are realised, a higher cost-effectiveness of multi-target policy instruments can be achieved. Using the example of organic farming support policy, we discuss several reasons why economies of scope could be realised by multi-target agri-environmental policy instruments. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Environmental monitoring of the area surrounding oil wells in Val d'Agri (Italy): element accumulation in bovine and ovine organs.

    PubMed

    Miedico, Oto; Iammarino, Marco; Paglia, Giuseppe; Tarallo, Marina; Mangiacotti, Michele; Chiaravalle, A Eugenio

    2016-06-01

    In this work, environmental heavy metal contamination in the Val d'Agri area of Southern Italy was monitored, measuring the accumulation of 18 heavy metals (U, Hg, Pb, Cd, As, Sr, Sn, V, Ni, Cr, Mo, Co, Cu, Zn, Ca, Mn, Fe, and Al) in the organs of animals raised in the surrounding area (kidney, lung, and liver of bovine and ovine species). Val d'Agri features various oil processing centers which are potentially a significant source of environmental pollution, making it essential to perform studies that will outline the state of the art on which any recovery plans and interventions may be developed. The analysis was carried out using official and accredited analytical methods based on inductively coupled plasma mass spectrometry, and the measurements were statistically processed in order to give a contribution to risk assessment. Even though five samples showed Pb and Cd concentrations above the limits defined in the European Commission Regulation (EC) No 1881/2006, the mean concentrations of most elements suggest that contamination in this area is low. Consequently, these results also suggest that there is no particular risk for human exposure to toxic trace elements. Nevertheless, the findings of this work confirm that element accumulation in ovine species is correlated with geographical livestock area. Therefore, ovine-specific organs might be used as bioindicators for monitoring contamination by specific toxic elements in exposed areas.

  18. Hyperspectral imaging for diagnosis and quality control in agri-food and industrial sectors

    NASA Astrophysics Data System (ADS)

    García-Allende, P. Beatriz; Conde, Olga M.; Mirapeix, Jesus; Cobo, Adolfo; Lopez-Higuera, Jose M.

    2010-04-01

    Optical spectroscopy has been utilized in various fields of science, industry and medicine, since each substance is discernible from all others by its spectral properties. However, optical spectroscopy traditionally yields information on the bulk properties of the whole sample, whereas, particularly in the agri-food industry, some product properties arise from heterogeneity in composition. Monitoring such properties is considerably more challenging and can be achieved with hyperspectral imaging technology, which allows the simultaneous determination of the optical spectrum and the spatial location of an object on a surface. In addition, it is a non-intrusive, non-contact technique with great potential for industrial applications, and it does not require any particular sample preparation, which is a primary concern in food monitoring. This work gives an overview of approaches based on this technology for addressing different problems in the agri-food and industrial sectors. The hyperspectral system was originally designed and tested for on-line discrimination of raw materials, a key factor in the input stages of many industrial sectors. The combination of acquiring spectral information across transversal lines while materials are transported on a conveyor belt with appropriate image analysis has been successfully validated in the tobacco industry. Lastly, the use of imaging spectroscopy for on-line welding quality monitoring is discussed and compared with traditional spectroscopic approaches.
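
    Hyperspectral imaging provides a full spectrum for every pixel, so each pixel can be assigned to a material by comparing its spectrum with reference spectra. The sketch below uses the spectral angle as the similarity measure; this is a generic, commonly used rule and not necessarily the image analysis applied in the system described above, and all spectra are synthetic.

    ```python
    # Per-pixel material discrimination sketch for hyperspectral data (synthetic spectra).
    import numpy as np

    def spectral_angle(a, b):
        """Angle in radians between two spectra; smaller means more similar."""
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.arccos(np.clip(cos, -1.0, 1.0))

    bands = 64
    references = {                                   # hypothetical reference materials
        "accepted_material": np.linspace(0.2, 0.8, bands),
        "foreign_body": np.linspace(0.8, 0.2, bands),
    }
    pixel = np.linspace(0.25, 0.75, bands)           # one pixel spectrum from the line scan

    label = min(references, key=lambda name: spectral_angle(pixel, references[name]))
    print("pixel classified as:", label)
    ```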

  19. Agri-Environmental Resource Management by Large-Scale Collective Action: Determining KEY Success Factors

    ERIC Educational Resources Information Center

    Uetake, Tetsuya

    2015-01-01

    Purpose: Large-scale collective action is necessary when managing agricultural natural resources such as biodiversity and water quality. This paper determines the key factors to the success of such action. Design/Methodology/Approach: This paper analyses four large-scale collective actions used to manage agri-environmental resources in Canada and…

  20. Roll-Out Neoliberalism and Hybrid Practices of Regulation in Australian Agri-Environmental Governance

    ERIC Educational Resources Information Center

    Lockie, Stewart; Higgins, Vaughan

    2007-01-01

    In the last 15 years, agri-environmental programmes in Australia have been underpinned by a neoliberal regime of governing which seeks to foster participation and "bottom-up" change at the regional level at the same time as encouraging farmers to become entrepreneurial and improve their productivity and environmental performance without…

  1. The role of network bridging organisations in compensation payments for agri-environmental services under the EU Common Agricultural Policy.

    PubMed

    Dedeurwaerdere, Tom; Polard, Audrey; Melindi-Ghidi, Paolo

    2015-11-01

    Compensation payments to farmers for the provision of agri-environmental services are a well-established policy scheme under the EU Common Agricultural Policy. However, in spite of the success in most EU countries in the uptake of the programme by farmers, the impact of the scheme on the long-term commitment of farmers to change their practices remains poorly documented. To explore this issue, this paper presents the results of structured field interviews and a quantitative survey in the Walloon Region of Belgium. The main finding of this study is that farmers who have periodic contacts with network bridging organisations that foster cooperation and social learning in the agri-environmental landscapes show a higher commitment to change. This effect is observed both for farmers with high and low concern for biodiversity depletion. Support for network bridging organisations is foreseen under the EU Leader programme and the EU regulation 1306/2013, which could open up interesting opportunities for enhancing the effectiveness of the current payment scheme for agri-environmental services.

  2. Adoption of Agri-Environmental Measures by Organic Farmers: The Role of Interpersonal Communication

    ERIC Educational Resources Information Center

    Unay Gailhard, Ilkay; Bavorová, Miroslava; Pirscher, Frauke

    2015-01-01

    Purpose: The purpose of this study is to investigate the impact of interpersonal communication on the adoption of agri-environmental measures (AEM) by organic farmers in Germany. Methodology: The study used the logit model to predict the probability of adoption behaviour, and Social Network Analysis (SNA) was conducted to analyse the question of…

  3. A satellite-based analysis of the Val d'Agri Oil Center (southern Italy) gas flaring emissions

    NASA Astrophysics Data System (ADS)

    Faruolo, M.; Coviello, I.; Filizzola, C.; Lacava, T.; Pergola, N.; Tramutoli, V.

    2014-10-01

    In this paper, robust satellite techniques (RST), a multi-temporal scheme of satellite data analysis, were implemented to analyze the flaring activity of the Val d'Agri Oil Center (COVA), the largest Italian gas and oil pre-treatment plant, owned by Ente Nazionale Idrocarburi (ENI). For this site, located in an anthropized area characterized by large environmental complexity, flaring emissions are mainly related to emergency conditions (i.e., waste flaring), as industrial processes are regulated by strict regional laws. Given the peculiar characteristics of COVA flaring, the main aim of this work was to assess the performance of RST in terms of sensitivity and reliability in providing independent estimates of gas flaring volumes under such conditions. In detail, RST was applied to 13 years of Moderate Resolution Imaging Spectroradiometer (MODIS) medium and thermal infrared data in order to identify the highly radiant records associated with the COVA flare emergency discharges. Then, using data provided by ENI about gas flaring volumes in the period 2003-2009, a MODIS-based regression model was developed and tested. The results achieved indicate that such a model is able to estimate, with a good level of accuracy (R2 of 0.83), emitted gas flaring volumes at COVA.
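
    The final step described above is a regression model linking the MODIS-derived radiant signal to flared gas volumes (R2 of 0.83 on the 2003-2009 ENI data). The sketch below shows only the generic fitting and R2 computation on synthetic numbers, not the COVA/ENI data or the exact model form used in the paper.

    ```python
    # Generic regression sketch: satellite-derived radiant signal vs flared gas
    # volume (synthetic numbers, hypothetical units).
    import numpy as np

    rng = np.random.default_rng(1)
    radiant_signal = rng.uniform(5, 50, 40)                       # arbitrary units
    flared_volume = 0.9 * radiant_signal + rng.normal(0, 3, 40)   # hypothetical Mm3

    slope, intercept = np.polyfit(radiant_signal, flared_volume, 1)
    predicted = slope * radiant_signal + intercept
    ss_res = np.sum((flared_volume - predicted) ** 2)
    ss_tot = np.sum((flared_volume - flared_volume.mean()) ** 2)
    print(f"volume ~ {slope:.2f} * signal + {intercept:.2f}, R^2 = {1 - ss_res / ss_tot:.2f}")
    ```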

  4. Integrated Approach of Agri-nanotechnology: Challenges and Future Trends

    PubMed Central

    Mishra, Sandhya; Keswani, Chetan; Abhilash, P. C.; Fraceto, Leonardo F.; Singh, Harikesh Bahadur

    2017-01-01

    Nanotechnology, representing a new frontier in modern agriculture, is anticipated to become a major thrust in the near future by offering potential applications. This integrating approach, i.e., agri-nanotechnology, has great potential to cope with the global challenges of food production/security, sustainability and climate change. However, despite the potential benefits of nanotechnology in agriculture, these applications have so far not reached field conditions. Growing concerns about the fate, transport, bioavailability and toxicity of nanoparticles, and the inappropriateness of the regulatory framework, limit the acceptance of and inclination to adopt nanotechnologies in the agricultural sector. Moreover, current research trends lack a realistic approach and fail to attain comprehensive knowledge of risk assessment factors and of the toxicity of nanoparticles toward agroecosystem components, viz. plants, soil and soil microbiomes, after their release into the environment. Hence, in the present review we suggest certain key points to be addressed in current and future agri-nanotechnology research on the basis of recognized knowledge gaps, with a strong recommendation to incorporate biosynthesized nanoparticles to carry out analogous functions. In this perspective, the major points are as follows: (i) mitigating risk assessment factors (responsible for fate, transport, behavior, bioavailability and toxicity) to alleviate the subsequent toxicity of nanoparticles; (ii) optimizing permissible nanoparticle doses within safety limits by performing dose-dependent studies; (iii) adopting a realistic approach by designing experiments in natural habitats and avoiding in vitro assays, for accurate interpretation; (iv) most importantly, translating environmentally friendly and non-toxic biosynthesized nanoparticles from laboratory to field conditions for agricultural benefits. PMID:28421100

  5. Current use of impact models for agri-environment schemes and potential for improvements of policy design and assessment.

    PubMed

    Primdahl, Jørgen; Vesterager, Jens Peter; Finn, John A; Vlahos, George; Kristensen, Lone; Vejre, Henrik

    2010-06-01

    Agri-Environment Schemes (AES) to maintain or promote environmentally-friendly farming practices were implemented on about 25% of all agricultural land in the EU by 2002. This article analyses and discusses the actual and potential use of impact models in supporting the design, implementation and evaluation of AES. Impact models identify and establish the causal relationships between policy objectives and policy outcomes. We review and discuss the role of impact models at different stages in the AES policy process, and present results from a survey of impact models underlying 60 agri-environmental schemes in seven EU member states. We distinguished among three categories of impact models (quantitative, qualitative or common sense), depending on the degree of evidence in the formal scheme description, additional documents, or key person interviews. The categories of impact models used mainly depended on whether scheme objectives were related to natural resources, biodiversity or landscape. A higher proportion of schemes dealing with natural resources (primarily water) were based on quantitative impact models, compared to those concerned with biodiversity or landscape. Schemes explicitly targeted either on particular parts of individual farms or specific areas tended to be based more on quantitative impact models compared to whole-farm schemes and broad, horizontal schemes. We conclude that increased and better use of impact models has significant potential to improve efficiency and effectiveness of AES. (c) 2009 Elsevier Ltd. All rights reserved.

  6. A satellite-based analysis of the Val d'Agri (South of Italy) Oil Center gas flaring emissions

    NASA Astrophysics Data System (ADS)

    Faruolo, M.; Coviello, I.; Filizzola, C.; Lacava, T.; Pergola, N.; Tramutoli, V.

    2014-06-01

    In this paper the Robust Satellite Techniques (RST), a multi-temporal scheme of satellite data analysis, was implemented to analyze the flaring activity of the largest Italian gas and oil pre-treatment plant (i.e. the Ente Nazionale Idrocarburi - ENI - Val d'Agri Oil Center - COVA). For this site, located in an anthropized area characterized by large environmental complexity, flaring emissions are mainly related to emergency conditions (i.e. waste flaring), since the industrial process is regulated by strict regional laws. Given the peculiar characteristics of COVA flaring, the main aim of this work was to assess the performance of RST, in terms of sensitivity and reliability, in providing independent estimates of gas flaring volumes under such conditions. In detail, RST was implemented on thirteen years of Moderate Resolution Imaging Spectroradiometer (MODIS) medium and thermal infrared data in order to identify the highly radiant records associated with the COVA flare emergency discharges. Then, exploiting data provided by ENI about gas flaring volumes in the period 2003-2009, a MODIS-based regression model was developed and tested. The results indicate that this model is able to estimate the gas volumes flared at COVA with a good level of accuracy (R2 of 0.83).
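
    The abstract does not give the functional form of the MODIS-based regression, so the sketch below is only a rough illustration of the general idea: fit a least-squares line between a satellite-derived radiative signal (e.g. summed radiative power per period) and reported flared volumes, then report R2. All variable names and numbers are invented placeholders, not values from the study.

    ```python
    import numpy as np

    # Hypothetical yearly aggregates (placeholders, not data from the study):
    # summed MODIS-derived radiative signal vs. gas volumes reported by the operator.
    frp_sum = np.array([120.0, 95.0, 150.0, 80.0, 200.0, 130.0, 110.0])   # arbitrary units
    flared_volume = np.array([4.1, 3.2, 5.0, 2.9, 6.8, 4.5, 3.8])         # e.g. 10^6 m^3

    # Ordinary least-squares fit: volume = a * signal + b
    a, b = np.polyfit(frp_sum, flared_volume, deg=1)
    predicted = a * frp_sum + b

    # Coefficient of determination, the accuracy measure quoted in the abstract (R2 = 0.83)
    ss_res = np.sum((flared_volume - predicted) ** 2)
    ss_tot = np.sum((flared_volume - flared_volume.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    print(f"volume ~ {a:.3f} * signal + {b:.3f}, R2 = {r2:.2f}")
    ```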

  7. Agri-spillways as soil erosion protection tools in conventional sloping vineyards (Montes de Málaga, Spain)

    NASA Astrophysics Data System (ADS)

    Rodrigo-Comino, Jesús

    2017-04-01

    Rainfall causes soil erosion on Mediterranean sloping vineyards (>25˚ slope inclination); however, little is known about cheap, effective and suitable soil erosion protection measures. In the vineyards of the Montes de Málaga (southern Spain), a specific land management practice against soil erosion is currently carried out by building tilled rills in the down-slope direction to channel water and sediments. We decided to call them agri-spillways. In this study, we assessed two agri-spillways (10 and 15 m in length) under extreme conditions by carrying out runoff experiments. A motor-driven pump supplied a constant water inflow of about 1.33 L s-1 for between 12 and 15 minutes (≈1000 litres). We observed: i) a high capacity of these agri-spillways to channel a large volume of water and sediments; and ii) high flow velocities (from 0.16 m s-1 to 0.28 m s-1) and sediment concentration (SC) rates, with values up to 1538.6 g l-1. Comparing the two, flow velocity and SC were much higher in one of the tested rills, which was 5 meters shorter and 7 degrees steeper. We therefore concluded that these agri-spillways, given correct planning and long-term maintenance from the contributing area in the down-slope direction, can function as a potential tool for designing suitable and cheap plans to protect the soil in Mediterranean sloping vineyards. Acknowledgements: Firstly, we acknowledge the farmers' syndicate UPA (Unión de Pequeños Agricultores) and the wine-grower Pepe Gámez (Almáchar) for providing access to the study area. Secondly, we thank the Bachelor and Master students from Trier University for their hard work in the field and laboratory during the Almáchar campaign. Thirdly, we acknowledge the geomorphology and soil laboratory technicians María Pedraza and Rubén Rojas of GSoilLab (Málaga University) for the soil analysis. Finally, we also thank the Ministerio de Educaci

  8. ABS-SmartComAgri: An Agent-Based Simulator of Smart Communication Protocols in Wireless Sensor Networks for Debugging in Precision Agriculture.

    PubMed

    García-Magariño, Iván; Lacuesta, Raquel; Lloret, Jaime

    2018-03-27

    Smart communication protocols are becoming a key mechanism for improving communication performance in networks such as wireless sensor networks. However, the literature lacks mechanisms for simulating smart communication protocols in precision agriculture for decreasing production costs. In this context, the current work presents an agent-based simulator of smart communication protocols for efficiently managing pesticides. The simulator considers electric power needs, crop health, the percentage of live bugs and pesticide consumption. The current approach is illustrated with three different communication protocols, called (a) broadcast, (b) neighbor and (c) low-cost neighbor. The low-cost neighbor protocol achieved a statistically significant reduction in electric power needs compared with the neighbor protocol, with a very large difference according to common interpretations of the Cohen's d effect size. The presented simulator is called ABS-SmartComAgri and is freely distributed as open-source from a public research data repository. It ensures the reproducibility of experiments and allows other researchers to extend the current approach.
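
    The protocol comparison above is reported in terms of Cohen's d. As a minimal, self-contained illustration (the per-run energy values below are invented placeholders, not simulator output), Cohen's d for two independent samples can be computed as follows:

    ```python
    import numpy as np

    def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
        """Cohen's d for two independent samples, using the pooled standard deviation."""
        na, nb = len(a), len(b)
        pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
        return (a.mean() - b.mean()) / np.sqrt(pooled_var)

    # Placeholder per-run energy consumption (arbitrary units) for two protocols.
    neighbor = np.array([10.2, 9.8, 10.5, 10.1, 9.9, 10.3])
    low_cost_neighbor = np.array([7.1, 6.8, 7.4, 7.0, 6.9, 7.2])

    d = cohens_d(neighbor, low_cost_neighbor)
    print(f"Cohen's d = {d:.2f}")  # values above ~1.2 are commonly read as 'very large'
    ```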

  9. AGRI Grain Power ethanol-for-fuel project feasibility-study report. Volume I. Project conceptual design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-04-01

    The AGRI GRAIN POWER (AGP) Project, hereafter referred to as the Project, was formed to evaluate the commercial viability and assess the desirability of implementing a large grain-based grass-roots anhydrous ethanol fuel project to be sited near Des Moines, Iowa. This report presents the results of a Project feasibility evaluation. The Project concept is based on involving a very strong managerial, financial and technical joint venture that is extremely expert in all facets of planning and implementing a large ethanol project; on locating the ethanol project at a highly desirable site; on utilizing a proven ethanol process; and on developing a Project that is well suited to market requirements, resource availability and competitive factors. The Project conceptual design is presented in this volume.

  10. Aerial application of Agri-Fos® to prevent Sudden Oak Death in Oregon tanoak forests

    Treesearch

    Alan Kanaskie; Everett Hansen; Wendy Sutton; Paul Reeser; Carolyn Choquette

    2010-01-01

    We have been testing the practicality and efficacy of aerial application of Agri-Fos® for control of sudden oak death (SOD) in the Oregon tanoak (Lithocarpus densiflorus) forest. Helicopter application to forest stands has been compared with bole injection and ground-based spray application to seedlings and stump sprouts. We bio-assayed...

  11. Experiences and attitudes towards evidence-informed policy-making among research and policy stakeholders in the Canadian agri-food public health sector.

    PubMed

    Young, I; Gropp, K; Pintar, K; Waddell, L; Marshall, B; Thomas, K; McEwen, S A; Rajić, A

    2014-12-01

    Policy-makers working at the interface of agri-food and public health often deal with complex and cross-cutting issues that have broad health impacts and socio-economic implications. They have a responsibility to ensure that policy-making based on these issues is accountable and informed by the best available scientific evidence. We conducted a qualitative descriptive study of agri-food public health policy-makers and research and policy analysts in Ontario, Canada, to understand their perspectives on how the policy-making process is currently informed by scientific evidence and how to facilitate this process. Five focus groups of 3-7 participants and five one-to-one interviews were held in 2012 with participants from federal and provincial government departments and industry organizations in the agri-food public health sector. We conducted a thematic analysis of the focus group and interview transcripts to identify overarching themes. Participants indicated that the following six key principles are necessary to enable and demonstrate evidence-informed policy-making (EIPM) in this sector: (i) establish and clarify the policy objectives and context; (ii) support policy-making with credible scientific evidence from different sources; (iii) integrate scientific evidence with other diverse policy inputs (e.g. economics, local applicability and stakeholder interests); (iv) ensure that scientific evidence is communicated by research and policy stakeholders in relevant and user-friendly formats; (v) create and foster interdisciplinary relationships and networks across research and policy communities; and (vi) enhance organizational capacity and individual skills for EIPM. Ongoing and planned efforts in these areas, a supportive culture, and additional education and training in both research and policy realms are important to facilitate evidence-informed policy-making in this sector. Future research should explore these findings further in other countries and contexts.

  12. Spatial analysis of agri-environmental policy uptake and expenditure in Scotland.

    PubMed

    Yang, Anastasia L; Rounsevell, Mark D A; Wilson, Ronald M; Haggett, Claire

    2014-01-15

    Agri-environment is one of the most widely supported rural development policy measures in Scotland in terms of number of participants and expenditure. It comprises 69 management options and sub-options that are delivered primarily through the competitive 'Rural Priorities' scheme. Understanding the spatial determinants of uptake and expenditure would assist policy-makers in guiding future policy targeting efforts for the rural environment. This study is unique in examining the spatial dependency and determinants of uptake of, and payments for, Scotland's agri-environmental measures and categorised options at the parish level. Spatial econometrics is applied to test the influence of 40 explanatory variables covering farming characteristics, land capability, designated sites, accessibility and population. Results identified spatial dependency for each of the dependent variables, which supported the use of spatially-explicit models. The goodness of fit of the spatial models was better than for the aspatial regression models. There was also notable improvement in the models for participation compared with the models for expenditure. Furthermore, a range of expected explanatory variables were found to be significant, varying according to the dependent variable used. The majority of models for both payment and uptake showed a significant positive relationship with SSSIs (Sites of Special Scientific Interest), which are designated sites prioritised in Scottish policy. These results indicate that environmental targeting efforts by the government for AEP uptake in designated sites can be effective. However, habitats outside of SSSIs, termed here the 'wider countryside', may not be sufficiently competitive to receive funding in the current policy system. Copyright © 2013 Elsevier Ltd. All rights reserved.
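
    A standard diagnostic for the kind of spatial dependency that motivates spatially-explicit models is global Moran's I (the abstract does not specify which test the authors used). The sketch below computes it from scratch for a toy vector of parish-level uptake values and a row-standardised contiguity weights matrix; all inputs are invented placeholders and this is not the authors' code.

    ```python
    import numpy as np

    def morans_i(values: np.ndarray, weights: np.ndarray) -> float:
        """Global Moran's I for a variable observed over n spatial units.

        weights: n x n spatial weights matrix (row-standardised here), zero diagonal.
        """
        n = len(values)
        z = values - values.mean()
        s0 = weights.sum()
        return (n * (z @ weights @ z)) / (s0 * (z @ z))

    # Toy example: 5 parishes with a simple neighbour (contiguity) structure.
    uptake = np.array([12.0, 15.0, 14.0, 3.0, 2.0])       # e.g. scheme uptake per parish
    w = np.array([[0, 1, 1, 0, 0],
                  [1, 0, 1, 0, 0],
                  [1, 1, 0, 1, 0],
                  [0, 0, 1, 0, 1],
                  [0, 0, 0, 1, 0]], dtype=float)
    w = w / w.sum(axis=1, keepdims=True)                   # row-standardise

    print(f"Moran's I = {morans_i(uptake, w):.3f}")        # > 0 suggests positive spatial dependency
    ```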

  13. ABS-SmartComAgri: An Agent-Based Simulator of Smart Communication Protocols in Wireless Sensor Networks for Debugging in Precision Agriculture

    PubMed Central

    2018-01-01

    Smart communication protocols are becoming a key mechanism for improving communication performance in networks such as wireless sensor networks. However, the literature lacks mechanisms for simulating smart communication protocols in precision agriculture for decreasing production costs. In this context, the current work presents an agent-based simulator of smart communication protocols for efficiently managing pesticides. The simulator considers electric power needs, crop health, the percentage of live bugs and pesticide consumption. The current approach is illustrated with three different communication protocols, called (a) broadcast, (b) neighbor and (c) low-cost neighbor. The low-cost neighbor protocol achieved a statistically significant reduction in electric power needs compared with the neighbor protocol, with a very large difference according to common interpretations of the Cohen’s d effect size. The presented simulator is called ABS-SmartComAgri and is freely distributed as open-source from a public research data repository. It ensures the reproducibility of experiments and allows other researchers to extend the current approach. PMID:29584703

  14. Draft Genome Sequence of Pedobacter agri PB92T, Which Belongs to the Family Sphingobacteriaceae

    PubMed Central

    Lee, Myunglip; Roh, Seong Woon; Lee, Hae-Won; Yim, Kyung June; Kim, Kil-Nam; Bae, Jin-Woo; Choi, Kwang-Sik; Jeon, You-Jin; Jung, Won-Kyo; Kang, Heewan

    2012-01-01

    Strain PB92T of Pedobacter agri, which belongs to the family Sphingobacteriaceae, was isolated from soil in the Republic of Korea. The draft genome of strain PB92T contains 5,141,552 bp, with a G+C content of 38.0%. This is the third genome sequencing project for a type strain among the Pedobacter species. PMID:22740666

  15. Meeting a Growing Demand: Texas A&M AgriLife Extension Service's Early Childhood Educator Online Training Program

    ERIC Educational Resources Information Center

    Green, Stephen

    2013-01-01

    Demand for professional development training in the early childhood field has grown substantially in recent years. To meet the demand, Texas A&M AgriLife Extension Service's Family Development and Resource Management unit developed the Early Childhood Educator Online Training Program, a professional development system that currently offers…

  16. Advances in high-resolution mass spectrometry based on metabolomics studies for food--a review.

    PubMed

    Rubert, Josep; Zachariasova, Milena; Hajslova, Jana

    2015-01-01

    Food authenticity has become a necessity for global food policies, since food placed on the market must, without fail, be authentic. This has always been a challenge: in the past, minor components, also called markers, were mainly monitored by chromatographic methods in order to authenticate food. Nowadays, however, advanced analytical methods make it possible to obtain food fingerprints. At the same time, these methods have been combined with chemometrics, which uses statistical methods to verify food and to extract maximum information from chemical data. These sophisticated methods, based on different separation techniques or used stand-alone, have recently been coupled to high-resolution mass spectrometry (HRMS) in order to verify the authenticity of food. The new generation of HRMS detectors offers significant advances in resolving power, sensitivity, robustness, extended dynamic range, easier mass calibration and tandem mass capabilities, making HRMS more attractive and useful to the food metabolomics community and therefore a reliable tool for food authenticity. The purpose of this review is to summarise and describe the most recent approaches in the area of food metabolomics, and to discuss the strengths and drawbacks of HRMS analytical platforms combined with chemometrics.

  17. Magnetotelluric investigation across the Agri Valley: preliminary results.

    NASA Astrophysics Data System (ADS)

    Balasco, Marianna; Romano, Gerardo; Siniscalchi, Agata; Alfredo Stabile, Tony

    2017-04-01

    The Agri Valley is an axial zone of the Southern Apennines thrust belt with strong seismogenic potential, where two important energy-related activities capable of inducing/triggering seismicity are present: (1) the disposal, at the Costa Molina 2 injection well, of wastewater produced during the exploitation of the biggest onshore oil field in western Europe (27 wells producing more than 80,000 barrels of crude oil per day), managed by Eni S.p.A., and (2) the water loading and unloading operations of the Pertusillo artificial reservoir. It is recognized that fluctuations of the water level inside the reservoir, due for example to the hydrological cycle, may produce pressure perturbations at the bottom of the reservoir, causing induced seismicity. Furthermore, the role of fluids in rupture processes is well known: pore pressure can increase, especially at high fluid injection rates, and/or pre-existing faults can be weakened. With the aim of better characterizing and understanding the physical processes involved in the observed induced/triggered seismicity, in 2016 a broadband seismic network covering an area of about 20 km x 20 km near the Pertusillo Dam and the Costa Molina 2 well was installed in the framework of the SIR-MIUR project INSIEME (INduced Seismicity in Italy: Estimation, Monitoring, and sEismic risk mitigation), and a magnetotelluric (MT) survey was performed. The MT investigation consists of 25 soundings aligned along a 30 km profile oriented at about N40, orthogonal to the strike of the major geological structures and crossing both of the sources that may induce/trigger seismicity. In this work, we present the preliminary 2D resistivity model, which provides useful deep geophysical information for understanding the geological and structural setting of the Agri Valley. Moreover, the comparison of the resistivity model with the earthquake locations as inferred from

  18. Agri-environmental collaboratives as bridging organisations in landscape management.

    PubMed

    Prager, Katrin

    2015-09-15

    In recent years, landscape and its management have become a focus of policies and academic conceptualisation. Landscape is understood as a concept of interconnected natural and human systems. Its management must take into account the dynamic interdependencies and diverging interests of various stakeholders at different levels. Bridging organisations can provide an arena for trust-building, conflict resolution, learning and collaboration between relevant stakeholders. This paper draws on two strands of literature - landscape governance and co-management of social-ecological systems - to investigate the contributions of agri-environmental collaboratives (AEC) to sustainable landscape management. Based on data from 41 interviews with key informants and AEC members in Germany and the Netherlands, six fields of contributions were identified: policy implementation and service provision; coordination and mediation; awareness raising and behaviour change; care for 'everyday' landscapes; maintenance and protection of landscapes (including species and habitats); and income generation and economic benefits. Some of the contributions revolve around the specific role of AEC as bridging organisations, but other contributions, such as economic benefits, emerge beyond this analytical lens. The paper therefore emphasises holistic, bottom-up assessment of AEC contributions and argues that governments should support such organisations through i) funding for facilitators and ii) funding for impact monitoring and data management. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Integrated landscape planning and remuneration of agri-environmental services. Results of a case study in the Fuhrberg region of Germany.

    PubMed

    V Haaren, Christina; Bathke, Manfred

    2008-11-01

    Until now, existing remuneration of environmental services has not sufficiently supported the goals of spending money more effectively on the environment and of motivating farmers. Only a small share of the budgets for agriculture in the EU, as well as in the US and other countries, is available for buying environmental goods and services beyond the level of good farming practice (GFP). This, combined with insufficient targeting of compensation payments to areas where special measures are needed, leads to an unsatisfactorily low impact of agri-environment measures compared to other driving forces that stimulate the intensification of farming. The goal of this paper is to propose a management concept that enhances the ecological and cost efficiency of agri-environment measures. Components of the concept are a comprehensive environmental information base with prioritised goals and targets (available in Germany from landscape planning) and new remuneration models, which complement conventional compensation payments that are based upon predetermined measures and cost. Comprehensive landscape planning locates and prioritises areas which require environmental action. It contains the information that authorities need to prioritise funding for environmental services and to direct measures to sites which need environmental services beyond the level of GFP. Appropriate remuneration models, which can enhance the cost efficiency of public spending and the motivation of farmers, can also be applied on the basis of landscape planning. Testing of the planning methodology and of one of the remuneration models (success-oriented remuneration) in a case study area ("Fuhrberger Feld", north of Hanover, Germany) demonstrated the usability of the concept and led to proposals for future development of the methodology and its application in combination with other approaches.

  20. Development of a Method for Evaluating Floor Dry-Cleanability from Wheat Flour in the Food Industry.

    PubMed

    Barreca, F; Cardinali, G D; Borgese, E; Russo, M

    2017-04-01

    Many production processes are characterized by inadequate sanitation protocols that increase the possibility of proliferation of microbial contaminants, especially on surfaces. A method for evaluating the degree of floor cleanability in agri-food companies is important not only to reduce the risk of contamination of products, but also to provide companies with a tool to identify critical issues. The method is based on the bicinchoninic acid (BCA) assay in a solution with a 1:50 ratio of Cu2+/BCA, which is ideal for detecting the amount of protein contained in wheat flour residues on industrial flooring. Spectrophotometric analysis identified maximum absorbance values at 562 nm for different protein concentrations, and the construction of a regression function led to the definition of evaluation intervals corresponding to different degrees of cleanliness from wheat flour residues. The absorbance curves obtained by applying the proposed evaluation method to 6 tiles commonly used in agri-food buildings showed the clear persistence of food material on 2 tiles with surface relief. In particular, these tiles showed a greater presence of protein, with a level of contamination 440% higher. Furthermore, a robotic system was designed to standardize the cleaning method commonly employed in agri-food companies to remove solid particles from flooring. © 2017 Institute of Food Technologists®.
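
    The evaluation hinges on a calibration (regression) between absorbance at 562 nm and protein concentration. The sketch below fits such a standard curve by least squares and inverts it for an unknown sample; the standard concentrations and absorbance readings are invented placeholders, not values from the paper, and a simple linear fit is assumed.

    ```python
    import numpy as np

    # Hypothetical BCA standard curve: known protein concentrations (mg/mL)
    # and the corresponding absorbance readings at 562 nm (placeholders).
    conc_std = np.array([0.0, 0.125, 0.25, 0.5, 1.0, 2.0])
    abs_562 = np.array([0.02, 0.10, 0.19, 0.36, 0.70, 1.35])

    # Linear calibration: absorbance = slope * concentration + intercept
    slope, intercept = np.polyfit(conc_std, abs_562, deg=1)

    def concentration_from_absorbance(a562: float) -> float:
        """Invert the standard curve to estimate the protein concentration of an extract."""
        return (a562 - intercept) / slope

    # Example: absorbance measured on an extract swabbed from a floor tile.
    sample_abs = 0.42
    print(f"Estimated protein ~ {concentration_from_absorbance(sample_abs):.2f} mg/mL")
    ```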

  1. Applications of Computer Vision for Assessing Quality of Agri-food Products: A Review of Recent Research Advances.

    PubMed

    Ma, Ji; Sun, Da-Wen; Qu, Jia-Huan; Liu, Dan; Pu, Hongbin; Gao, Wen-Hong; Zeng, Xin-An

    2016-01-01

    With consumer concerns increasing over food quality and safety, the food industry has begun to pay much more attention to the development of rapid and reliable food-evaluation systems over the years. As a result, there is a great need for manufacturers and retailers to operate effective real-time assessments of food quality and safety during food production and processing. Computer vision, a nondestructive assessment approach, is able to estimate the characteristics of food products with the advantages of high speed, ease of use, and minimal sample preparation. Specifically, computer vision systems are feasible for classifying food products into specific grades, detecting defects, and estimating properties such as color, shape, size, surface defects, and contamination. Therefore, in order to track the latest research developments of this technology in the agri-food industry, this review aims to present the fundamentals and instrumentation of computer vision systems, with details of applications in quality assessment of agri-food products from 2007 to 2013, and also to discuss future trends in combination with spectroscopy.
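
    As an illustration of the kind of color, size and shape features such systems extract before grading, the sketch below uses OpenCV; it is a generic example, not a pipeline from any of the reviewed studies, and the file name, threshold and grading rule are placeholders.

    ```python
    import cv2
    import numpy as np

    # Load a product image (placeholder path) and segment the object from a dark background.
    img = cv2.imread("sample_fruit.png")
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY)

    # The largest contour is assumed to be the product; derive simple size/shape features.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    obj = max(contours, key=cv2.contourArea)
    area_px = cv2.contourArea(obj)
    perimeter_px = cv2.arcLength(obj, True)
    circularity = 4 * np.pi * area_px / (perimeter_px ** 2)   # 1.0 for a perfect circle

    # Mean color inside the object mask (OpenCV returns BGR order) as a crude color feature.
    obj_mask = np.zeros(mask.shape, dtype=np.uint8)
    cv2.drawContours(obj_mask, [obj], -1, 255, thickness=-1)
    mean_b, mean_g, mean_r, _ = cv2.mean(img, mask=obj_mask)

    grade = "A" if circularity > 0.8 and mean_r > 120 else "B"   # placeholder grading rule
    print(f"area={area_px:.0f} px, circularity={circularity:.2f}, "
          f"mean RGB=({mean_r:.0f},{mean_g:.0f},{mean_b:.0f}), grade={grade}")
    ```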

  2. AGRI Grain Power ethanol-for-fuel project feasibility-study report. Volume II. Project marketing/economic/financial/ and organization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-04-01

    The AGRI GRAIN POWER (AGP) project, hereafter referred to as the Project, was formed to evaluate the commercial viability and assess the desirability of implementing a large grain-based grass-roots anhydrous ethanol fuel project to be sited near Des Moines, Iowa. This report presents the results of a Project feasibility evaluation. The Project concept is based on involving a very strong managerial, financial and technical joint venture that is extremely expert in all facets of planning and implementing a large ethanol project; on locating the ethanol project at a highly desirable site; on utilizing a proven ethanol process; and on developing a Project that is well suited to market requirements, resource availability and competitive factors. The results of marketing, economic, and financial studies are reported in this volume.

  3. AGRI Grain Power ethanol-for-fuel project feasibility-study report. Volume III. Project environmental/health/safety/ and socioeconomic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-04-01

    The AGRI GRAIN POWER (AGP) project, hereafter referred to as the Project, was formed to evaluate the commercial viability and assess the desirability of implementing a large grain-based grass-roots anhydrous ethanol fuel project to be sited near Des Moines, Iowa. This report presents the results of a Project feasibility evaluation. The Project concept is based on involving a very strong managerial, financial and technical joint venture that is extremely expert in all facets of planning and implementing a large ethanol project; on locating the ethanol project at a highly desirable site; on utilizing a proven ethanol process; and on developing a Project that is well suited to market requirements, resource availability and competitive factors. This volume contains the results of the environmental, health, safety, and socio-economic studies.

  4. Comparison of meteorological forcing (WFDEI, AGRI4CAST) to in-situ observations in a semi arid catchment. The case of Merguellil in Tunisia.

    NASA Astrophysics Data System (ADS)

    Le Page, Michel; Gosset, Cindy; Oueslati, Ines; Calvez, Roger; Zribi, Mehrez; Lilli Chabaane, Zohra

    2015-04-01

    Meteorological forcing is essential to hydrological and hydro-geological modeling. In the case of the semi-arid catchment of Merguellil in Tunisia, long-term time series are only available in the plain, at a SYNOP station. Other meteorological stations have been installed since 2010. Therefore, this study aims to qualify the reliability of the meteorological forcing needed for the design of an integrated model. We compare the meteorological data from 7 stations (sources: WMO and our own station), inside and around the Merguellil catchment, with daily gridded data at 25 x 25 km from AGRI4CAST and 50 x 50 km from WFDEI. AGRI4CAST (Biaveti et al, 2008) is an interpolated dataset based on actual weather stations, produced by the Joint Research Centre (JRC) for the Monitoring Agricultural Resources Unit (MARS). The WFDEI second version dataset (Weedon et al, 2014) has been generated using the same methodology as the widely used WATCH Forcing Data (WFD), making use of the ERA-Interim reanalysis data. The studied meteorological variables are Rs, Tmoy, U2, P, RH and ET0, evaluated with the scores RMSE, bias and Pearson's R. For the AGRI4CAST dataset, the scores are established over different periods according to the variable, based on the overlap between the observed and interpolated data. The scores show good correlations for the observed temperatures, but with a spatial variability linked to station elevations. The measured and interpolated radiation also show good concordance, indicating good reliability. The Pearson's R scores obtained for relative humidity show a good correlation between observations and interpolations; however, the short comparison periods do not allow significant information to be obtained, and the RMSE and bias are large. Wind speed has a large negative bias at the majority of stations (a positive bias at only one). Only one station shows concordance between the datasets. The study of the data indicates that we shall have to adjust
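
    For reference, the three scores used in this comparison (RMSE, bias and Pearson's R) can be computed as below for one variable at one station; the paired daily series are invented placeholders, not data from the study.

    ```python
    import numpy as np

    def scores(observed: np.ndarray, gridded: np.ndarray) -> dict:
        """RMSE, bias (gridded minus observed) and Pearson's R for paired daily series."""
        diff = gridded - observed
        return {
            "rmse": float(np.sqrt(np.mean(diff ** 2))),
            "bias": float(np.mean(diff)),
            "pearson_r": float(np.corrcoef(observed, gridded)[0, 1]),
        }

    # Placeholder daily mean temperature (deg C): station observations vs. a gridded product.
    obs = np.array([12.1, 13.4, 15.0, 14.2, 16.8, 18.1, 17.5])
    grid = np.array([11.8, 13.9, 14.6, 14.8, 16.1, 18.6, 17.0])

    print(scores(obs, grid))
    ```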

  5. Stress aligned cracks in the upper crust of the Val d'Agri region as revealed by shear wave splitting

    NASA Astrophysics Data System (ADS)

    Pastori, M.; Piccinini, D.; Margheriti, L.; Improta, L.; Valoroso, L.; Chiaraluce, L.; Chiarabba, C.

    2009-10-01

    Shear wave splitting is measured at 19 seismic stations of a temporary network deployed in the Val d'Agri area to record low-magnitude seismic activity. The splitting results suggest the presence of an anisotropic layer between the surface and 15 km depth (i.e. above the hypocentres). The dominant fast polarization direction strikes NW-SE, parallel to the Apennines orogen, and is approximately parallel to the maximum horizontal stress in the region, as well as to major normal faults bordering the Val d'Agri basin. The normalized delay times in the study region are about 0.01 s km-1, suggesting 4.5 percent shear wave velocity anisotropy (SWVA). On the south-western flank of the basin, where most of the seismicity occurs, we found larger normalized delay times, between 0.017 and 0.02 s km-1. These high values suggest 10 percent SWVA. These parameters agree with an interpretation of seismic anisotropy in terms of the Extensive-Dilatancy Anisotropy (EDA) model, which considers the rock volume to be pervaded by fluid-saturated microcracks aligned by the active stress field. The anisotropic parameters are consistent with borehole image logs from deep exploration wells in the Val d'Agri oil field that detect pervasive fluid-saturated microcracks striking NW-SE, parallel to the maximum horizontal stress in the carbonatic reservoir. However, we cannot rule out the contribution of aligned macroscopic fractures, because the main Quaternary normal faults are parallel to the maximum horizontal stress. The strong anisotropy and the concentration of seismicity testify to active deformation along the SW flank of the basin.

  6. A Qualitative Study of Agricultural Literacy in Urban Youth: What Do Elementary Students Understand about the Agri-Food System?

    ERIC Educational Resources Information Center

    Hess, Alexander J.; Trexler, Cary J.

    2011-01-01

    Agricultural literacy of K-12 students is a national priority for both scientific and agricultural education professional organizations. Development of curricula to address this priority has not been informed by research on what K-12 students understand about the agri-food system. While students' knowledge of food and fiber system facts have been…

  7. Public and Private Agri-Environmental Regulation in Post-Socialist Economies: Evidence from the Serbian Fresh Fruit and Vegetable Sector

    ERIC Educational Resources Information Center

    Gorton, Matthew; Zaric, Vlade; Lowe, Philip; Quarrie, Steve

    2011-01-01

    Using primary survey data and interview evidence this paper analyses the implementation and enforcement of public and private environmental regulation in the Serbian Fresh Fruit and Vegetable (FFV) sector. This provides a basis for engaging in a wider debate on the nature of agri-food regulation in post-socialist economies. Depictions of the…

  8. Static yields and quality issues: Is the agri-environment program the primary driver?

    PubMed

    Peltonen-Sainio, Pirjo; Salo, Tapio; Jauhiainen, Lauri; Lehtonen, Heikki; Sieviläinen, Elina

    2015-10-01

    The Finnish agri-environmental program (AEP) has been in operation for 20 years with >90 % farmer commitment. This study aimed to establish whether reduced nitrogen (N) and phosphorus (P) use has impacted spring cereal yields and quality, based on comprehensive follow-up studies and long-term experiments. We found that the gap between genetic yield potential and attained yield has increased since the AEP was introduced. However, many contemporary changes in agricultural practices, driven by changes in prices and farm subsidies (including the AEP), were likely reasons, together with reduced N use, but not reduced P use. These overall changes in crop management coincided with stagnation or decline in yields and adverse changes in quality, but yield-removed N increased and residual N decreased. Further studies are needed to assess whether all the changes are environmentally, economically, and socially sustainable, and acceptable, in the long run. The concept of sustainable intensification is worth considering as a means to develop northern European agricultural systems that combine environmental benefits with productivity.

  9. Effort for money? Farmers' rationale for participation in agri-environment measures with different implementation complexity.

    PubMed

    Van Herzele, Ann; Gobin, Anne; Van Gossum, Peter; Acosta, Lilibeth; Waas, Tom; Dendoncker, Nicolas; Henry de Frahan, Bruno

    2013-12-15

    European agri-environment programmes are based on the common principle that farmers deliver environmental services for which society pays. Due to the voluntary nature of agri-environment measures (AEM), the issue of farmers' motives or reasons for participation has been an important topic of investigation in past years. The present paper examines farmers' rationale for participation in AEM against the backdrop of continued debate over whether to develop relatively simple measures that can be readily applied by many farmers or give greater priority to measures that are more targeted - i.e. to the specific management requirement of particular habitats or species - but are often more complex. The paper draws on empirical material from a case study in the Dyle valley, Belgium, including in-depth interviews, expert consultations and a mail survey. It was sought not only to identify and quantify the importance of separate reasons for participation, but also to reveal how these reasons and other elements of relevance were logically interrelated in the explanation that farmers themselves give for their participation. As a result, six modes or styles of participation were identified: opportunistic, calculative, compensatory, optimising, catalysing and engaged. The analyses suggest that there were notable differences in that both separate reasons for and modes of participation do vary with the complexity of the measures' requirements. Overall, the study demonstrates that participation in AEM is not simply a matter of weighing the money against the effort for adoption. Whereas money is an important driver for participation (in particular, for those adopting complex AEM) it plays widely differing roles depending on the level of farmers' reasoning (farm enterprise, single practice or landscape feature) and the importance they give to other considerations (environmental effect, production potential of land, goodness of fit, etc.). Practical implications are drawn for both policy

  10. The role of agri-environment schemes in conservation and environmental management.

    PubMed

    Batáry, Péter; Dicks, Lynn V; Kleijn, David; Sutherland, William J

    2015-08-01

    Over half of the European landscape is under agricultural management and has been for millennia. Many species and ecosystems of conservation concern in Europe depend on agricultural management and are showing ongoing declines. Agri-environment schemes (AES) are designed partly to address this. They are a major source of nature conservation funding within the European Union (EU) and the highest conservation expenditure in Europe. We reviewed the structure of current AES across Europe. Since a 2003 review questioned the overall effectiveness of AES for biodiversity, there has been a plethora of case studies and meta-analyses examining their effectiveness. Most syntheses demonstrate general increases in farmland biodiversity in response to AES, with the size of the effect depending on the structure and management of the surrounding landscape. This is important in the light of successive EU enlargement and ongoing reforms of AES. We examined the change in effect size over time by merging the data sets of 3 recent meta-analyses and found that schemes implemented after revision of the EU's agri-environmental programs in 2007 were not more effective than schemes implemented before revision. Furthermore, schemes aimed at areas out of production (such as field margins and hedgerows) are more effective at enhancing species richness than those aimed at productive areas (such as arable crops or grasslands). Outstanding research questions include whether AES enhance ecosystem services, whether they are more effective in agriculturally marginal areas than in intensively farmed areas, whether they are more or less cost-effective for farmland biodiversity than protected areas, and how much their effectiveness is influenced by farmer training and advice. The general lesson from the European experience is that AES can be effective for conserving wildlife on farmland, but they are expensive and need to be carefully designed and targeted. © 2015 The Authors. Conservation Biology

  11. Factors affecting members' evaluation of agri-business ventures' effectiveness.

    PubMed

    Hashemi, Seyyed Mahmoud; Hedjazi, Yousef

    2011-02-01

    This paper presents work to identify factors affecting the effectiveness of agri-business ventures (A-BVs), on the provider side, as perceived by their members. A survey was conducted among 95 members of A-BVs in Zanjan province, Iran. To collect data, a questionnaire was designed. Two distinct groups of A-BVs with low (group 1) and high (group 2) perceived (evaluated) levels of effectiveness were revealed. The study showed that there were significant differences between the two groups on important characteristics of A-BVs and their members. The study also found statistically significant relationships between A-BVs' governance structure and capacity, management and organization characteristics and the perceived effectiveness, whereas there was no statistically significant relationship between the advisory methods applied by A-BV members and the perceived effectiveness. Logistic regression results also showed that the level of application of rules encouraging members' active participation in important decision-making; clear terms of reference to guide contracting procedures, roles and responsibilities of the parties involved; the type of people served and geographical area of program coverage; and members' ability to use Information and Communication Technologies (ICTs) were predictors of the perceived (evaluated) effectiveness of A-BVs. The study showed that members' evaluations of the effectiveness of A-BVs are not the same. It is suggested that the Iranian public agricultural extension organization, as the organization responsible for monitoring and evaluating services delivered by A-BVs, consider these differences between members with different levels of these important variables. 2010 Elsevier Ltd. All rights reserved.
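
    As a minimal illustration of the kind of logistic regression reported here (predicting a binary 'high perceived effectiveness' group from member and venture characteristics), the sketch below uses scikit-learn on invented placeholder data; the feature names only mirror those mentioned in the abstract and the fitted coefficients mean nothing.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 95  # same order as the survey size; the data themselves are synthetic

    # Placeholder predictors (scaled 0-1): participation rules, terms of reference, ICT ability.
    X = rng.random((n, 3))
    # Synthetic binary outcome: 1 = high perceived effectiveness (group 2), 0 = low (group 1).
    logit = 2.0 * X[:, 0] + 1.5 * X[:, 1] + 1.0 * X[:, 2] - 2.2
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    model = LogisticRegression().fit(X, y)
    for name, coef in zip(["participation_rules", "terms_of_reference", "ict_ability"],
                          model.coef_[0]):
        print(f"{name}: coefficient = {coef:+.2f}")
    ```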

  12. Bio-transformation of agri-food wastes by newly isolated Neurospora crassa and Lactobacillus plantarum for egg production.

    PubMed

    Liu, P; Li, J; Deng, Z

    2016-03-01

    Using bio-transformed feedstuff is a cost-effective approach to improve egg quality and production, particularly when the nutritive diet comes from agri-food wastes. In this study, optimization of fermentation conditions and co-cultivation of Neurospora crassa with Lactobacillus plantarum were performed in a simple bioreactor. The optimized fermentation of beer lees substrates by N. crassa increased the hydrolysis rate of crude fiber to 43.27%. Compared with the use of N. crassa alone, the combination of N. crassa and L. plantarum particularly enhanced the amino acid content of oil-tea seed cake substrates (from 13,120 to 18,032 mg/100 g). When hens were fed 10% fermented oil-tea seed cake substrate, the feed-to-egg ratio decreased from 3.1 to 2.6, the egg production rate increased from 65.71 to 80.10%, and yolk color (Roche scale) increased from 8.20 to 10.20. Fifteen kinds of carotenoids were identified by HPLC in fermented oil-tea seed cake substrates. The results of this study highlight that mixed fermentation by N. crassa and L. plantarum may be an effective way to convert agri-food wastes into high-value biomass products, which could have a positive effect on hens and their eggs. © 2016 Poultry Science Association Inc.

  13. The Potential for Collaborative Agri-Environment Schemes in England: Can a Well-Designed Collaborative Approach Address Farmers' Concerns with Current Schemes?

    ERIC Educational Resources Information Center

    Emery, Steven B.; Franks, Jeremy R.

    2012-01-01

    There is increasing recognition that whilst agri-environment schemes in England have had discernable benefits, their success in relation to certain species and resources has been inhibited by the piecemeal implementation of Environmental Stewardship (ES) on the basis of single farm agreements. In this paper we examine the receptivity of farmers to…

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: BROME AGRI SALES, LTD., MAXIMIZER SEPARATOR, MODEL MAX 1016 - 03/01/WQPC-SWP

    EPA Science Inventory

    Verification testing of the Brome Agri Sales Ltd. Maximizer Separator, Model MAX 1016 (Maximizer) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The Maximizer is an inclined screen solids separator that can be used to s...

  15. Orientation of Agri-Food Companies to CSR And Consumer Perception: A Survey on Two Italian Companies.

    PubMed

    Civero, Gennaro; Rusciano, Vincenzo; Scarpato, Debora

    2018-05-07

    Corporate Social Responsibility (CSR) is the most important tool for implementing Sustainability Guidelines for Business (US20030018487A1), delivering economic, social and environmental benefits for all stakeholders, and is currently the focus of international studies and debates (US7260559B1), especially in the agri-food sector, as demonstrated by recent patents (CA2862273A1). In most agri-food businesses operating in Italy, the strategies for communicating this instrument to stakeholders have little effectiveness, since they are often not advanced. Having identified the top two food companies in the European ranking of the 6th CSR Online Awards, we analyse, through an empirical survey of consumers, the level of perception of these companies' communication of their CSR strategies. Both case studies show the presence of a sustainability orientation, the evolution of CSR tools within the companies, and their promotion and communication to all stakeholders. Despite this, respondents' level of perception of these companies' CSR communication and of the promotion of CSR instruments is not optimal. The interviewed consumers have not enriched their knowledge of CSR, owing to the lack of an adequate communication strategy in both case studies. Consequently, to raise awareness of the subject, a great deal of cooperation is still needed between public institutions, local communities, businesses and citizens. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  16. Controversial medical and agri-food biotechnology: a cultivation analysis.

    PubMed

    Bauer, Martin W

    2002-04-01

    Whether biotechnology is one or several developments is not clear. Once distinctions are required, the question is: Which one prevails? When the good, the bad, and the ugly settle, where do they fall? Evaluation implies distinction, and representation drives attitude. The controversies over biotechnology are fertile ground on which to study these issues. The imports of genetically modified (GM) soya into Europe in 1996-97 and the cloning of Dolly the sheep from adult cells in 1997 changed the symbolic environment for genetic engineering. The ensuing public controversies came to focus mainly on field trials of GM crops and food labeling. This paper will explore the relationship between quality press coverage and public perception, in particular the cultivation of the contrast between "desirable" biomedical (RED) and "undesirable" agri-food (GREEN) biotechnology in Britain. The argument draws on a systematic analysis of the British press coverage of biotechnology from 1973 to 1999 and analysis of public perceptions in 1996 and 1999. The paper concludes that the debate over GM crops and food ingredients fostered the RED-GREEN contrast among the newspaper-reading public, thereby shielding RED biotechnology from public controversy, and ushered in a realignment of the regulatory framework in 2000.

  17. A MODIS-based analysis of the Val d'Agri Oil Center (South of Italy) thermal emission: an independent gas flaring estimation strategy

    NASA Astrophysics Data System (ADS)

    Pergola, Nicola; Faruolo, Mariapia; Coviello, Irina; Filizzola, Carolina; Lacava, Teodosio; Tramutoli, Valerio

    2014-05-01

    Different kinds of atmospheric pollution affect human health and the environment at local and global scales. The petroleum industry represents one of the most important environmental pollution sources, accounting for about 18% of well-to-wheels greenhouse gas (GHG) emissions. Its main pollution source is the flaring of gas, one of the most challenging energy and environmental problems facing the world today. The World Bank has estimated that 150 billion cubic meters of natural gas are being flared annually, which is equivalent to 30% of the European Union's gas consumption. Since 2002, satellite-based methodologies have shown their capability to provide independent and reliable estimates of gas flaring emissions at both national and global scales. In this paper, for the first time, the potential of satellite data for estimating gas flaring volumes emitted from a single on-shore crude oil pre-treatment plant, i.e. the Ente Nazionale Idrocarburi (ENI) Val d'Agri Oil Center (COVA), located in the Basilicata Region (South of Italy), was assessed. Specifically, thirteen years of night-time Moderate Resolution Imaging Spectroradiometer (MODIS) data acquired in the medium and thermal infrared (MIR and TIR, respectively) bands were processed. The Robust Satellite Techniques (RST) approach was implemented to identify anomalous values of the signal under investigation (i.e. the MIR-TIR difference) associated with the COVA flare emergency discharges. Then, the Fire Radiative Power (FRP), computed for the previously identified thermal anomalies, was correlated with the gas flaring volumes available for the COVA in the period 2003-2009, defining a satellite-based regression model for estimating the volumes flared at COVA. The strategy used and the preliminary results of this analysis are described in detail in this work.

  18. Assessment of atmospheric trace element concentrations by lichen-bag near an oil/gas pre-treatment plant in the Agri Valley (southern Italy)

    NASA Astrophysics Data System (ADS)

    Caggiano, R.; Trippetta, S.; Sabia, S.

    2014-10-01

    The atmospheric concentrations of 17 trace elements (Al, Ca, Cd, Cr, Cu, Fe, K, Li, Mg, Mn, Na, Ni, P, Pb, S, Ti and Zn) were measured by means of the "lichen-bag" technique in the Agri Valley (southern Italy). The lichen samples were collected from an unpolluted site located in Rifreddo forest (southern Italy). The bags were exposed to ambient air for 6 and 12 months. The exposed-to-control (EC) ratio values showed that the lichen species used were suitable for biomonitoring investigations. The results showed that the concentrations of almost all the examined trace elements increased with respect to the control after the 6- and 12-month exposures. Furthermore, Ca, Al, Fe, K, Mg and S were the most abundant trace elements in both the 6- and 12-month-exposed samples. Moreover, principal component analysis (PCA) results highlighted that the major sources of the measured atmospheric trace elements were related both to anthropogenic contributions due to traffic, combustion processes, agricultural practices, construction and quarrying activities, and to natural contributions mainly represented by the re-suspension of local soil and road dusts. In addition, contributions both from secondary atmospheric reactions involving Centro Olio Val d'Agri (COVA) plant emissions and from long-range transport of African dust were also identified.
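
    The source apportionment step here relies on principal component analysis of the element-concentration matrix. A minimal sketch of that step, using scikit-learn on a small invented matrix of concentrations (purely illustrative, not the study's data), is given below.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    elements = ["Al", "Ca", "Fe", "Pb", "S", "Zn"]
    # Placeholder matrix: rows = exposed lichen bags, columns = element concentrations (ug/g).
    concentrations = np.array([
        [820, 4100, 610, 4.1,  900, 31],
        [760, 3900, 580, 3.8,  860, 28],
        [910, 4500, 700, 6.5, 1200, 45],
        [870, 4300, 660, 6.0, 1150, 41],
        [800, 4000, 600, 4.5,  950, 33],
    ])

    # Standardise (zero mean, unit variance) before PCA, as is usual for concentration data.
    X = StandardScaler().fit_transform(concentrations)
    pca = PCA(n_components=2)
    pca.fit(X)

    print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
    for el, load1, load2 in zip(elements, pca.components_[0], pca.components_[1]):
        print(f"{el}: PC1 loading {load1:+.2f}, PC2 loading {load2:+.2f}")
    ```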

  19. Agribusiness Educational Methods and Cooperation with Agri-Educators.

    ERIC Educational Resources Information Center

    Ubadigbo, Fidelis N.; Gamon, Julia A.

    1988-01-01

    This study explored the educational strategies used by livestock feed, chemical/fertilizer, and seed businesses (n=79) in Iowa. Agribusinesses used a variety of educational methods to reach their clientele. Farm visits and meetings were frequently used by all. (JOW)

  20. A new MODIS based approach for gas flared volumes estimation: the case of the Val d'Agri Oil Center (Southern Italy)

    NASA Astrophysics Data System (ADS)

    Lacava, T.; Faruolo, M.; Coviello, I.; Filizzola, C.; Pergola, N.; Tramutoli, V.

    2014-12-01

    Gas flaring is one of the most controversial energy and environmental issues the Earth is facing, and it also contributes to global warming and climate change. According to the World Bank, about 150 billion cubic meters of gas are flared globally each year, equivalent to the annual gas use of Italy and France combined. In addition, about 400 million tons of CO2 (about 1.2% of global CO2 emissions) are added to the atmosphere annually. Efforts to evaluate the impact of flaring on the surrounding environment are hampered by the lack of official information on flare locations and volumes. Suitable satellite-based techniques could offer a solution to this problem through the detection and subsequent mapping of flare locations as well as the estimation of gas emissions. In this paper, a new methodological approach based on the Robust Satellite Techniques (RST), a multi-temporal scheme of satellite data analysis, was developed to analyze and characterize the flaring activity of the largest Italian gas and oil pre-treatment plant (ENI-COVA), located in Val d'Agri (Basilicata). For this site, located in an anthropized area characterized by large environmental complexity, flaring emissions are mainly related to emergency conditions (i.e. waste flaring), since the industrial process is regulated by strict regional laws. With reference to the peculiar characteristics of COVA flaring, the RST approach was implemented on 13 years of EOS-MODIS (Earth Observing System - Moderate Resolution Imaging Spectroradiometer) infrared data to detect COVA-related thermal anomalies and to develop a regression model for gas flared volume estimation. The methodological approach, the whole processing chain and the preliminary results are presented and discussed in this paper. In addition, the possible implementation of the proposed approach on the data acquired by the SUOMI NPP - VIIRS (National Polar-orbiting Partnership - Visible Infrared Imaging

  1. AgriSense-STARS: Advancing Methods of Agricultural Monitoring for Food Security in Smallholder Regions - the Case for Tanzania

    NASA Astrophysics Data System (ADS)

    Dempewolf, J.; Becker-Reshef, I.; Nakalembe, C. L.; Tumbo, S.; Maurice, S.; Mbilinyi, B.; Ntikha, O.; Hansen, M.; Justice, C. J.; Adusei, B.; Kongo, V.

    2015-12-01

    In-season monitoring of crop conditions provides critical information for agricultural policy and decision making and, most importantly, for food security planning and management. Nationwide agricultural monitoring in countries dominated by smallholder farming systems generally relies on extensive networks of field data collectors. In Tanzania, extension agents make up this network and report on conditions across the country, approaching a "near-census". Data are collected on paper, which is resource- and time-intensive as well as prone to errors. Data quality is ambiguous and there is a general lack of clear and functional feedback loops between farmers, extension agents, analysts and decision makers. Moreover, the data are not spatially explicit, limiting their usefulness for analysis and the quality of policy outcomes. Despite significant advances in remote sensing and information communication technologies (ICT) for monitoring agriculture, the full potential of these new tools is yet to be realized in Tanzania. Their use is constrained by the lack of resources, skills and infrastructure to access and process these data. The use of ICT technologies for data collection, processing and analysis is equally limited. The AgriSense-STARS project is developing and testing a system for national-scale in-season monitoring of smallholder agriculture using a combination of three main tools: 1) GLAM-East Africa, an automated MODIS satellite image processing system; 2) field data collection using GeoODK and unmanned aerial vehicles (UAVs); and 3) the Tanzania Crop Monitor, a collaborative online portal for data management and reporting. These tools are developed and applied in Tanzania through the National Food Security Division of the Ministry of Agriculture, Food Security and Cooperatives (MAFC), within a statistically representative sampling framework (area frame) that ensures data quality, representativeness and resource efficiency.

  2. Assessment of atmospheric trace element concentrations by lichen-bag near an oil/gas pre-treatment plant in the Agri Valley (southern Italy)

    NASA Astrophysics Data System (ADS)

    Caggiano, R.; Trippetta, S.; Sabia, S.

    2015-02-01

    The atmospheric concentrations of 17 trace elements (Al, Ca, Cd, Cr, Cu, Fe, K, Li, Mg, Mn, Na, Ni, P, Pb, S, Ti and Zn) were measured by means of the "lichen-bag" technique in the Agri Valley (southern Italy). The lichen samples were collected from an unpolluted site located in Rifreddo forest (southern Italy), about 30 km north of the study area. The bags were exposed to ambient air for 6 and 12 months. The exposed-to-control (EC) ratio values showed that the lichen species used were suitable for biomonitoring investigations. The results showed that the concentrations of almost all the examined trace elements increased with respect to the control after the 6- and 12-month exposures. Furthermore, Ca, Al, Fe, K, Mg and S were the most abundant trace elements in both the 6-month and 12-month-exposed samples. Moreover, principal component analysis (PCA) results highlighted that the major sources of the measured atmospheric trace elements were related both to anthropogenic contributions due to traffic, combustion processes, agricultural practices, construction and quarrying activities, and to natural contributions mainly represented by the re-suspension of local soil and road dusts. In addition, contributions both from secondary atmospheric reactions involving Centro Olio Val d'Agri (COVA) plant emissions and from long-range transport of African dust were also identified.

  3. Agri-environmental grass hay: nutritive value and intake in comparison with hay from intensively managed grassland.

    PubMed

    Fiems, L O; De Boever, J L; De Vliegher, A; Vanacker, J M; De Brabander, D L; Carlier, L

    2004-06-01

    Chemical composition, digestibility, nutritive value and intake of hay from an agri-environmental management (EH) were compared with those from hay (Lolium perenne) from an intensive management (IH). IH was of low to moderate quality because of unfavourable weather conditions. EH was harvested mid-June of 2000 (EH1) and 2001 (EH2) on the same sward that had not received mineral fertilizer for 10 years. The EH was characterized by a species-rich botanical composition. On average, it had lower contents of protein (32%), NDF (9%) and ash (35%), and a higher concentration of water-soluble carbohydrates (117%) than IH. Digestibility of dry and organic matter, determined with sheep, was not different between IH and EH and averaged 59 and 63%, respectively. Crude fibre and NDF digestibility were lower in EH (58 and 57%, respectively) than in IH (70 and 69%, respectively). Net energy value for lactation did not differ between IH and EH and amounted to 4.78 MJ per kg DM. True protein digested in the small intestine and rumen degraded protein balance were lower in EH (63 and -60 g per kg DM) than in IH (71 and -33 g per kg DM). Intake of hay was investigated in Holstein-Friesian heifers and Belgian Blue double-muscled heifers (mean BW 280 +/- 22 kg and 269 +/- 21 kg, respectively), and in Belgian Blue non-lactating and non-pregnant double-muscled cows (initial BW 642 +/- 82 kg), using a cross-over design. Hay was freely available. It was supplemented with 1 kg concentrate daily. Dry matter intake from hay was higher for EH than for IH in heifers (4% and 13%, respectively in Holstein-Friesian and Belgian Blue heifers) and in cows (22%). Hay from an agri-environmental management may be used for low-performing animals, as energy intake only exceeded maintenance requirements by 20 to 35%. Several characteristics of EH were different between years, such as dry matter digestibility, net energy value for lactation and fermentable organic matter content.

  4. Dyella agri sp. nov., isolated from reclaimed grassland soil.

    PubMed

    Chaudhary, Dhiraj Kumar; Kim, Jaisoo

    2017-10-01

    A novel strain, DKC-1 T , was isolated from reclaimed grassland soil and characterized taxonomically by a polyphasic approach. Strain DKC-1 T was a Gram-staining-negative, light-yellow-coloured, rod-shaped bacterium, motile by means of a polar flagellum. It was able to grow at 20-37 °C, at pH 4.5-9.0 and in the presence of 0-3 % (w/v) NaCl. Based on 16S rRNA gene sequence analysis, strain DKC-1 T formed a clade within the genus Dyella and showed the highest sequence similarities to Dyella japonica XD53 T (98.36 %), Rhodanobacter aciditrophus sjH1 T (97.92 %), Rhodanobacter koreensis THG-DD7 T (97.74 %), Dyella kyungheensis THG-B117 T (97.65 %) and Rhodanobacter terrae GP18-1 T (97.40 %). The only respiratory quinone was ubiquinone-8. The major polar lipids were phosphatidylethanolamine, phosphatidylglycerol, diphosphatidylglycerol and phosphatidyl-N-methylethanolamine. The predominant fatty acids of strain DKC-1 T were iso-C16 : 0, iso-C15 : 0, summed feature 9 (iso-C17 : 1ω9c and/or C16 : 0 10-methyl), iso-C17 : 0, iso-C11 : 0 3-OH and iso-C11 : 0. The genomic DNA G+C content of the novel strain was 63.1 mol%. The DNA-DNA relatedness between strain DKC-1 T and its reference strains (D. japonica XD53 T , R. aciditrophus sjH1 T , R. koreensis THG-DD7 T , D. kyungheensis THG-B117 T and R. terrae GP18-1 T ) was 52.3, 44.7, 38.7, 49.0 and 32.7 %, respectively, all below the 70 % threshold used for species delineation and thus supporting the distinctness of the strain. The morphological, physiological, chemotaxonomic and phylogenetic analyses clearly distinguished this strain from its closest phylogenetic neighbours. Thus, strain DKC-1 T represents a novel species of the genus Dyella, for which the name Dyella agri sp. nov. is proposed. The type strain is DKC-1 T (=KEMB 9005-571 T =KACC 19176 T =JCM 31925 T ).

  5. Non-Double-Couple Component Analysis of Induced Microearthquakes in the Val D'Agri Basin (Italy)

    NASA Astrophysics Data System (ADS)

    Roselli, P.; Improta, L.; Saccorotti, G.

    2017-12-01

    In recent years it has become accepted that earthquake sources can exhibit significant non-double-couple (NDC) components. Among the factors driving deviations from pure double-couple (DC) mechanisms are the opening/closing of fracture networks and the activation of pre-existing faults by pore-fluid pressure perturbations. This makes a thorough analysis of source mechanisms of key importance for understanding seismicity induced by withdrawal/injection in geothermal and hydrocarbon reservoirs, as well as seismicity induced by water reservoirs. In addition to the DC component, the seismic moment tensor can be decomposed into isotropic (ISO) and compensated linear vector dipole (CLVD) components. In this study we performed a careful analysis of the seismic moment tensors of induced microseismicity recorded in the Val d'Agri (Southern Apennines, Italy), focusing our attention on the NDC component. The Val d'Agri is a Quaternary extensional basin that hosts the largest onshore European oil field and a water reservoir (Pertusillo Lake impoundment) characterized by severe seasonal level oscillations. Our input data set includes swarm-type induced microseismicity recorded in 2005-2006 by a high-performance network and accurately located by a reservoir-scale local earthquake tomography. We analyze two different seismicity clusters: (i) a swarm of 69 earthquakes with 0.3 ≤ ML ≤ 1.8 induced by a wastewater disposal well of the oilfield during the initial daily injection tests (10 days); (ii) 526 earthquakes with -0.2 ≤ ML ≤ 2.7 induced by seasonal volume changes of the artificial lake. We perform the seismic moment tensor inversion using the HybridMT code. After a careful signal-to-noise selection and manual picking of P-pulses, we obtain %DC, %ISO and %CLVD for each event. DC and NDC components are analyzed and compared with the spatio-temporal distribution of seismicity, the local stress field, the injection parameters and the water
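
    The %DC, %ISO and %CLVD values referred to above come from a standard decomposition of the seismic moment tensor. Below is a minimal sketch of one common eigenvalue-based convention (after Vavryčuk); the input tensor is invented and the HybridMT inversion itself is not reproduced.

    ```python
    # Minimal sketch: decompose a seismic moment tensor into ISO, CLVD and DC
    # percentages (eigenvalue convention after Vavryčuk). The input tensor is a
    # made-up example, not data from the Val d'Agri catalogue.
    import numpy as np

    def decompose(M):
        """M: 3x3 symmetric moment tensor. Returns (%ISO, %CLVD, %DC)."""
        eig = np.sort(np.linalg.eigvalsh(M))[::-1]          # M1 >= M2 >= M3
        m_iso = eig.sum() / 3.0
        m_clvd = (2.0 / 3.0) * (eig[0] + eig[2] - 2.0 * eig[1])
        m_dc = 0.5 * (eig[0] - eig[2] - abs(eig[0] + eig[2] - 2.0 * eig[1]))
        scale = abs(m_iso) + abs(m_clvd) + m_dc
        return tuple(100.0 * x / scale for x in (m_iso, m_clvd, m_dc))

    # nearly pure normal-faulting DC with a small volumetric (opening) component
    M = np.array([[1.05, 0.00,  0.00],
                  [0.00, 0.10,  0.00],
                  [0.00, 0.00, -0.95]])
    iso, clvd, dc = decompose(M)
    print(f"ISO {iso:.1f}%  CLVD {clvd:.1f}%  DC {dc:.1f}%")
    ```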

  6. [Residential cohort study on mortality and hospitalization in Viggiano and Grumento Nova Municipalities in the framework of HIA in Val d'Agri (Basilicata Region, Southern Italy)].

    PubMed

    Minichilli, Fabrizio; Bianchi, Fabrizio; Ancona, Carla; Cervino, Marco; De Gennaro, Gianluigi; Mangia, Cristina; Santoro, Michele; Bustaffa, Elisa

    2018-01-01

    To evaluate the associations between the emissions of the "Centro Olio Val d'Agri" (COVA) plant and the mortality and hospitalization of residents of the Viggiano and Grumento Nova Municipalities, located in Val d'Agri (Basilicata Region, Southern Italy). Residential cohort study. Lagrangian dispersion models were used to estimate exposure at the residential address to NOX concentrations taken as a tracer of COVA emissions. Individual exposure was classified according to the tertiles of the NOX distribution, and a Cox model analysis was performed (hazard ratio, HR, trend with relative 95%CI). The association between exposure to NOX and cohort mortality/hospitalization was evaluated taking into account age, socioeconomic status, and distance from the high-traffic-density road. The cohort included 6,795 residents (73,270 person-years) in the period 2000-2014. The outcomes considered were causes of mortality and hospitalization due to cardio-respiratory diseases recognised as associated with air pollution, with short-to-medium latency, consistent with the period of operation of the COVA plant. Increasing trends across the three exposure classes were observed for mortality due to circulatory system diseases (HR trend: 1.19; 95%CI 1.02-1.39), stronger in women (HR trend: 1.19; 95%CI 1.02-1.39). Hospitalization results show an increased risk for respiratory diseases (HR trend: 1.12; 95%CI 1.01-1.25) and, among women, for diseases of the circulatory system (HR trend: 1.19; 95%CI 1.03-1.38), ischemic diseases (HR trend: 1.33; 95%CI 1.02-1.74) and respiratory diseases (HR trend: 1.22; 95%CI 1.03-1.46). The excesses of mortality and hospitalization observed in the areas most exposed to pollutants of industrial origin are relevant for preventive action. It is recommended to define and implement a surveillance system for the entire resident population, based on indicators of environmental pollution and related health outcomes, drawing on the scientific literature and on the results of the present study.

  7. A new method for studying population genetics of cyst nematodes based on Pool-Seq and genomewide allele frequency analysis.

    PubMed

    Mimee, Benjamin; Duceppe, Marc-Olivier; Véronneau, Pierre-Yves; Lafond-Lapalme, Joël; Jean, Martine; Belzile, François; Bélair, Guy

    2015-11-01

    Cyst nematodes are important agricultural pests responsible for billions of dollars of losses each year. Plant resistance is the most effective management tool, but it requires close monitoring of population genetics. Current technologies for pathotyping and genotyping cyst nematodes are time-consuming, expensive and imprecise. In this study, we capitalized on the reproduction mode of cyst nematodes to develop a simple population genetic analysis pipeline based on genotyping-by-sequencing and Pool-Seq. This method yielded thousands of SNPs and allowed us to study the relationships between populations of different origins or pathotypes. Validation of the method on well-characterized populations also demonstrated that it was a powerful and accurate tool for population genetics. The genomewide allele frequencies of 23 populations of golden nematode, from nine countries and representing the five known pathotypes, were compared. A clear separation of the pathotypes and fine genetic relationships between and among global populations were obtained using this method. In addition to being powerful, this tool has proven to be very time- and cost-efficient and could be applied to other cyst nematode species. © 2015 Her Majesty the Queen in Right of Canada. Molecular Ecology Resources © 2015 John Wiley & Sons Ltd. Reproduced with the permission of the Minister of Agriculture and Agri-Food.
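
    As a rough illustration of the genomewide allele-frequency comparison at the heart of such a pipeline, the sketch below computes per-SNP allele frequencies from pooled read counts and a crude distance between two population pools. The counts are invented, and real analyses would use FST-type statistics on alignment-derived counts rather than this simplified distance.

    ```python
    # Minimal sketch (invented counts): allele frequencies from pooled sequencing
    # (Pool-Seq) and a simple pairwise distance between two population pools.
    import numpy as np

    # rows = SNPs, columns = (reference reads, alternative reads) per population pool
    pop_a = np.array([[30, 10], [12, 28], [40, 0], [22, 18]])
    pop_b = np.array([[15, 25], [10, 30], [35, 5], [ 5, 35]])

    def alt_freq(counts):
        depth = counts.sum(axis=1)
        return counts[:, 1] / depth            # alternative-allele frequency per SNP

    fa, fb = alt_freq(pop_a), alt_freq(pop_b)
    # Euclidean distance over allele frequencies as a crude relatedness measure.
    print(np.round(fa, 2), np.round(fb, 2), round(float(np.linalg.norm(fa - fb)), 3))
    ```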

  8. Ethnobotanical survey of wild food plants traditionally collected and consumed in the Middle Agri Valley (Basilicata region, southern Italy).

    PubMed

    Sansanelli, Sabrina; Ferri, Maura; Salinitro, Mirko; Tassoni, Annalisa

    2017-09-06

    This research was carried out in a scarcely populated area of the Middle Agri Valley (Basilicata region, southern Italy). The aim of the study was to record local knowledge on the traditional uses of wild food plants, as well as to collect information regarding the practices (gathering, processing and cooking) and the medicinal uses related to these plants. Fifty-eight people still possessing traditional local knowledge (TLK), 74% women and 26% men, were interviewed between May and August 2012 and in January 2013, using open and semi-structured ethnobotanical interviews. For each described plant species, the botanical family, the Italian common and folk names, the plant parts used, the culinary preparation and, when present, the medicinal use were recorded, and the relative frequency of citation (RFC) index was determined. The 52 plant species mentioned by the respondents belong to 23 botanical families, with Asteraceae (12 plants) and Rosaceae (7 plants) being the most frequently cited. The species with the highest RFC index is Cichorium intybus L. (0.95), followed by Sonchus spp. (S. oleraceus L., S. asper L. and S. arvensis L.) (0.76). The plant parts preferably used are leaves (22 plants), fruits (12) and stems (7). Only six wild plants were indicated as having both a food use and a therapeutic effect. The survey on the traditional use of wild food plants in the Middle Agri Valley revealed that this cultural heritage is only partially retained by the population. Over the last few decades, this knowledge has in fact been disappearing quickly along with its holders and, even in the rural context of the study area, it is less and less frequently handed down to younger generations. Nevertheless, the data also revealed that the use of wild plants is being re-evaluated in ways closely related to local habits and traditions.
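
    The RFC index is simply the number of informants citing a species divided by the total number of informants. The sketch below back-calculates plausible citation counts from the reported index values; the counts themselves are assumptions made for illustration.

    ```python
    # Minimal sketch: relative frequency of citation (RFC), i.e. informants citing
    # a species divided by total informants (58 in this survey). Citation counts
    # are invented so that the first two values echo the reported 0.95 and 0.76.
    informants_total = 58
    citations = {"Cichorium intybus": 55, "Sonchus spp.": 44, "Borago officinalis": 20}

    rfc = {species: n / informants_total for species, n in citations.items()}
    for species, value in sorted(rfc.items(), key=lambda kv: -kv[1]):
        print(f"{species}: RFC = {value:.2f}")
    ```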

  9. Integration of experimental and computational methods for identifying geometric, thermal and diffusive properties of biomaterials

    NASA Astrophysics Data System (ADS)

    Weres, Jerzy; Kujawa, Sebastian; Olek, Wiesław; Czajkowski, Łukasz

    2016-04-01

    Knowledge of the physical properties of biomaterials is important in understanding and designing agri-food and wood processing operations. In the study presented in this paper, computational methods were developed and combined with experiments to enhance the identification of agri-food and forest product properties and to predict heat and water transport in such products. The methods were based on a finite element model of heat and water transport supplemented with experimental data. Algorithms were proposed for image processing, geometry meshing, and inverse/direct finite element modelling. The resulting software system was composed of integrated subsystems for 3D geometry data acquisition and mesh generation, for 3D geometry modelling and visualization, and for inverse/direct computations of the heat and water transport processes. Auxiliary packages were developed to assess performance, accuracy and unification of data access. The software was validated by identifying selected properties, using the estimated values to predict the examined processes, and comparing the predictions to experimental data. The geometry, thermal conductivity, specific heat, coefficient of water diffusion, equilibrium water content and convective heat and water transfer coefficients in the boundary layer were analysed. The estimated values, used as input for simulation of the examined processes, enabled a reduction in the uncertainty associated with predictions.
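
    The inverse/direct loop described above can be illustrated with a much-reduced example: a direct transport solve wrapped in a parameter search that matches a measured value. The sketch below uses a 1D explicit finite-difference heat-conduction model as a stand-in for the paper's finite element formulation, with invented material values and an invented measurement.

    ```python
    # Minimal sketch: direct 1D transient heat-conduction solve (explicit finite
    # differences as a stand-in for a finite element model) plus a crude inverse
    # step that scans thermal diffusivity to match a "measured" centre temperature.
    # All numbers are invented for illustration.
    import numpy as np

    def simulate(alpha, n=51, length=0.02, t_end=60.0, t0=20.0, t_surf=80.0):
        """Return centre temperature after t_end seconds for diffusivity alpha [m^2/s]."""
        dx = length / (n - 1)
        dt = 0.4 * dx**2 / alpha                  # stable explicit time step
        u = np.full(n, t0)
        for _ in range(int(t_end / dt)):
            u[0] = u[-1] = t_surf                 # fixed boundary temperature
            u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
        return u[n // 2]

    measured_centre_temp = 47.0                   # hypothetical experimental observation
    candidates = np.linspace(5e-8, 5e-7, 20)      # candidate thermal diffusivities
    errors = [abs(simulate(a) - measured_centre_temp) for a in candidates]
    best = candidates[int(np.argmin(errors))]
    print(f"estimated thermal diffusivity = {best:.2e} m^2/s")
    ```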

  10. IMAA (Integrated Measurements of Aerosol in Agri valley) campaign: Multi-instrumental observations at the largest European oil/gas pre-treatment plant area

    NASA Astrophysics Data System (ADS)

    Calvello, Mariarosaria; Caggiano, Rosa; Esposito, Francesco; Lettino, Antonio; Sabia, Serena; Summa, Vito; Pavese, Giulia

    2017-11-01

    A short-term intensive multi-instrumental measurement campaign (Integrated Measurements of Aerosol in Agri valley - IMAA) was carried out near the largest European oil and gas pre-treatment plant (Centro Olio Val d'Agri - COVA), in a populated area where a comprehensive characterization of the aerosol load has so far been missing. Between 2 and 17 July 2013, a suite of instruments was used to analyse the physical, chemical, morphological and optical properties of aerosol at this distinctive site, both at ground level and over the atmospheric column, including an investigation of the mixing and transformation of particles. The observation of slag silicates with a rough surface texture is consistent with the presence of oil-related activities, which represent the only industrial activity in the area. Desulfurization/sulfur liquefaction processes occurring at COVA can explain the peculiar morphology of calcium-sodium-aluminum particles. The common COVA source was associated with high concentrations of sulfur, nickel and zinc, and with significant correlations between zinc and sulfur and between zinc and nickel. The Optical Particle Sizer (OPS) data, hygroscopicity and optical properties of atmospheric aerosol are consistent with the typical oil-derived gaseous emissions (e.g. sulfur dioxide and methane) that strongly influence the mixing state of particles and their size distributions. Continuous combustion processes at COVA were found to be responsible for the Equivalent Black Carbon (EBC) concentrations and for a relevant contribution to the total number of fine particles. The expected significant contribution of the WS (water-soluble) and BC (Black Carbon) components to the total Aerosol Optical Depth (AOD) is consistent with the results from the radiometric model, especially for July 3 and 16.

  11. Effectiveness of incentives for agri-environment measure in Mediterranean degraded and eroded vineyards

    NASA Astrophysics Data System (ADS)

    Galati, Antonino; Gristina, Luciano; Crescimanno, Maria; Barone, Ettore; Novara, Agata

    2015-04-01

    The evaluation of the economic damage caused by soil erosion is of great importance: it serves to increase awareness of the problem among farmers and policy makers, and it can promote the implementation of conservation measures at the field and basin level by spurring the development of more sustainable soil management practices. In the present study we developed a new approach to evaluate the incentive for the adoption of an Agri-Environment Measure (AEM) in Mediterranean degraded and eroded vineyards. In order to estimate this incentive, the replacement cost and the loss of income are calculated under two different soil management practices, Conventional Tillage (CT) and cover crop (AEM). Our findings show that the incentive should range between the loss of income due to AEM adoption and the ecosystem service benefit (RC_CT - RC_AEM, the difference in replacement costs). In the case study, the incentive ranged between 315 € ha-1 (loss of income) and 1,087.86 € ha-1 (ecosystem service benefit). Within this range, the incentive amount is determined according to efficiency criteria, taking into account the morphological conditions of the territory in which the farms operate. Moreover, a conceptual model of public spending efficiency was developed to allocate the incentives where the economic return in terms of ecosystem services is higher.
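
    A tiny numerical sketch of the incentive bounds follows. The loss-of-income and benefit figures echo the abstract, while the split of the replacement costs between the two managements is an assumption made only so that their difference reproduces the reported 1,087.86 € ha-1.

    ```python
    # Minimal sketch: bounding the agri-environment incentive between the farmer's
    # loss of income from adopting the measure and the ecosystem-service benefit
    # (difference in soil replacement cost, RC_CT - RC_AEM). Replacement-cost
    # values are assumptions chosen for illustration.
    loss_of_income_eur_ha = 315.0            # cost to the farmer of adopting cover crops
    rc_conventional_eur_ha = 1250.0          # hypothetical replacement cost under tillage
    rc_cover_crop_eur_ha = 162.14            # hypothetical replacement cost under cover crop

    ecosystem_benefit = rc_conventional_eur_ha - rc_cover_crop_eur_ha   # 1087.86 EUR/ha
    lower, upper = loss_of_income_eur_ha, ecosystem_benefit
    print(f"incentive should fall between {lower:.2f} and {upper:.2f} EUR/ha")
    ```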

  12. Choice of mineral fertilizer substitution principle strongly influences LCA environmental benefits of nutrient cycling in the agri-food system.

    PubMed

    Hanserud, Ola Stedje; Cherubini, Francesco; Øgaard, Anne Falk; Müller, Daniel B; Brattebø, Helge

    2018-02-15

    Increased nutrient cycling in the agri-food system is a way to achieve a healthier nutrient stewardship and more sustainable food production. In life cycle assessment (LCA) studies, use of recycled fertilizer products is often credited by the substitution method, which subtracts the environmental burdens associated with avoided production of mineral fertilizer from the system under study. The environmental benefits from avoided fertilizer production can make an important contribution to the results, but different calculation principles and often implicit assumptions are used to estimate the amount of avoided mineral fertilizer. This may hinder comparisons between studies. The present study therefore examines how the choice of substitution principles influences LCA results. Three different substitution principles, called one-to-one, maintenance, and adjusted maintenance, are identified, and we test the importance of these in a case study on cattle slurry management. We show that the inventory of avoided mineral fertilizer varies greatly when the different principles are applied, with strong influences on two-thirds of LCA impact categories. With the one-to-one principle, there is a risk of systematically over-estimating the environmental benefits from nutrient cycling. In a sensitivity analysis we show that the difference between the principles is closely related to the application rate and levels of residual nutrients in the soil. We recommend that LCA practitioners first and foremost state and justify the substitution method they use, in order to increase transparency and comparability with other studies. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. A Decision Support System Coupling Fuzzy Logic and Probabilistic Graphical Approaches for the Agri-Food Industry: Prediction of Grape Berry Maturity

    PubMed Central

    Brousset, Jean Marie; Abbal, Philippe; Guillemin, Hervé; Perret, Bruno; Goulet, Etienne; Guerin, Laurence; Barbeau, Gérard; Picque, Daniel

    2015-01-01

    Agri-food is one of the most important industrial sectors and a major contributor to global warming potential in Europe. Sustainability issues pose a huge challenge for this sector. In this context, a key issue is being able to predict the multiscale dynamics of these systems using computing science. A robust predictive mathematical tool is implemented for this sector and applied to the wine industry; it can easily be generalized to other applications. Grape berry maturation relies on complex, coupled physicochemical and biochemical reactions which are climate dependent. Moreover, one experiment represents one year, so climate variability cannot be covered by experiments alone. Consequently, harvest decisions mostly rely on expert predictions. A major challenge for the wine industry is nevertheless to be able to anticipate these reactions for sustainability purposes. We propose a decision support system called FGRAPEDBN that is able to (1) capitalize on the available heterogeneous, fragmented knowledge, including data and expertise, and (2) predict sugar (resp. acidity) concentrations with a relevant RMSE of 7 g/l (resp. 0.44 g/l and 0.11 g/kg). FGRAPEDBN is based on a coupling between a probabilistic graphical approach and a fuzzy expert system. PMID:26230334

  14. The environmental balance of the Alta Val d'Agri: a contribution to the evaluation of the industrial risk and strategic sustainable development

    NASA Astrophysics Data System (ADS)

    Loperte, S.; Cosmi, C.

    2015-09-01

    This study presents the preliminary environmental balance of the Alta Val d'Agri (Basilicata Region, Southern Italy), an area of great naturalistic interest characterized by the presence of huge oil and gas fields. The Driving Forces-Pressure-State-Impact-Responses (DPSIR) methodology was used to outline the background in terms of environmental impacts mainly caused by oil extraction activities, as well as potential existing responses. The study aims at providing stakeholders with an exhaustive framework to identify the existing data, the main sources of pollution, their potential impacts, the associated industrial risks and the existing policy strategies. Moreover, the DPSIR approach allows the identification of the vulnerable areas and the definition of targeted actions for a sustainable development of the area.

  15. Assessing the environmental performance of English arable and livestock holdings using data from the Farm Accountancy Data Network (FADN).

    PubMed

    Westbury, D B; Park, J R; Mauchline, A L; Crane, R T; Mortimer, S R

    2011-03-01

    Agri-environment schemes (AESs) have been implemented across EU member states in an attempt to reconcile agricultural production methods with protection of the environment and maintenance of the countryside. To determine the extent to which such policy objectives are being fulfilled, participating countries are obliged to monitor and evaluate the environmental, agricultural and socio-economic impacts of their AESs. However, few evaluations measure precise environmental outcomes and critically, there are no agreed methodologies to evaluate the benefits of particular agri-environmental measures, or to track the environmental consequences of changing agricultural practices. In response to these issues, the Agri-Environmental Footprint project developed a common methodology for assessing the environmental impact of European AES. The Agri-Environmental Footprint Index (AFI) is a farm-level, adaptable methodology that aggregates measurements of agri-environmental indicators based on Multi-Criteria Analysis (MCA) techniques. The method was developed specifically to allow assessment of differences in the environmental performance of farms according to participation in agri-environment schemes. The AFI methodology is constructed so that high values represent good environmental performance. This paper explores the use of the AFI methodology in combination with Farm Business Survey data collected in England for the Farm Accountancy Data Network (FADN), to test whether its use could be extended for the routine surveillance of environmental performance of farming systems using established data sources. Overall, the aim was to measure the environmental impact of three different types of agriculture (arable, lowland livestock and upland livestock) in England and to identify differences in AFI due to participation in agri-environment schemes. However, because farm size, farmer age, level of education and region are also likely to influence the environmental performance of a

  16. Ecological restoration of farmland: progress and prospects.

    PubMed

    Wade, Mark R; Gurr, Geoff M; Wratten, Steve D

    2008-02-27

    Sustainable agricultural practices in conjunction with ecological restoration methods can reduce the detrimental effects of agriculture. The Society for Ecological Restoration International has produced generic guidelines for conceiving, organizing, conducting and assessing ecological restoration projects. Additionally, there are now good conceptual frameworks, guidelines and practical methods for developing ecological restoration programmes that are based on sound ecological principles and supported by empirical evidence and modelling approaches. Restoration methods must also be technically achievable and socially acceptable and spread over a range of locations. It is important to reconcile differences between methods that favour conservation and those that favour economic returns, to ensure that conservation efforts are beneficial for both landowners and biodiversity. One option for this type of mutual benefit is the use of agri-environmental schemes to provide financial incentives to landholders in exchange for providing conservation services and other benefits. However, further work is required to define and measure the effectiveness of agri-environmental schemes. The broader potential for ecological restoration to improve the sustainability of agricultural production while conserving biodiversity in farmscapes and reducing external costs is high, but there is still much to learn, particularly for the most efficient use of agri-environmental schemes to change land use practice.

  17. A Review on the Rising Prevalence of International Standards: Threats or Opportunities for the Agri-Food Produce Sector in Developing Countries, with a Focus on Examples from the MENA Region.

    PubMed

    Faour-Klingbeil, Dima; Todd, Ewen C D

    2018-03-03

    Food safety standards are a necessity to protect consumers' health in today's growing global food trade. A number of studies have suggested safety standards can interrupt trade, bringing financial and technical burdens on small as well as large agri-food producers in developing countries. Other examples have shown that economical extension, key intermediaries, and funded initiatives have substantially enhanced the capacities of growers in some countries of the Middle East and North Africa (MENA) region to meet the food safety and quality requirements, and improve their access to international markets. These endeavors often compensate for the weak regulatory framework, but do not offer a sustainable solution. There is a big gap in the food safety level and control systems between countries in the MENA region and those in the developed nations. This certainly has implications for the safety of fresh produce and agricultural practices, which hinders any progress in their international food trade. To overcome the barriers of legal and private standards, food safety should be a national priority for sustainable agricultural development in the MENA countries. Local governments have a primary role in adopting the vision for developing and facilitating the implementation of their national Good Agricultural Practices (GAP) standards that are consistent with the international requirements and adapted to local policies and environment. Together, the public and private sector's support are instrumental to deliver the skills and infrastructure needed for leveraging the safety and quality level of the agri-food chain.

  18. A Review on the Rising Prevalence of International Standards: Threats or Opportunities for the Agri-Food Produce Sector in Developing Countries, with a Focus on Examples from the MENA Region

    PubMed Central

    Faour-Klingbeil, Dima

    2018-01-01

    Food safety standards are a necessity to protect consumers’ health in today’s growing global food trade. A number of studies have suggested safety standards can interrupt trade, bringing financial and technical burdens on small as well as large agri-food producers in developing countries. Other examples have shown that economical extension, key intermediaries, and funded initiatives have substantially enhanced the capacities of growers in some countries of the Middle East and North Africa (MENA) region to meet the food safety and quality requirements, and improve their access to international markets. These endeavors often compensate for the weak regulatory framework, but do not offer a sustainable solution. There is a big gap in the food safety level and control systems between countries in the MENA region and those in the developed nations. This certainly has implications for the safety of fresh produce and agricultural practices, which hinders any progress in their international food trade. To overcome the barriers of legal and private standards, food safety should be a national priority for sustainable agricultural development in the MENA countries. Local governments have a primary role in adopting the vision for developing and facilitating the implementation of their national Good Agricultural Practices (GAP) standards that are consistent with the international requirements and adapted to local policies and environment. Together, the public and private sector’s support are instrumental to deliver the skills and infrastructure needed for leveraging the safety and quality level of the agri-food chain. PMID:29510498

  19. [Recommendations from a health impact assessment in Viggiano and Grumento Nova (Basilicata Region, Southern Italy)].

    PubMed

    Linzalone, Nunzia; Bianchi, Fabrizio; Cervino, Marco; Cori, Liliana; De Gennaro, Gianluigi; Mangia, Cristina; Bustaffa, Elisa

    2018-01-01

    In Europe, Health Impact Assessment (HIA) is a consolidated practice aimed at predicting the health impacts of plans and projects subject to authorization procedures, thereby supporting their preparation. In Italy, further developments are needed to harmonize the practice and consolidate methodologies in order to extend the application of HIA to different fields. The recent HIA conducted in Val d'Agri (Basilicata) on the impacts of a crude-oil first-treatment plant represents an opportunity to illustrate its tools, methods and fields of application. In this experience, participation methods in impact assessment were adapted to the context, emphasizing aspects of ethics, equity and democracy. Environmental and epidemiological studies were included in the Val d'Agri HIA in order to characterize the environment and assess the health status of the resident population. On the basis of the results, public health recommendations were developed and shared with stakeholders and with local and regional administrators. The Val d'Agri experience offers elements of reflection on the potential of HIA at the local level to support public health and environmental control systems in the area, as well as planning grounded in prevention and HIA.

  20. Is it worth protecting groundwater from diffuse pollution with agri-environmental schemes? A hydro-economic modeling approach.

    PubMed

    Hérivaux, Cécile; Orban, Philippe; Brouyère, Serge

    2013-10-15

    In Europe, 30% of groundwater bodies are considered to be at risk of not achieving the Water Framework Directive (WFD) 'good status' objective by 2015, and 45% are in doubt of doing so. Diffuse agricultural pollution is one of the main pressures affecting groundwater bodies. To tackle this problem, the WFD requires Member States to design and implement cost-effective programs of measures to achieve the 'good status' objective by 2027 at the latest. Hitherto, action plans have mainly consisted of promoting the adoption of Agri-Environmental Schemes (AES). This raises a number of questions concerning the effectiveness of such schemes for improving groundwater status, and the economic implications of their implementation. We propose a hydro-economic model that combines a hydrogeological model to simulate groundwater quality evolution with agronomic and economic components to assess the expected costs, effectiveness, and benefits of AES implementation. This hydro-economic model can be used to identify cost-effective AES combinations at groundwater-body scale and to show the benefits to be expected from the resulting improvement in groundwater quality. The model is applied here to a rural area encompassing the Hesbaye aquifer, a large chalk aquifer which supplies about 230,000 inhabitants in the city of Liege (Belgium) and is severely contaminated by agricultural nitrates. We show that the time frame within which improvements in the Hesbaye groundwater quality can be expected may be much longer than that required by the WFD. Current WFD programs based on AES may be inappropriate for achieving the 'good status' objective in the most productive agricultural areas, in particular because these schemes are insufficiently attractive. Achieving 'good status' by 2027 would demand a substantial change in the design of AES, involving costs that may not be offset by benefits in the case of chalk aquifers with long renewal times. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Compatibility between livestock databases used for quantitative biosecurity response in New Zealand.

    PubMed

    Jewell, C P; van Andel, M; Vink, W D; McFadden, A M J

    2016-05-01

    To characterise New Zealand's livestock biosecurity databases, and to investigate their compatibility and capacity to provide a single integrated data source for quantitative outbreak analysis. Contemporary snapshots of the data in three national livestock biosecurity databases, AgriBase, FarmsOnLine (FOL) and the National Animal Identification and Tracing Scheme (NAIT), were obtained on 16 September, 1 September and 30 April 2014, respectively, and loaded into a relational database. A frequency table of animal numbers per farm was calculated for the AgriBase and FOL datasets. A two-dimensional kernel density estimate was calculated for farms reporting the presence of cattle, pigs, deer, and small ruminants in each database, and the ratio of farm densities for AgriBase versus FOL was calculated. The extent to which records in the three databases could be matched and linked was quantified, and the level of agreement amongst them for the presence of different species on properties was assessed using Cohen's kappa statistic. AgriBase contained fewer records than FOL but recorded the animal numbers present on each farm, whereas FOL contained more records but captured only the presence/absence of animals. The ratio of farm densities in AgriBase relative to FOL for pigs and deer was reasonably homogeneous across New Zealand, with AgriBase having a farm density approximately 80% of FOL. For cattle and small ruminants there was considerable heterogeneity, with AgriBase showing a density of cattle farms in the Central Otago region that was 20% of FOL, and a density of small ruminant farms in the central West Coast area that was twice that of FOL. Only 37% of records in FOL could be linked to AgriBase, but the level of agreement for the presence of different species between these databases was substantial (kappa>0.6). Both NAIT and FOL shared common farm identifiers which could be used to georeference animal movements, and there was fair to substantial agreement (kappa 0.32-0.69) between
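
    Cohen's kappa compares the observed agreement on species presence/absence against the agreement expected by chance from the marginal totals. The sketch below works through the calculation on an invented 2x2 table of matched properties; the counts are not from the study.

    ```python
    # Minimal sketch: Cohen's kappa for agreement between two databases on the
    # presence/absence of a species on matched properties. Counts are invented.
    def cohens_kappa(both_yes, a_only, b_only, both_no):
        n = both_yes + a_only + b_only + both_no
        observed = (both_yes + both_no) / n
        p_yes = ((both_yes + a_only) / n) * ((both_yes + b_only) / n)   # chance "yes/yes"
        p_no = ((b_only + both_no) / n) * ((a_only + both_no) / n)      # chance "no/no"
        expected = p_yes + p_no
        return (observed - expected) / (1 - expected)

    # e.g. cattle presence recorded in database A (rows) vs database B (columns)
    print(round(cohens_kappa(both_yes=4200, a_only=450, b_only=600, both_no=2750), 2))
    ```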

  2. Sorghum for human food--a review.

    PubMed

    Anglani, C

    1998-01-01

    A brief review of the literature on sorghum for human foods and on the relationships between some kernel characteristics and food quality is presented. The chief foods prepared with sorghum, such as tortillas, porridge, couscous and baked goods, are described. Tortillas prepared with 75% whole sorghum and 25% yellow maize are better than those prepared with whole sorghum alone. A porridge formulation with a 30:40:30 mix of sorghum, maize and cassava has been shown to be the most acceptable combination. The cooked porridge Aceda has lower protein digestibility and higher biological value than the uncooked porridge Aceda. Sorghum flour is not considered a breadmaking flour, but the addition of 30% sorghum flour to wheat flour of 72% extraction rate produces a bread evaluated as good to excellent.

  3. Land cover changes and forest landscape evolution (1985-2009) in a typical Mediterranean agroforestry system (high Agri Valley)

    NASA Astrophysics Data System (ADS)

    Simoniello, T.; Coluzzi, R.; Imbrenda, V.; Lanfredi, M.

    2015-06-01

    The present study focuses on the transformations of a typical Mediterranean agroforestry landscape of southern Italy (high Agri Valley - Basilicata region) that occurred over 24 years. In this period, the valuable agricultural and natural areas that compose such a landscape were subjected to intensive industry-related activities linked to the exploitation of the largest European onshore oil reservoir. Landsat imagery acquired in 1985 and 2009 were used to detect changes in forest areas and major land use trajectories. Landscape metrics indicators were adopted to characterize landscape structure and evolution of both the complex ecomosaic (14 land cover classes) and the forest/non-forest arrangement. Our results indicate a net increase of 11% of forest areas between 1985 and 2009. The major changes concern increase of all forest covers at the expense of pastures and grasses, enlargement of riparian vegetation, and expansion of artificial areas. The observed expansion of forests was accompanied by a decrease of the fragmentation levels likely due to the reduction of small glades that break forest homogeneity and to the recolonization of herbaceous areas. Overall, we observe an evolution towards a more stable configuration depicting a satisfactory picture of vegetation health.

  4. Land cover changes and forest landscape evolution (1985-2009) in a typical Mediterranean agroforestry system (High Agri Valley)

    NASA Astrophysics Data System (ADS)

    Simoniello, T.; Coluzzi, R.; Imbrenda, V.; Lanfredi, M.

    2014-08-01

    The present study focuses on the transformations of a typical Mediterranean agroforestry landscape of southern Italy (High Agri Valley - Basilicata region) that occurred over 24 years. In this period, the valuable agricultural and natural areas that compose such a landscape were subjected to intensive industry-related activities linked to the exploitation of the largest European on-shore oil reservoir. Landsat imagery acquired in 1985 and 2009 was used to detect changes in forest areas and major land use trajectories. Landscape metrics indicators were adopted to characterize landscape structure and evolution of both the complex ecomosaic (14 land cover classes) and the Forest/Non Forest arrangement. Our results indicate a net increase of 11% of forest areas between 1985 and 2009. The major changes concern the increase of all forest covers at the expense of pastures and grasses, the enlargement of riparian vegetation, and the expansion of artificial areas. The observed expansion of forests was accompanied by a decrease of the fragmentation levels, likely due to the reduction of small glades that break forest homogeneity and to the recolonization of herbaceous areas. Overall, we observe an evolution towards a more stable configuration depicting a satisfactory picture of vegetation health.

  5. Water flows in the Spanish economy: agri-food sectors, trade and households diets in an input-output framework.

    PubMed

    Cazcarro, Ignacio; Duarte, Rosa; Sánchez-Chóliz, Julio

    2012-06-19

    Seeking to advance our knowledge of water flows and footprints and the factors underlying them, we apply, on the basis of an extended 2004 Social Accounting Matrix for Spain, an open Leontief model in which households and foreign trade are the exogenous accounts. The model shows the water embodied in products bought by consumers (which we identify with the Water Footprint) and in trade (identified with virtual water trade). Activities with relevant water inflows and outflows such as the agrarian sector, textiles, and the agri-food industry are examined in detail using breakdowns of the relevant accounts. The data reflect only physical consumption, differentiating between green and blue water. The results reveal that Spain is a net importer of water. Flows are then related to key trading partners to show the large quantities involved. The focus on embodied (or virtual) water by activity is helpful to distinguish indirect from direct consumption as embodied water can be more than 300 times direct consumption in some food industry activities. Finally, a sensitivity analysis applied to changes in diets shows the possibility of reducing water uses by modifying households' behavior to encourage healthier eating.
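
    The open Leontief framework behind these estimates computes total output as x = (I - A)^-1 y and embodied water as the direct water-use coefficients multiplied by that inverse. The sketch below illustrates the mechanics on an invented three-sector example; it is not the Spanish Social Accounting Matrix.

    ```python
    # Minimal sketch: water embodied in final demand with an open Leontief model,
    # x = (I - A)^-1 y, and direct water-use coefficients w (water per unit of
    # output). The 3-sector matrix and coefficients are invented for illustration.
    import numpy as np

    A = np.array([[0.10, 0.05, 0.02],    # agriculture
                  [0.20, 0.15, 0.10],    # agri-food industry
                  [0.05, 0.10, 0.20]])   # rest of the economy (technical coefficients)
    w = np.array([1.50, 0.30, 0.05])     # direct water use per unit output (e.g. m3 per euro)
    y = np.array([10.0, 40.0, 50.0])     # exogenous final demand (households, exports)

    L = np.linalg.inv(np.eye(3) - A)     # Leontief inverse
    x = L @ y                            # total output needed to satisfy final demand
    embodied_water_per_unit = w @ L      # total (direct + indirect) water multipliers
    water_footprint = embodied_water_per_unit @ y
    print(np.round(embodied_water_per_unit, 3), round(float(water_footprint), 1))
    ```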

  6. Inversion of inherited thrusts by wastewater injection induced seismicity at the Val d’Agri oilfield (Italy)

    NASA Astrophysics Data System (ADS)

    Buttinelli, M.; Improta, L.; Bagh, S.; Chiarabba, C.

    2016-11-01

    Since 2006 wastewater has been injected below the Val d'Agri Quaternary basin, the largest on-land oilfield in Europe, inducing micro-seismicity in the proximity of a high-rate injection well. In this study, we have the rare opportunity to revise a massive set of 2D/3D seismic and deep borehole data in order to investigate the relationship between the active faults that bound the basin and the induced earthquakes. Below the injection site we identify a Pliocene thrust and back-thrust system inherited from the Apennine compression, with no relation to the faults bounding the basin. The induced seismicity is mostly confined within the injection reservoir and aligns coherently with a NE-dipping back-thrust favorably oriented within the current extensional stress field. Earthquakes spread upwards from the deep portion of the back-thrust, activating a 2.5-km-wide patch. Focal mechanisms show predominantly extensional kinematics, testifying to an ongoing inversion of the back-thrust, while a minor strike-slip component suggests a control exerted by a high-angle inherited transverse fault developed within the compressional system, possibly at the intersection between the two fault sets. We stress that where wastewater injection is active, understanding the complex interaction between injection-linked seismicity and pre-existing faults is a strong requisite for safe oilfield exploitation.

  7. Effects of agri-environmental schemes on farmland birds: do food availability measurements improve patterns obtained from simple habitat models?

    PubMed Central

    Ponce, Carlos; Bravo, Carolina; Alonso, Juan Carlos

    2014-01-01

    Studies evaluating agri-environmental schemes (AES) usually focus on the responses of single species or functional groups. Analyses are generally based on simple habitat measurements but ignore food availability and other important factors. This can limit our understanding of the ultimate causes determining the reactions of birds to AES. We investigated these issues in detail and throughout the main seasons of a bird's annual cycle (mating, postfledging and wintering) in a dry cereal farmland in a Special Protection Area for farmland birds in central Spain. First, we modeled four bird response parameters (abundance, species richness, diversity and “Species of European Conservation Concern” [SPEC] score), using detailed food availability and vegetation structure measurements (food models). Second, we fitted new models built using only substrate composition variables (habitat models). Whereas habitat models revealed that both fields included and not included in the AES benefited birds, food models went a step further and included seed and arthropod biomass as important predictors in winter and during the postfledging season, respectively. The validation process showed that food models were on average 13% better (up to 20% for some variables) at predicting bird responses. However, the cost of obtaining data for food models was five times higher than for habitat models. This novel approach highlighted the importance of food availability-related causal processes involved in bird responses to AES, which remained undetected when using conventional substrate composition assessment models. Despite their higher costs, measurements of food availability add important details for interpreting the reactions of the bird community to AES interventions and thus facilitate evaluating the real efficiency of AES programs. PMID:25165523

  8. Polymerization of epoxidized triglycerides with fluorosulfonic acid

    USDA-ARS?s Scientific Manuscript database

    The use of triglycerides as agri-based renewable raw materials for the development of new products is highly desirable in view of uncertain future petroleum prices. A new method of polymerizing epoxidized soybean oil has been devised with the use of fluorosulfonic acid. Depending on the reaction con...

  9. The role of agri-environment schemes in conservation and environmental management

    PubMed Central

    Batáry, Péter; Dicks, Lynn V; Kleijn, David; Sutherland, William J

    2015-01-01

    Over half of the European landscape is under agricultural management and has been for millennia. Many species and ecosystems of conservation concern in Europe depend on agricultural management and are showing ongoing declines. Agri-environment schemes (AES) are designed partly to address this. They are a major source of nature conservation funding within the European Union (EU) and the highest conservation expenditure in Europe. We reviewed the structure of current AES across Europe. Since a 2003 review questioned the overall effectiveness of AES for biodiversity, there has been a plethora of case studies and meta-analyses examining their effectiveness. Most syntheses demonstrate general increases in farmland biodiversity in response to AES, with the size of the effect depending on the structure and management of the surrounding landscape. This is important in the light of successive EU enlargement and ongoing reforms of AES. We examined the change in effect size over time by merging the data sets of three recent meta-analyses and found that schemes implemented after revision of the EU's agri-environmental programs in 2007 were not more effective than schemes implemented before the revision. Furthermore, schemes aimed at areas out of production (such as field margins and hedgerows) are more effective at enhancing species richness than those aimed at productive areas (such as arable crops or grasslands). Outstanding research questions include whether AES enhance ecosystem services, whether they are more effective in agriculturally marginal areas than in intensively farmed areas, whether they are more or less cost-effective for farmland biodiversity than protected areas, and how much their effectiveness is influenced by farmer training and advice. The general lesson from the European experience is that AES can be effective for conserving wildlife on farmland, but they are expensive and need to be carefully designed and targeted.

  10. Edible safety requirements and assessment standards for agricultural genetically modified organisms.

    PubMed

    Deng, Pingjian; Zhou, Xiangyang; Zhou, Peng; Du, Zhong; Hou, Hongli; Yang, Dongyan; Tan, Jianjun; Wu, Xiaojin; Zhang, Jinzhou; Yang, Yongcun; Liu, Jin; Liu, Guihua; Li, Yonghong; Liu, Jianjun; Yu, Lei; Fang, Shisong; Yang, Xiaoke

    2008-05-01

    This paper describes the background, principles, concepts and methods of framing the technical regulation for edible safety requirement and assessment of agricultural genetically modified organisms (agri-GMOs) for Shenzhen Special Economic Zone in the People's Republic of China. It provides a set of systematic criteria for edible safety requirements and the assessment process for agri-GMOs. First, focusing on the degree of risk and impact of different agri-GMOs, we developed hazard grades for toxicity, allergenicity, anti-nutrition effects, and unintended effects and standards for the impact type of genetic manipulation. Second, for assessing edible safety, we developed indexes and standards for different hazard grades of recipient organisms, for the influence of types of genetic manipulation and hazard grades of agri-GMOs. To evaluate the applicability of these criteria and their congruency with other safety assessment systems for GMOs applied by related organizations all over the world, we selected some agri-GMOs (soybean, maize, potato, capsicum and yeast) as cases to put through our new assessment system, and compared our results with the previous assessments. It turned out that the result of each of the cases was congruent with the original assessment.

  11. Enabling a sustainable and prosperous future through science and innovation in the bioeconomy at Agriculture and Agri-Food Canada.

    PubMed

    Sarkar, Sara F; Poon, Jacquelyne S; Lepage, Etienne; Bilecki, Lori; Girard, Benoit

    2018-01-25

    Science and innovation are important components underpinning the agricultural and agri-food system in Canada. Canada's vast geographical area presents diverse, regionally specific requirements in addition to the 21st century agricultural challenges facing the overall sector. As the broader needs of the agricultural landscape have evolved and will continue to do so in the next few decades, there is a trend in place to transition towards a sustainable bioeconomy, contributing to reducing greenhouse gas emission and our dependency on non-renewable resources. We highlight some of the key policy drivers on an overarching national scale and those specific to agricultural research and innovation that are critical to fostering a supportive environment for innovation and a sustainable bioeconomy. As well, we delineate some major challenges and opportunities facing agriculture in Canada, including climate change, sustainable agriculture, clean technologies, and agricultural productivity, and some scientific initiatives currently underway to tackle these challenges. The use of various technologies and scientific efforts, such as Next Generation Sequencing, metagenomics analysis, satellite image analysis and mapping of soil moisture, and value-added bioproduct development will accelerate scientific development and innovation and its contribution to a sustainable and prosperous bioeconomy. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.

  12. Phosphate additives in food--a health risk.

    PubMed

    Ritz, Eberhard; Hahn, Kai; Ketteler, Markus; Kuhlmann, Martin K; Mann, Johannes

    2012-01-01

    Hyperphosphatemia has been identified in the past decade as a strong predictor of mortality in advanced chronic kidney disease (CKD). For example, a study of patients in stage CKD 5 (with an annual mortality of about 20%) revealed that 12% of all deaths in this group were attributable to an elevated serum phosphate concentration. Recently, a high-normal serum phosphate concentration has also been found to be an independent predictor of cardiovascular events and mortality in the general population. Therefore, phosphate additives in food are a matter of concern, and their potential impact on health may well have been underappreciated. We reviewed pertinent literature retrieved by a selective search of the PubMed and EU databases (www.zusatzstoffe-online.de, www.codexalimentarius.de), with the search terms "phosphate additives" and "hyperphosphatemia." There is no need to lower the content of natural phosphate, i.e. organic esters, in food, because this type of phosphate is incompletely absorbed; restricting its intake might even lead to protein malnutrition. On the other hand, inorganic phosphate in food additives is effectively absorbed and can measurably elevate the serum phosphate concentration in patients with advanced CKD. Foods with added phosphate tend to be eaten by persons at the lower end of the socioeconomic scale, who consume more processed and "fast" food. The main pathophysiological effect of phosphate is vascular damage, e.g. endothelial dysfunction and vascular calcification. Aside from the quality of phosphate in the diet (which also requires attention), the quantity of phosphate consumed by patients with advanced renal failure should not exceed 1000 mg per day, according to the guidelines. Prospective controlled trials are currently unavailable. In view of the high prevalence of CKD and the potential harm caused by phosphate additives to food, the public should be informed that added phosphate is damaging to health. Furthermore, calls for labeling the content of added phosphate in food are appropriate.

  13. Concentration and health risk evaluation of heavy metals in market-sold vegetables and fishes based on questionnaires in Beijing, China.

    PubMed

    Fang, Yanyan; Nie, Zhiqiang; Liu, Feng; Die, Qingqi; He, Jie; Huang, Qifei

    2014-10-01

    Concentrations of heavy metals (As, Cd, Pb, Cu, Ni, Fe, Mn, and Zn) in market vegetables and fishes in Beijing, China, are investigated, and their health risk to local consumers is evaluated by calculating the target hazard quotient (THQ). The heavy metal concentrations in vegetables and fishes ranged from not detectable (ND) to 0.21 mg/kg fresh weight (f.w.) for As, ND to 0.10 mg/kg f.w. for Cd, and ND to 0.57 mg/kg f.w. for Pb, with average concentrations of 0.17, 0.04, and 0.24 mg/kg f.w., respectively. The measured concentrations of As, Cd, Pb, Cu, Ni, Fe, Mn, and Zn are generally lower than the safety limits given by the Chinese safety and quality standards for agricultural products (GB2762-2012). As, Cd, and Pb contamination was found in vegetables and fishes, with exceedance rates of 19% for As, 3% for Cd, and 25% for Pb. Pb contamination was concentrated in fish samples from traditional agri-product markets. The health risks of heavy metals in vegetables and fishes from supermarkets and from traditional agri-product markets were further compared: fishes from traditional agri-product markets carried the higher health risk, whereas for vegetables the higher risk was found in supermarkets, and supervision should therefore be strengthened in the fish supply channels of traditional agri-product markets.
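
    The THQ combines a contaminant concentration with ingestion-rate and toxicity assumptions into a single non-carcinogenic risk ratio (values above 1 signal a potential concern). The sketch below uses a commonly cited US EPA-style formulation with illustrative parameter values; the reference dose and intake figures are assumptions, not those of the study.

    ```python
    # Minimal sketch: target hazard quotient (THQ) for a metal in fish, following
    # the commonly used formula THQ = (EF * ED * FIR * C) / (RfD * BW * AT).
    # Parameter values are generic defaults used only for illustration.
    def thq(c_mg_per_kg, fir_g_per_day, rfd_mg_per_kg_day,
            ef_days_per_year=365, ed_years=70, bw_kg=60):
        at_days = 365 * ed_years                          # averaging time, non-carcinogenic
        exposure = ef_days_per_year * ed_years * fir_g_per_day * 1e-3 * c_mg_per_kg
        return exposure / (rfd_mg_per_kg_day * bw_kg * at_days)

    # e.g. lead in fish: mean concentration 0.24 mg/kg f.w., 50 g fish per day,
    # oral reference dose 0.0035 mg/kg/day (assumed value)
    print(round(thq(c_mg_per_kg=0.24, fir_g_per_day=50, rfd_mg_per_kg_day=0.0035), 2))
    ```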

  14. Starting a learning progression for agricultural literacy: A qualitative study of urban elementary student understandings of agricultural and science education benchmarks

    NASA Astrophysics Data System (ADS)

    Hess, Alexander Jay

    Science and agriculture professional organizations have argued for agricultural literacy as a goal for K-12 public education. Because of the complexity of our modern agri-food system, with social, economic, and environmental concerns embedded, an agriculturally literate society is needed for informed decision making, democratic participation, and system reform. While grade-span-specific benchmarks for gauging agri-food system literacy have been developed, little attention has been paid to the existing ideas individuals hold about the agri-food system, how these ideas relate to benchmarks, how experience shapes them, or how they change over time. Developing a body of knowledge on students' agri-food system understandings as they develop across the K-12 grades can ground efforts to promote a learning progression toward agricultural literacy. This study compares existing perceptions held by 18 upper elementary students from a large urban center in California to agri-food system literacy benchmarks and examines those perceptions against student background and experiences. Data were collected via semi-structured interviews and analyzed using the constant comparative method. Constructivist theoretical perspectives framed the study. No student had ever grown their own food, raised a plant, or cared for an animal. Participation in school field trips to farms or visits to a relative's garden were the agricultural experiences most frequently mentioned. Students were able to identify common food items but could not elaborate on their origins, especially for highly processed items. Students' understanding of post-production activities (i.e. food processing, manufacturing, or food marketing) was not apparent. Students' understanding of farms reflected a 1900s subsistence farming operation commonly found in literature written for the primary grades. Students were unaware that plants and animals are selected for production based on desired genetic traits. Obtaining

  15. Butterfly Density and Behaviour in Uncut Hay Meadow Strips: Behavioural Ecological Consequences of an Agri-Environmental Scheme.

    PubMed

    Lebeau, Julie; Wesselingh, Renate A; Van Dyck, Hans

    2015-01-01

    Sparing zones from mowing has been proposed, and applied, to improve local conditions for survival and reproduction of insects in hay meadows. However, little is known about the efficiency of refuge zones and the consequences for local populations. We studied population densities of butterflies before and after mowing in the refuge zone of 15 meadows in 2009 and 2011. We also studied the behaviour of the meadow brown (Maniola jurtina) comparing nectar use, interactions and flights in the refuge zone before and after mowing. Densities of grassland butterflies in this zone doubled on average after mowing. The density of females of M. jurtina increased on average fourfold, while males showed a more modest increase. In line with the idea of increased scramble competition in the refuge zone after mowing, M. jurtina increased the time spent on nectar feeding, the preferred nectar source was visited more frequently, and females made more use of non-preferred nectar sources. Maniola jurtina did not interact more with conspecifics after mowing, but interactions lasted longer. Flight tracks did not change in linearity, but were faster and shorter after mowing. After mowing, only a part of the local grassland butterflies moved to the uncut refuge zone. The resulting concentration effect alters the time allocated to different activities, nectar use and movements. These aspects have been largely ignored for agri-environmental schemes and grassland management in nature reserves and raise questions about optimal quantities and quality of uncut refuge sites for efficient conservation of grassland arthropods in agricultural landscapes.

  16. Butterfly Density and Behaviour in Uncut Hay Meadow Strips: Behavioural Ecological Consequences of an Agri-Environmental Scheme

    PubMed Central

    Lebeau, Julie; Wesselingh, Renate A.; Van Dyck, Hans

    2015-01-01

    Sparing zones from mowing has been proposed, and applied, to improve local conditions for survival and reproduction of insects in hay meadows. However, little is known about the efficiency of refuge zones and the consequences for local populations. We studied population densities of butterflies before and after mowing in the refuge zone of 15 meadows in 2009 and 2011. We also studied the behaviour of the meadow brown (Maniola jurtina) comparing nectar use, interactions and flights in the refuge zone before and after mowing. Densities of grassland butterflies in this zone doubled on average after mowing. The density of females of M. jurtina increased on average fourfold, while males showed a more modest increase. In line with the idea of increased scramble competition in the refuge zone after mowing, M. jurtina increased the time spent on nectar feeding, the preferred nectar source was visited more frequently, and females made more use of non-preferred nectar sources. Maniola jurtina did not interact more with conspecifics after mowing, but interactions lasted longer. Flight tracks did not change in linearity, but were faster and shorter after mowing. After mowing, only a part of the local grassland butterflies moved to the uncut refuge zone. The resulting concentration effect alters the time allocated to different activities, nectar use and movements. These aspects have been largely ignored for agri-environmental schemes and grassland management in nature reserves and raise questions about optimal quantities and quality of uncut refuge sites for efficient conservation of grassland arthropods in agricultural landscapes. PMID:26284618

  17. Hydrological impact of rainwater harvesting in the Modder river basin of central South Africa

    NASA Astrophysics Data System (ADS)

    Welderufael, W. A.; Woyessa, Y. E.; Edossa, D. C.

    2011-05-01

    Along the path of water flowing in a river basin are many water-related human interventions that modify the natural systems. Rainwater harvesting is one such intervention that involves harnessing of water in the upstream catchment. Increased water usage at the upstream level is an issue of concern for downstream water availability to sustain ecosystem services. The upstream Modder River basin, located in a semi-arid region in central South Africa, is experiencing intermittent meteorological droughts causing water shortages for agriculture, livestock and domestic purposes. To address this problem, a technique was developed for small-scale farmers with the objective of harnessing rainwater for crop production. However, the hydrological impact of a wider adoption of this technique by farmers has not been well quantified. In this regard, the SWAT hydrological model was used to simulate the hydrological impact of such practices. The scenarios studied were: (1) Baseline scenario, based on the actual land use of 2000, which is dominated by pasture (a combination of natural and some improved grasslands) (PAST); (2) Partial conversion of Land use 2000 (PAST) to conventional agriculture (Agri-CON); and (3) Partial conversion of Land use 2000 (PAST) to in-field rainwater harvesting aimed at improving precipitation use efficiency (Agri-IRWH). SWAT was calibrated using observed daily mean stream flow data of a sub-catchment (419 km2) in the study area. SWAT performed well in simulating the stream flow, giving a Nash-Sutcliffe (1970) efficiency index of 0.57 for the monthly stream flow calibration. The simulated water balance results showed that the highest peak mean monthly direct flow was obtained for the Agri-CON land use (18 mm), followed by PAST (12 mm) and the Agri-IRWH land use (9 mm). These were 19%, 13% and 11% of the mean annual rainfall, respectively. The Agri-IRWH scenario reduced direct flow by 38% compared to Agri-CON. On the other hand, it was found that the
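
    The SWAT calibration above is judged with the Nash-Sutcliffe efficiency index. As a minimal, self-contained illustration of how that index is computed from paired observed and simulated flow series (the flow values below are invented, not data from the study):

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - sum of squared errors / total variance of the observations."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Hypothetical monthly mean stream flows (mm); values of the index close to 1 indicate a good fit.
obs = [12.0, 9.5, 7.1, 4.0, 2.2, 1.5, 1.1, 1.8, 3.9, 6.4, 9.0, 11.2]
sim = [10.5, 9.9, 6.0, 4.6, 2.5, 1.2, 1.4, 1.6, 4.5, 5.8, 9.8, 12.0]
print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")
```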

  18. An ontology-based collaborative service framework for agricultural information

    USDA-ARS?s Scientific Manuscript database

    In recent years, China has developed modern agriculture energetically. An effective information framework is an important way to provide farms with agricultural information services and improve farmers' production technology and income. The mountain areas in central China are dominated by agri...

  19. Targeting the impact of agri-environmental policy - Future scenarios in two less favoured areas in Portugal.

    PubMed

    Jones, Nadia; Fleskens, Luuk; Stroosnijder, Leo

    2016-10-01

    Targeting agri-environmental measures (AEM) improves their effectiveness in the delivery of public goods, provided they are coordinated with other incentives. In less favoured areas (LFA), measures focusing on the conservation of extensive farming contribute to sustainable land management. In this paper we investigate the implementation of a possible AEM supporting the improvement of permanent pastures, coordinated with the extensive livestock and single farm payments currently in place. By applying a spatially explicit mixed-integer optimisation model, we simulate future land use scenarios for two less favoured areas in Portugal (Centro and Alentejo) considering two policy scenarios: a 'targeted AEM' and a 'non-targeted AEM'. We then compare the results with a 'basic policy' option (reflecting a situation without AEM). This is done with regard to landscape-scale effects on the reduction of fire hazard and erosion risk, as well as effects on farm income. The results show that an AEM for permanent pastures would be more cost-effective for erosion and fire hazard mitigation if implemented within a spatially targeted framework. However, when cost-effectiveness is assessed with other indicators (e.g. net farm income and share of grazing livestock), 'non-targeted AEM' implementation delivers the best outcome in Alentejo. In Centro, the implementation of an AEM involves important losses of income compared to the 'basic policy'. 'Targeted AEM' tends to favour farms in very marginal conditions, i.e. targeting is demonstrated to perform best in landscapes where spatial heterogeneity is higher. The results also show the risk of farm abandonment in the two studied less favoured areas: in all three scenarios, more than 30% of arable land is projected to be abandoned. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. 76 FR 31571 - Notice of Funds Availability (NOFA) Inviting Applications for the 2011 Farmers' Market Promotion...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-01

    ..., agri-tourism activities, and other direct producer-to-consumer infrastructure. AMS hereby requests... farmers markets, roadside stands, community-supported agriculture programs, agri-tourism activities and... new farmers markets, roadside stands, community-supported agriculture programs, agri-tourism...

  1. Modelling the bioaccumulation of persistent organic pollutants in agricultural food chains for regulatory exposure assessment.

    PubMed

    Takaki, Koki; Wade, Andrew J; Collins, Chris D

    2017-02-01

    New models for estimating bioaccumulation of persistent organic pollutants in the agricultural food chain were developed using recent improvements to plant uptake and cattle transfer models. One model, named AgriSim, was based on K_OW regressions of bioaccumulation in plants and cattle, while the other, AgriCom, was a steady-state mechanistic model. The two developed models and the European Union System for the Evaluation of Substances (EUSES), as a benchmark, were applied to four reported food chain (soil/air-grass-cow-milk) scenarios to evaluate the performance of each model simulation against the observed data. The four scenarios considered were as follows: (1) polluted soil and air, (2) polluted soil, (3) highly polluted soil surface and polluted subsurface and (4) polluted soil and air at different mountain elevations. AgriCom reproduced observed milk bioaccumulation well for all four scenarios, as did AgriSim for scenarios 1 and 2, but EUSES only did this for scenario 1. The main causes of the deviation for EUSES and AgriSim were the lack of the soil-air-plant pathway and of the ambient air-plant pathway, respectively. Based on the results, it is recommended that the soil-air-plant and ambient air-plant pathways be calculated separately and that the K_OW regression of the transfer factor to milk used in EUSES be avoided. AgriCom satisfied these recommendations, which led to low residual errors between the simulated and the observed bioaccumulation in the agricultural food chain for the four scenarios considered. It is therefore recommended that this model be incorporated into regulatory exposure assessment tools. The model uncertainty of the three models should be noted, since the simulated concentration in milk from the 5th to the 95th percentile of the uncertainty analysis often varied over two orders of magnitude. Using a measured value of soil organic carbon content was effective in reducing this uncertainty by one order of magnitude.
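
    To make the K_OW-regression idea concrete, the sketch below chains a generic soil-to-grass regression with a feed-to-milk biotransfer regression, both of the form log10(factor) = a + b*log10(K_OW). All coefficient values, the feed intake and the soil concentration are illustrative placeholders, not the parameters of AgriSim, AgriCom or EUSES:

```python
# Placeholder regression coefficients -- illustrative assumptions only,
# NOT the parameters used in AgriSim, AgriCom or EUSES.
PLANT_A, PLANT_B = 1.6, -0.6   # log10(grass/soil factor) = PLANT_A + PLANT_B * log10(Kow)
MILK_A, MILK_B = -8.1, 1.0     # log10(milk biotransfer factor, day/kg) = MILK_A + MILK_B * log10(Kow)

def milk_concentration(c_soil, log_kow, feed_intake_kg_per_day=16.0):
    """Chain a soil-to-grass regression with a feed-to-milk biotransfer regression."""
    c_grass = c_soil * 10 ** (PLANT_A + PLANT_B * log_kow)   # mg/kg grass from mg/kg soil
    btf_milk = 10 ** (MILK_A + MILK_B * log_kow)             # (mg/kg milk) per (mg/day ingested)
    daily_intake = c_grass * feed_intake_kg_per_day          # mg/day ingested by the cow
    return daily_intake * btf_milk                           # mg/kg milk

print(milk_concentration(c_soil=0.05, log_kow=6.5))
```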

  2. 75 FR 9155 - Notice of Funds Availability (NOFA) Inviting Applications for the 2010 Farmers' Market Promotion...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-01

    ... farmers' markets, roadside stands, community supported agriculture programs, agri-tourism activities, and...-supported agriculture programs, agri-tourism activities and other direct producer-to-consumer market..., community-supported agriculture programs, agri-tourism activities, and other direct producer-to-consumer...

  3. Progressive Plant Growing Has Business Blooming

    NASA Technical Reports Server (NTRS)

    2006-01-01

    In 1997, AgriHouse, Inc. (d.b.a. Aeroponics International), a leading agri-biology company, united with NASA and BioServe Space Technologies, a nonprofit, NASA-sponsored partnership research center, to design a soil-less plant-growth experiment to be performed in microgravity, aboard the Mir space station. This experiment aimed to gauge the effectiveness of a non-pesticide solution on the immune responses of bean plants. In essence, the research consortium was looking for a means of keeping plants free from infection without having to rely on the use of pesticides. This research, combined with follow-on grants from NASA, has helped Berthoud, Colorado-based AgriHouse gain credibility in the commercial marketplace with related technology and generate the capital necessary to conduct further research in a new-age field known as bio-pharming.

  4. Monitoring the Restart of a High-Rate Wastewater Disposal Well in the Val d'Agri Oilfield (Italy)

    NASA Astrophysics Data System (ADS)

    De Gori, P.; Improta, L.; Moretti, M.; Colasanti, G.; Criscuoli, F.

    2015-12-01

    The Val d'Agri Quaternary basin in the Southern Apennine range of Italy hosts the largest inland oil field in Europe. Wastewater coming from the oil exploitation is re-injected by a high-rate disposal well into strongly fractured limestones of the hydrocarbon carbonate reservoir. Disposal activity has induced micro-seismicity since the beginning of injection in June 2006. Around 220 small magnitude events (ML < 2.3) were recorded between 2006 and 2013 by the trigger-mode monitoring local network managed by the oil company and by the National Seismic Network of Istituto Nazionale di Geofisica e Vulcanologia. The induced micro-seismicity illuminated a pre-existing high-angle fault located 1 km below the well. Since June 2006, wastewater has been re-injected with only short interruptions due to acid stimulations. In January 2015 disposal activity was halted due to technical operations in the oil refinery, and wastewater injection restarted after two weeks. We installed 5 short-period stations within 10 km of the disposal well to carefully monitor the restart phase and the subsequent 3 months of disposal activity. This temporary network was complemented by stations of the National Seismic Network, giving this final configuration: 9 stations within 10 km of the well, with the closest station 2 km away, and 13 stations within 20 km. Here we report on the preliminary analysis of the local earthquakes recorded during the survey, focusing on the events that occurred in the injection area. The seismicity rate is compared with injection data. In spite of the dense network, we found that the rate of induced seismicity (both the number and energy of events) is very low when compared to the seismicity recorded during the first 5 years of injection activity, carried out with comparable rate and pressure.

  5. 'Stolo' Strawberry

    USDA-ARS?s Scientific Manuscript database

    ‘Stolo’ is a new June-bearing strawberry (Fragaria x ananassa Duchesne ex Rozier) cultivar from the Agriculture and Agri-Food Canada (AAFC), Pacific Agri-Food Research Center (PARC) in Agassiz, released in cooperation with the B.C. Ministry of Agriculture, Fisheries and Food, U.S. Department of Agri...

  6. Imaging quality analysis of computer-generated holograms using the point-based method and slice-based method

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Chen, Siqing; Zheng, Huadong; Sun, Tao; Yu, Yingjie; Gao, Hongyue; Asundi, Anand K.

    2017-06-01

    Computer holography has made notable progress in recent years. The point-based method and the slice-based method are the chief calculation algorithms for generating holograms in holographic display. Although both methods have been validated numerically and optically, the differences in imaging quality between them have not been specifically analyzed. In this paper, we analyze the imaging quality of computer-generated phase holograms generated by point-based Fresnel zone plates (PB-FZP), the point-based Fresnel diffraction algorithm (PB-FDA) and the slice-based Fresnel diffraction algorithm (SB-FDA). The calculation formulas and hologram generation with the three methods are demonstrated. In order to suppress speckle noise, sequential phase-only holograms are generated in our work. Numerically and experimentally reconstructed images are also exhibited. By comparing the imaging quality, the merits and drawbacks of the three methods are analyzed, and conclusions are drawn.
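
    As a rough illustration of the point-based Fresnel zone plate idea (not the authors' implementation), the sketch below superposes the paraxial quadratic-phase contribution of each object point on the hologram plane and keeps only the phase of the summed field. Wavelength, pixel pitch, grid size and object points are assumed values:

```python
import numpy as np

wavelength = 532e-9          # m (assumed green laser)
pitch = 8e-6                 # hologram pixel pitch, m (assumed)
N = 1024                     # hologram resolution: N x N pixels
k = 2 * np.pi / wavelength

x = (np.arange(N) - N / 2) * pitch
X, Y = np.meshgrid(x, x)

# Hypothetical object points: (x0, y0, z0, amplitude)
points = [(0.0, 0.0, 0.10, 1.0), (0.5e-3, -0.3e-3, 0.12, 0.8)]

field = np.zeros((N, N), dtype=complex)
for x0, y0, z0, amp in points:
    # Paraxial (Fresnel zone plate) contribution of a single object point
    field += amp * np.exp(1j * k * ((X - x0) ** 2 + (Y - y0) ** 2) / (2 * z0))

phase_hologram = np.angle(field)   # phase-only hologram, values in [-pi, pi]
```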

  7. SEPHYDRO: An Integrated Multi-Filter Web-Based Tool for Baseflow Separation

    NASA Astrophysics Data System (ADS)

    Serban, D.; MacQuarrie, K. T. B.; Popa, A.

    2017-12-01

    Knowledge of baseflow contributions to streamflow is important for understanding watershed scale hydrology, including groundwater-surface water interactions, impact of geology and landforms on baseflow, estimation of groundwater recharge rates, etc. Baseflow (or hydrograph) separation methods can be used as supporting tools in many areas of environmental research, such as the assessment of the impact of agricultural practices, urbanization and climate change on surface water and groundwater. Over the past few decades various digital filtering and graphically-based methods have been developed in an attempt to improve the assessment of the dynamics of the various sources of streamflow (e.g. groundwater, surface runoff, subsurface flow); however, these methods are not available under an integrated platform and, individually, often require significant effort for implementation. Here we introduce SEPHYDRO, an open access, customizable web-based tool, which integrates 11 algorithms allowing for separation of streamflow hydrographs. The streamlined interface incorporates a reference guide as well as additional information that allows users to import their own data, customize the algorithms, and compare, visualise and export results. The tool includes one-, two- and three-parameter digital filters as well as graphical separation methods and has been successfully applied in Atlantic Canada, in studies dealing with nutrient loading to fresh water and coastal water ecosystems. Future developments include integration of additional separation algorithms as well as incorporation of geochemical separation methods. SEPHYDRO has been developed through a collaborative research effort between the Canadian Rivers Institute, University of New Brunswick (Fredericton, New Brunswick, Canada), Agriculture and Agri-Food Canada and Environment and Climate Change Canada and is currently available at http://canadianriversinstitute.com/tool/
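
    The specific filters bundled in SEPHYDRO are not listed in the abstract; as a generic example of the one-parameter recursive digital filters it refers to, here is a Lyne-Hollick-type separation applied to a made-up discharge series (the filter parameter 0.925 is a commonly used default, not a SEPHYDRO setting):

```python
import numpy as np

def lyne_hollick_baseflow(q, alpha=0.925):
    """One-parameter recursive digital filter (Lyne-Hollick type).
    q: streamflow series; returns the baseflow component."""
    q = np.asarray(q, dtype=float)
    quick = np.zeros_like(q)
    for t in range(1, len(q)):
        qf = alpha * quick[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        quick[t] = min(max(qf, 0.0), q[t])   # keep quickflow within [0, total flow]
    return q - quick

# Hypothetical daily discharge (m^3/s), not data from any SEPHYDRO application
flow = [3.0, 3.1, 8.5, 14.0, 9.2, 6.0, 4.8, 4.1, 3.7, 3.4]
print(lyne_hollick_baseflow(flow).round(2))
```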

  8. Model of refrigerated display-space allocation for multi agro-perishable products considering markdown policy

    NASA Astrophysics Data System (ADS)

    Satiti, D.; Rusdiansyah, A.

    2018-04-01

    Problems that need more attention in the agri-food supply chain are loss and waste, consequences of improper quality control and excessive inventories. Cold storage remains one of the favourite technologies for controlling product quality among the majority of retailers. We consider the temperature of cold storage in determining inventory and pricing strategies based on identified product quality. This study aims to minimize agri-food waste, improve the utilization of cold storage facilities and maximize the retailer's profit by determining the refrigerated display-space allocation and markdown policy based on identified food shelf life. The proposed model is evaluated with several different scenarios to identify the right strategy.

  9. An Inexpensive, Stable, and Accurate Relative Humidity Measurement Method for Challenging Environments.

    PubMed

    Zhang, Wei; Ma, Hong; Yang, Simon X

    2016-03-18

    In this research, an improved psychrometer is developed to solve practical issues arising in the relative humidity measurement of challenging drying environments for meat manufacturing in agricultural and agri-food industries. The design in this research focused on the structure of the improved psychrometer, signal conversion, and calculation methods. The experimental results showed the effect of varying psychrometer structure on relative humidity measurement accuracy. An industrial application to dry-cured meat products demonstrated the effective performance of the improved psychrometer being used as a relative humidity measurement sensor in meat-drying rooms. In a drying environment for meat manufacturing, the achieved measurement accuracy for relative humidity using the improved psychrometer was ±0.6%. The system test results showed that the improved psychrometer can provide reliable and long-term stable relative humidity measurements with high accuracy in the drying system of meat products.
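
    The conversion equations used in the improved psychrometer are not given in the abstract; purely as an illustration of how relative humidity is commonly derived from dry-bulb and wet-bulb temperatures, the sketch below uses a Magnus-type saturation vapour pressure curve and a textbook psychrometric constant (all coefficients and readings are assumptions, not the authors' calibration):

```python
import math

def saturation_vapour_pressure(t_celsius):
    """Magnus-type approximation, result in hPa."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def relative_humidity(t_dry, t_wet, pressure_hpa=1013.25, a_psy=6.62e-4):
    """Psychrometric formula: e = es(Tw) - A * p * (Td - Tw), then RH = e / es(Td)."""
    e = saturation_vapour_pressure(t_wet) - a_psy * pressure_hpa * (t_dry - t_wet)
    return 100.0 * e / saturation_vapour_pressure(t_dry)

# Example: dry-bulb 14 C, wet-bulb 12.5 C in a meat-drying room (made-up readings)
print(f"RH = {relative_humidity(14.0, 12.5):.1f} %")
```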

  10. Handling Hot Water, With a Payoff

    ERIC Educational Resources Information Center

    Stewart, Ronald; Mathur, S. P.

    1970-01-01

    Discusses methods of utilizing waste heat from the increasing number of power stations. Possible uses include agri- and mariculture, centralized urban and industrial heating, and deicing of airports and marine facilities. (AL)

  11. Forthcoming Challenges in Mycotoxins Toxicology Research for Safer Food-A Need for Multi-Omics Approach.

    PubMed

    Dellafiora, Luca; Dall'Asta, Chiara

    2017-01-04

    The presence of mycotoxins in food represents a severe threat for public health and welfare, and poses relevant research challenges in the food toxicology field. Nowadays, food toxicologists have to provide answers to food-related toxicological issues, but at the same time they should provide the appropriate background knowledge to effectively support evidence-based decision-making in food safety. Therefore, keeping in mind that regulatory actions should be based on sound scientific findings, the present opinion addresses the main challenges in providing reliable data for supporting the risk assessment of foodborne mycotoxins.

  12. Language Practitioners' Reflections on Method-Based and Post-Method Pedagogies

    ERIC Educational Resources Information Center

    Soomro, Abdul Fattah; Almalki, Mansoor S.

    2017-01-01

    Method-based pedagogies are commonly applied in teaching English as a foreign language all over the world. However, in the last quarter of the 20th century, the concept of such pedagogies based on the application of a single best method in EFL started to be viewed with concern by some scholars. In response to the growing concern against the…

  13. An Inexpensive, Stable, and Accurate Relative Humidity Measurement Method for Challenging Environments

    PubMed Central

    Zhang, Wei; Ma, Hong; Yang, Simon X.

    2016-01-01

    In this research, an improved psychrometer is developed to solve practical issues arising in the relative humidity measurement of challenging drying environments for meat manufacturing in agricultural and agri-food industries. The design in this research focused on the structure of the improved psychrometer, signal conversion, and calculation methods. The experimental results showed the effect of varying psychrometer structure on relative humidity measurement accuracy. An industrial application to dry-cured meat products demonstrated the effective performance of the improved psychrometer being used as a relative humidity measurement sensor in meat-drying rooms. In a drying environment for meat manufacturing, the achieved measurement accuracy for relative humidity using the improved psychrometer was ±0.6%. The system test results showed that the improved psychrometer can provide reliable and long-term stable relative humidity measurements with high accuracy in the drying system of meat products. PMID:26999161

  14. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Therefore, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems for which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).
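
    The paper's logical inference algorithm is not reproduced in the abstract; as a toy illustration of the ARR idea only, the sketch below evaluates two residuals derived from a hypothetical tank model and matches the pattern of violated relations against a fault signature table. The model, thresholds and readings are all made up:

```python
import math

# Two analytical redundancy relations (ARRs) from a hypothetical tank model:
#   ARR1: f_in - f_out - dh ~ 0        (mass balance)
#   ARR2: f_out - k * sqrt(h) ~ 0      (outflow law), coefficient k and level h assumed known
def evaluate_arrs(f_in, f_out, dh, h, k=0.5, tol=0.05):
    r1 = f_in - f_out - dh
    r2 = f_out - k * math.sqrt(h)
    return (abs(r1) > tol, abs(r2) > tol)          # True means the relation is violated

# Fault signatures: which ARRs a fault in each sensor is able to violate
signatures = {
    "f_in":  (True, False),
    "f_out": (True, True),
    "dh":    (True, False),
}

violated = evaluate_arrs(f_in=1.0, f_out=0.25, dh=0.6, h=0.49)
# Candidate faulty sensors: those whose signature covers every violated relation
candidates = [s for s, sig in signatures.items()
              if all(not v or p for v, p in zip(violated, sig))]
print(violated, candidates)                         # -> (True, True) ['f_out']
```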

  15. The Base 32 Method: An Improved Method for Coding Sibling Constellations.

    ERIC Educational Resources Information Center

    Perfetti, Lawrence J. Carpenter

    1990-01-01

    Offers a new sibling constellation coding method (Base 32) for genograms using binary and base 32 numbers that saves considerable microcomputer memory. Points out that the new method will result in a greater ability to store and analyze larger amounts of family data. (Author/CM)
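
    The ERIC abstract does not spell out the encoding itself; as a hypothetical illustration of the general idea of packing a binary sibling pattern into base-32 digits (the digit alphabet, bit convention and example constellation below are all assumptions, not the published scheme), a short sketch:

```python
ALPHABET = "0123456789ABCDEFGHIJKLMNOPQRSTUV"  # assumed 32-character digit set

def encode_constellation(bits):
    """Pack a binary sibling pattern (e.g. 1 = male, 0 = female, ordered by birth)
    into base-32 digits, 5 bits per digit."""
    value = int("".join("1" if b else "0" for b in bits), 2)
    digits = ""
    while True:
        value, rem = divmod(value, 32)
        digits = ALPHABET[rem] + digits
        if value == 0:
            return digits

# Hypothetical sibling constellation: male, female, female, male, male
print(encode_constellation([1, 0, 0, 1, 1]))   # -> 'J' (binary 10011 = 19)
```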

  16. Accelerated, microwave-assisted, and conventional solvent extraction methods affect anthocyanin composition from colored grains.

    PubMed

    Abdel-Aal, El-Sayed M; Akhtar, Humayoun; Rabalski, Iwona; Bryan, Michael

    2014-02-01

    Anthocyanins are important dietary components with diverse positive functions in human health. This study investigates the effects of accelerated solvent extraction (ASE) and microwave-assisted extraction (MAE) on anthocyanin composition and extraction efficiency from blue wheat, purple corn, and black rice in comparison with the commonly used solvent extraction (CSE). A factorial experimental design was employed to study the effects of ASE and MAE variables, and anthocyanin extracts were analyzed by spectrophotometry, high-performance liquid chromatography with diode array detection (DAD), and liquid chromatography-mass spectrometry. The extraction efficiency of ASE and MAE was comparable with CSE at the optimal conditions. The greatest extraction by ASE was achieved at 50 °C, 2500 psi, 10 min using 5 cycles, and 100% flush. For MAE, a combination of 70 °C, 300 W, and 10 min was the most effective in extracting anthocyanins from blue wheat and purple corn, compared with 50 °C, 1200 W, and 20 min for black rice. The anthocyanin composition of grain extracts was influenced by the extraction method. The ASE method seems more appropriate for extracting anthocyanins from the colored grains, being comparable with the CSE method in terms of changes in anthocyanin composition. The method caused fewer structural changes in anthocyanins compared with the MAE method. Changes in blue wheat anthocyanins were smaller than in purple corn or black rice, perhaps due to the absence of acylated anthocyanin compounds in blue wheat. The results show significant differences in anthocyanins among the 3 extraction methods, which indicate a need to standardize a method for valid comparisons among studies and for quality assurance purposes. © 2014 Her Majesty the Queen in Right of Canada Journal of Food Science © 2014 Institute of Food Technologists® Reproduced with the permission of the Minister of Agriculture and Agri-Food Canada.

  17. High Resolution Vp and Vp/Vs Local Earthquake Tomography of the Val d'Agri Region (Southern Apennines, Italy).

    NASA Astrophysics Data System (ADS)

    Improta, L.; Bagh, S.; De Gori, P.; Pastori, M.; Piccinini, D.; Valoroso, L.; Anselmi, M.; Buttinelli, M.; Chiarabba, C.

    2015-12-01

    The Val d'Agri (VA) Quaternary basin in the southern Apennines extensional belt hosts the largest oilfield in onshore Europe and normal-fault systems with high (up to M7) seismogenic potential. Frequent small-magnitude swarms related to both active crustal extension and anthropogenic activity have occurred in the region. Causal factors for induced seismicity are a water impoundment with severe seasonal oscillations and a high-rate wastewater injection well. We analyzed around 1200 earthquakes (ML < 3.3) that occurred in the VA and surrounding regions between 2001 and 2014. We integrated waveforms recorded at 46 seismic stations belonging to 3 different networks: a dense temporary network installed by INGV in 2005-2006, the permanent national network of INGV, and the trigger-mode monitoring network managed by the local operator, the ENI petroleum company. We used local earthquake tomography to investigate static and transient features of the crustal velocity structure and to accurately locate earthquakes. Vp and Vp/Vs models are parameterized with a 3x3x2 km spacing and well resolved down to about 12 km depth. The complex Vp model illuminates broad antiformal structures corresponding to wide ramp-anticlines involving Mesozoic carbonates of the Apulia hydrocarbon reservoir, and NW-SE trending low-Vp regions related to thrust-sheet-top clastic basins. The VA basin corresponds to a shallow low-Vp region. Focal mechanisms show normal faulting kinematics with minor strike-slip solutions, in agreement with the local extensional regime. Earthquake locations and focal solutions depict shallow (< 5 km depth) E-dipping extensional structures beneath the artificial lake located in the southern sector of the basin and along the western margin of the VA. A few swarms define relatively deep transfer structures accommodating the differential extension between main normal faults. The spatio-temporal distribution of around 220 events correlates with wastewater disposal activity, illuminating a NE

  18. The key role of the meat industry in transformation to a low-carbon, climate resilient, sustainable economy.

    PubMed

    Rijsberman, Frank

    2017-10-01

    Climate change, air pollution and refugees have become key global challenges threatening sustainability of lifestyles, economies and ecosystems. Agri-food systems are the number one driver of environmental change. Livestock production is the world's largest land user, responsible for half of greenhouse gas emissions from agri-food systems, and the source of repeated health crises. Poor diets have become the number one cause of ill health. Recommendations for a healthy diet emphasize plant-based food. Rapidly falling costs in information technology, biotechnology, renewable energy and battery technology will disrupt current energy and transportation systems and offer opportunities for responsible meat production. Growing consumer interest in healthy food, combined with innovative information systems, offer opportunities to create value through quality control and consumer information in integrated value chains. Meat scientists have a major role to play in the necessary transformation of global agri-food systems towards a new model of green economic growth that is climate resilient, sustainable and provides green jobs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. FISHing for bacteria in food--a promising tool for the reliable detection of pathogenic bacteria?

    PubMed

    Rohde, Alexander; Hammerl, Jens Andre; Appel, Bernd; Dieckmann, Ralf; Al Dahouk, Sascha

    2015-04-01

    Foodborne pathogens cause millions of infections every year and are responsible for considerable economic losses worldwide. The current gold standard for the detection of bacterial pathogens in food is still conventional cultivation following standardized and generally accepted protocols. However, these methods are time-consuming and do not provide fast information about food contamination, and thus are limited in their ability to protect consumers in time from potential microbial hazards. Fluorescence in situ hybridization (FISH) represents a rapid and highly specific technique for whole-cell detection. This review aims to summarize the current data on FISH testing for the detection of pathogenic bacteria in different food matrices and to evaluate its suitability for implementation in routine testing. In this context, the use of FISH in different matrices and their pretreatment will be presented, the sensitivity and specificity of FISH tests will be considered, and the need for automation will be discussed, as well as the use of technological improvements to overcome current hurdles to a broad application in monitoring food safety. In addition, the overall economic feasibility will be assessed in a rough calculation of costs, and the strengths and weaknesses of FISH are considered in comparison with traditional and well-established detection methods. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  20. Highly efficient preparation of sphingoid bases from glucosylceramides by chemoenzymatic method

    PubMed Central

    Gowda, Siddabasave Gowda B.; Usuki, Seigo; Hammam, Mostafa A. S.; Murai, Yuta; Igarashi, Yasuyuki; Monde, Kenji

    2016-01-01

    Sphingoid base derivatives have attracted increasing attention as promising chemotherapeutic candidates against lifestyle diseases such as diabetes and cancer. Natural sphingoid bases can be a potential resource instead of those derived by time-consuming total organic synthesis. In particular, glucosylceramides (GlcCers) in food plants are enriched sources of sphingoid bases, differing from those of animals. Several chemical methodologies to transform GlcCers to sphingoid bases have already been investigated; however, these conventional methods using acid or alkaline hydrolysis are not efficient due to poor reaction yield, producing complex by-products and resulting in separation problems. In this study, an extremely efficient and practical chemoenzymatic transformation method has been developed using microwave-enhanced butanolysis of GlcCers and a large amount of readily available almond β-glucosidase for the deglycosylation of lysoGlcCers. The method is superior to conventional acid/base hydrolysis methods in its rapidity and its reaction cleanness (no isomerization, no rearrangement), with excellent overall yield. PMID:26667669

  1. An RBF-based compression method for image-based relighting.

    PubMed

    Leung, Chi-Sing; Wong, Tien-Tsin; Lam, Ping-Man; Choy, Kwok-Hung

    2006-04-01

    In image-based relighting, a pixel is associated with a number of sampled radiance values. This paper presents a two-level compression method. In the first level, the plenoptic property of a pixel is approximated by a spherical radial basis function (SRBF) network. That means that the spherical plenoptic function of each pixel is represented by a number of SRBF weights. In the second level, we apply a wavelet-based method to compress these SRBF weights. To reduce the visual artifact due to quantization noise, we develop a constrained method for estimating the SRBF weights. Our proposed approach is superior to JPEG, JPEG2000, and MPEG. Compared with the spherical harmonics approach, our approach has a lower complexity, while the visual quality is comparable. The real-time rendering method for our SRBF representation is also discussed.
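
    The constrained SRBF estimation and the wavelet stage of the paper are not detailed in the abstract; the sketch below only illustrates the first-level idea of representing a pixel's direction-dependent radiance by radial-basis-function weights fitted with regularized least squares. The toy plenoptic function, the kernel and the Tikhonov regularizer are assumptions standing in for the paper's constrained estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sampled light directions (unit vectors) and radiance values for one pixel
dirs = rng.normal(size=(200, 3)); dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
radiance = np.clip(dirs @ np.array([0.2, 0.7, 0.6]), 0, None) ** 4   # toy plenoptic function

# RBF centres on the sphere and a Gaussian-on-the-sphere kernel
centres = rng.normal(size=(16, 3)); centres /= np.linalg.norm(centres, axis=1, keepdims=True)
def kernel(d, c, width=4.0):
    return np.exp(width * (d @ c.T - 1.0))    # peaks when a direction matches a centre

Phi = kernel(dirs, centres)                   # design matrix, 200 x 16
lam = 1e-3                                    # Tikhonov regularisation (stands in for constraints)
weights = np.linalg.solve(Phi.T @ Phi + lam * np.eye(16), Phi.T @ radiance)

# Reconstruct (relight) the pixel for a new light direction
new_dir = np.array([[0.0, 0.6, 0.8]])
print((kernel(new_dir, centres) @ weights)[0])
```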

  2. Ratio-based vs. model-based methods to correct for urinary creatinine concentrations.

    PubMed

    Jain, Ram B

    2016-08-01

    Creatinine-corrected urinary analyte concentration is usually computed as the ratio of the observed analyte concentration divided by the observed urinary creatinine concentration (UCR). This ratio-based method is flawed since it implicitly assumes that hydration is the only factor that affects urinary creatinine concentrations. On the contrary, it has been shown in the literature that age, gender, race/ethnicity, and other factors also affect UCR. Consequently, an optimal method to correct for UCR should correct for hydration as well as other factors like age, gender, and race/ethnicity that affect UCR. Model-based creatinine correction, in which observed UCRs are used as an independent variable in regression models, has been proposed. This study was conducted to evaluate the performance of ratio-based and model-based creatinine correction methods when the effects of gender, age, and race/ethnicity are evaluated one factor at a time for selected urinary analytes and metabolites. It was observed that the ratio-based method leads to statistically significant pairwise differences, for example between males and females or between non-Hispanic whites (NHW) and non-Hispanic blacks (NHB), more often than the model-based method. However, depending upon the analyte of interest, the reverse is also possible. The estimated ratios of geometric means (GM), for example male to female or NHW to NHB, were also compared for the two methods. When estimated UCRs were higher for the group in the numerator of this ratio (for example, males), these ratios were higher for the model-based method, for example the male to female ratio of GMs. When estimated UCRs were lower for the group in the numerator of this ratio (for example, NHW), these ratios were higher for the ratio-based method, for example the NHW to NHB ratio of GMs. The model-based method is the method of choice if all factors that affect UCR are to be accounted for.
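
    To make the contrast concrete, the sketch below simulates a hypothetical analyte whose urinary concentration tracks creatinine, applies both corrections, and compares male-to-female ratios of geometric means under each. The data-generating model, covariate effects and sample size are invented for illustration and do not reflect any real survey analysis:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Hypothetical data: urinary analyte and creatinine (UCR), with sex affecting UCR
sex = rng.integers(0, 2, n)                                       # 0 = female, 1 = male
ucr = np.exp(0.2 + 0.4 * sex + rng.normal(0, 0.3, n))             # males higher on average
analyte = np.exp(1.0 + 0.8 * np.log(ucr) + rng.normal(0, 0.4, n))

# Ratio-based correction: analyte divided by creatinine
ratio_corrected = analyte / ucr

# Model-based correction: regress log(analyte) on log(UCR) and keep the variation
# not explained by creatinine (covariates such as age could be added to X as well)
X = np.column_stack([np.ones(n), np.log(ucr)])
beta, *_ = np.linalg.lstsq(X, np.log(analyte), rcond=None)
model_corrected = np.exp(np.log(analyte) - X @ beta + beta[0])

for name, values in [("ratio", ratio_corrected), ("model", model_corrected)]:
    gm_m = np.exp(np.log(values[sex == 1]).mean())
    gm_f = np.exp(np.log(values[sex == 0]).mean())
    print(f"{name}-based male/female GM ratio: {gm_m / gm_f:.2f}")
```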

  3. Case-based explanation of non-case-based learning methods.

    PubMed Central

    Caruana, R.; Kangarloo, H.; Dionisio, J. D.; Sinha, U.; Johnson, D.

    1999-01-01

    We show how to generate case-based explanations for non-case-based learning methods such as artificial neural nets or decision trees. The method uses the trained model (e.g., the neural net or the decision tree) as a distance metric to determine which cases in the training set are most similar to the case that needs to be explained. This approach is well suited to medical domains, where it is important to understand predictions made by complex machine learning models, and where training and clinical practice makes users adept at case interpretation. PMID:10566351
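
    As a minimal sketch of the general idea (not the authors' exact distance metric), the example below uses a trained decision tree itself to define similarity: training cases routed to the same leaf as the query are retrieved as its case-based explanation. It assumes scikit-learn and uses a bundled dataset standing in for a clinical one:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)

query = X[0:1]                       # the case whose prediction we want to explain
query_leaf = tree.apply(query)[0]    # leaf node the query falls into
train_leaves = tree.apply(X)

# "Explanatory" cases: training examples routed to the same leaf by the trained model
similar = [i for i, leaf in enumerate(train_leaves) if leaf == query_leaf]
print(f"prediction: {tree.predict(query)[0]}, supported by {len(similar)} similar training cases")
```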

  4. Treecode-based generalized Born method

    NASA Astrophysics Data System (ADS)

    Xu, Zhenli; Cheng, Xiaolin; Yang, Haizhao

    2011-02-01

    We have developed a treecode-based O(N log N) algorithm for the generalized Born (GB) implicit solvation model. Our treecode-based GB (tGB) is based on the GBr6 [J. Phys. Chem. B 111, 3055 (2007)], an analytical GB method with a pairwise descreening approximation for the R6 volume integral expression. The algorithm is composed of a cutoff scheme for the effective Born radii calculation, and a treecode implementation of the GB charge-charge pair interactions. Test results demonstrate that the tGB algorithm can reproduce the vdW surface based Poisson solvation energy with an average relative error less than 0.6% while providing an almost linear-scaling calculation for a representative set of 25 proteins with different sizes (from 2815 atoms to 65456 atoms). For a typical system of 10k atoms, the tGB calculation is three times faster than the direct summation as implemented in the original GBr6 model. Thus, our tGB method provides an efficient way for performing implicit solvent GB simulations of larger biomolecular systems at longer time scales.
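
    The GBr6 descreening and the treecode traversal are beyond a short excerpt; the sketch below only shows the direct O(N^2) generalized Born pair summation (a Still-type f_GB) that such a treecode is designed to accelerate. Charges, coordinates and effective Born radii are random toy values, not a real protein:

```python
import numpy as np

def gb_pair_energy(q, pos, born_radii, eps_in=1.0, eps_out=80.0):
    """Direct O(N^2) generalized Born sum with a Still-type f_GB;
    a treecode reduces this double loop to O(N log N)."""
    pref = -0.5 * (1.0 / eps_in - 1.0 / eps_out) * 332.06   # kcal/mol with charges in e, distances in Angstrom
    energy = 0.0
    n = len(q)
    for i in range(n):
        for j in range(n):
            r2 = np.sum((pos[i] - pos[j]) ** 2)
            rij = born_radii[i] * born_radii[j]
            f_gb = np.sqrt(r2 + rij * np.exp(-r2 / (4.0 * rij)))
            energy += pref * q[i] * q[j] / f_gb
    return energy

rng = np.random.default_rng(2)
n = 50                                   # toy system, not a real protein
charges = rng.uniform(-0.5, 0.5, n)
coords = rng.uniform(0, 20, (n, 3))      # Angstrom
radii = rng.uniform(1.5, 3.0, n)         # effective Born radii, Angstrom
print(f"GB solvation energy: {gb_pair_energy(charges, coords, radii):.2f} kcal/mol")
```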

  5. 77 FR 20779 - Notice of Funds Availability (NOFA) Inviting Applications for the 2012 Farmers' Market Promotion...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-06

    ...-tourism activities, and other direct producer-to-consumer infrastructures. AMS hereby requests proposals... of domestic farmers markets, roadside stands, community-supported agriculture programs, agri-tourism... development of new farmers markets, roadside stands, community-supported agriculture programs, agri-tourism...

  6. Pyrolyzed-parylene based sensors and method of manufacture

    NASA Technical Reports Server (NTRS)

    Tai, Yu-Chong (Inventor); Liger, Matthieu (Inventor); Miserendino, Scott (Inventor); Konishi, Satoshi (Inventor)

    2007-01-01

    A method (and resulting structure) for fabricating a sensing device. The method includes providing a substrate comprising a surface region and forming an insulating material overlying the surface region. The method also includes forming a film of carbon-based material overlying the insulating material and treating the film of carbon-based material to pyrolyze it, causing formation of a film of substantially carbon-based material having a resistivity within a predetermined range. The method also provides at least a portion of the pyrolyzed carbon-based material in a sensor application and uses that portion of the pyrolyzed carbon-based material in the sensing application. In a specific embodiment, the sensing application is selected from chemical, humidity, piezoelectric, radiation, mechanical strain or temperature.

  7. METHOD OF JOINING CARBIDES TO BASE METALS

    DOEpatents

    Krikorian, N.H.; Farr, J.D.; Witteman, W.G.

    1962-02-13

    A method is described for joining a refractory metal carbide such as UC or ZrC to a refractory metal base such as Ta or Nb. The method comprises carburizing the surface of the metal base and then sintering the base and carbide at temperatures of about 2000 deg C in a non-oxidizing atmosphere, the base and carbide being held in contact during the sintering step. To reduce the sintering temperature and time, a sintering aid such as iron, nickel, or cobalt is added to the carbide, not to exceed 5 wt%. (AEC)

  8. Reservoir Structure and Wastewater-Induced Seismicity at the Val d'Agri Oilfield (Italy) Shown by Three-Dimensional Vp and Vp/Vs Local Earthquake Tomography

    NASA Astrophysics Data System (ADS)

    Improta, L.; Bagh, S.; De Gori, P.; Valoroso, L.; Pastori, M.; Piccinini, D.; Chiarabba, C.; Anselmi, M.; Buttinelli, M.

    2017-11-01

    Wastewater injection into a high-rate well in the Val d'Agri oilfield, the largest in onshore Europe, has induced swarm microseismicity since the initiation of disposal in 2006. To investigate the reservoir structure and to track seismicity, we performed a high-spatial resolution local earthquake tomography using 1,281 natural and induced earthquakes recorded by local networks. The properties of the carbonate reservoir (rock fracturing, pore fluid pressure) and inherited faults control the occurrence and spatiotemporal distribution of seismicity. A low-Vp, high-Vp/Vs region under the well represents a fluid saturated fault zone ruptured by induced seismicity. High-Vp, high-Vp/Vs bumps match reservoir culminations indicating saturated liquid-bearing zones, whereas a very low Vp, low Vp/Vs anomaly might represent a strongly fractured and depleted zone of the hydrocarbon reservoir characterized by significant fluid withdrawal. The comprehensive picture of the injection-linked seismicity obtained by integrating reservoir-scale tomography, high-precision earthquake locations, and geophysical and injection data suggests that the driving mechanism is the channeling of pore pressure perturbations through a high permeable fault damage zone within the reservoir. The damage zone surrounds a Pliocene reverse fault optimally oriented in the current extensional stress field. The ruptured damage zone measures 2 km along strike and 3 km along dip and is confined between low permeability ductile formations. Injection pressure is the primary parameter controlling seismicity rate. Our study underlines that local earthquake tomography also using wastewater-induced seismicity can give useful insights into the physical mechanism leading to these earthquakes.

  9. Adaptive Set-Based Methods for Association Testing

    PubMed Central

    Su, Yu-Chen; Gauderman, W. James; Berhane, Kiros; Lewinger, Juan Pablo

    2017-01-01

    With a typical sample size of a few thousand subjects, a single genomewide association study (GWAS) using traditional one-SNP-at-a-time methods can only detect genetic variants conferring a sizable effect on disease risk. Set-based methods, which analyze sets of SNPs jointly, can detect variants with smaller effects acting within a gene, a pathway, or other biologically relevant sets. While self-contained set-based methods (those that test sets of variants without regard to variants not in the set) are generally more powerful than competitive set-based approaches (those that rely on comparison of variants in the set of interest with variants not in the set), there is no consensus as to which self-contained methods are best. In particular, several self-contained set tests have been proposed to directly or indirectly ‘adapt’ to the a priori unknown proportion and distribution of effects of the truly associated SNPs in the set, which is a major determinant of their power. A popular adaptive set-based test is the adaptive rank truncated product (ARTP), which seeks the set of SNPs that yields the best-combined evidence of association. We compared the standard ARTP, several ARTP variations we introduced, and other adaptive methods in a comprehensive simulation study to evaluate their performance. We used permutations to assess significance for all the methods and thus provide a level playing field for comparison. We found the standard ARTP test to have the highest power across our simulations followed closely by the global model of random effects (GMRE) and a LASSO based test. PMID:26707371
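
    A compact sketch of the adaptive rank truncated product idea described above: combine the k smallest SNP p-values at several truncation points, keep the most significant combination, and calibrate that adaptive minimum by permutation. This is an illustration only; the null here is simulated with uniform p-values rather than by permuting phenotypes, and the SNP p-values are made up:

```python
import numpy as np

rng = np.random.default_rng(3)

def rtp_stat(pvals, k):
    """Rank truncated product statistic: -sum of logs of the k smallest p-values."""
    return -np.sum(np.log(np.sort(pvals)[:k]))

def artp_pvalue(pvals, ks=(1, 5, 10), n_perm=2000):
    """Adaptive RTP: take the best truncation point, then calibrate by permutation."""
    m = len(pvals)
    obs = np.array([rtp_stat(pvals, k) for k in ks])
    # Null statistics; a real analysis would permute phenotype labels instead of
    # drawing fresh uniform p-values, which is only a stand-in here.
    null = np.array([[rtp_stat(rng.uniform(size=m), k) for k in ks] for _ in range(n_perm)])
    obs_min_p = (null >= obs).mean(axis=0).min()       # best (smallest) p over truncation points
    # Apply the same adaptive minimisation to every permutation to keep the test honest
    ranks = null.argsort(axis=0).argsort(axis=0)       # 0 = smallest statistic in each column
    null_min_p = (1.0 - ranks / n_perm).min(axis=1)
    return (null_min_p <= obs_min_p).mean()

# Hypothetical p-values for a set of 50 SNPs, three of them truly associated
pvals = np.concatenate([rng.uniform(0, 0.01, 3), rng.uniform(size=47)])
print(f"ARTP set p-value ~ {artp_pvalue(pvals):.4f}")
```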

  10. Comparing Methods for UAV-Based Autonomous Surveillance

    NASA Technical Reports Server (NTRS)

    Freed, Michael; Harris, Robert; Shafto, Michael

    2004-01-01

    We describe an approach to evaluating algorithmic and human performance in directing UAV-based surveillance. Its key elements are a decision-theoretic framework for measuring the utility of a surveillance schedule and an evaluation testbed consisting of 243 scenarios covering a well-defined space of possible missions. We apply this approach to two example UAV-based surveillance methods, a TSP-based algorithm and a human-directed approach, then compare them to identify general strengths and weaknesses of each method.

  11. Handling Practicalities in Agricultural Policy Optimization for Water Quality Improvements

    EPA Science Inventory

    Bilevel and multi-objective optimization methods are often useful to spatially target agri-environmental policy throughout a watershed. This type of problem is complex and is comprised of a number of practicalities: (i) a large number of decision variables, (ii) at least two inte...

  12. Description logic-based methods for auditing frame-based medical terminological systems.

    PubMed

    Cornet, Ronald; Abu-Hanna, Ameen

    2005-07-01

    Medical terminological systems (TSs) play an increasingly important role in health care by supporting recording, retrieval and analysis of patient information. As the size and complexity of TSs are growing, the need arises for means to audit them, i.e. verify and maintain (logical) consistency and (semantic) correctness of their contents. This is not only important for the management of TSs but also for providing their users with confidence about the reliability of their contents. Formal methods have the potential to play an important role in the audit of TSs, although there are few empirical studies to assess the benefits of using these methods. In this paper we propose a method based on description logics (DLs) for the audit of TSs. This method is based on the migration of the medical TS from a frame-based representation to a DL-based one. Our method is characterized by a process in which initially stringent assumptions are made about concept definitions. The assumptions allow the detection of concepts and relations that might comprise a source of logical inconsistency. If the assumptions hold then definitions are to be altered to eliminate the inconsistency, otherwise the assumptions are revised. In order to demonstrate the utility of the approach in a real-world case study we audit a TS in the intensive care domain and discuss decisions pertaining to building DL-based representations. This case study demonstrates that certain types of inconsistencies can indeed be detected by applying the method to a medical terminological system. The added value of the method described in this paper is that it provides a means to evaluate the compliance to a number of common modeling principles in a formal manner. The proposed method reveals potential modeling inconsistencies, helping to audit and (if possible) improve the medical TS. In this way, it contributes to providing confidence in the contents of the terminological system.

  13. Comparison of Text-Based and Visual-Based Programming Input Methods for First-Time Learners

    ERIC Educational Resources Information Center

    Saito, Daisuke; Washizaki, Hironori; Fukazawa, Yoshiaki

    2017-01-01

    Aim/Purpose: When learning to program, both text-based and visual-based input methods are common. However, it is unclear which method is more appropriate for first-time learners (first learners). Background: The differences in the learning effect between text-based and visual-based input methods for first learners are compared using a…

  14. Graph reconstruction using covariance-based methods.

    PubMed

    Sulaimanov, Nurgazy; Koeppl, Heinz

    2016-12-01

    Methods based on correlation and partial correlation are today employed in the reconstruction of a statistical interaction graph from high-throughput omics data. These dedicated methods work well even for the case when the number of variables exceeds the number of samples. In this study, we investigate how the graphs extracted from covariance and concentration matrix estimates are related by using Neumann series and transitive closure and through discussing concrete small examples. Considering the ideal case where the true graph is available, we also compare correlation and partial correlation methods for large realistic graphs. In particular, we perform the comparisons with optimally selected parameters based on the true underlying graph and with data-driven approaches where the parameters are directly estimated from the data.
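
    As a small worked illustration of the two graph constructions the abstract compares, the sketch below simulates data from a known sparse precision matrix (a four-node chain), then builds one adjacency matrix by thresholding marginal correlations and another by thresholding partial correlations obtained from the inverse sample covariance. The chain structure, sample size and thresholds are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate data from a known sparse precision (inverse covariance) matrix: a chain 0-1-2-3
precision = np.array([[2.0, 0.6, 0.0, 0.0],
                      [0.6, 2.0, 0.6, 0.0],
                      [0.0, 0.6, 2.0, 0.6],
                      [0.0, 0.0, 0.6, 2.0]])
data = rng.multivariate_normal(np.zeros(4), np.linalg.inv(precision), size=2000)

# Correlation-based graph: threshold the absolute marginal correlation matrix
corr = np.corrcoef(data, rowvar=False)
corr_graph = (np.abs(corr) > 0.2) & ~np.eye(4, dtype=bool)

# Partial-correlation-based graph: scale the inverse of the sample covariance and threshold it
prec_hat = np.linalg.inv(np.cov(data, rowvar=False))
d = np.sqrt(np.diag(prec_hat))
partial = -prec_hat / np.outer(d, d)
partial_graph = (np.abs(partial) > 0.2) & ~np.eye(4, dtype=bool)

print("correlation graph:\n", corr_graph.astype(int))
print("partial correlation graph:\n", partial_graph.astype(int))
```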

  15. EEG feature selection method based on decision tree.

    PubMed

    Duan, Lijuan; Ge, Hui; Ma, Wei; Miao, Jun

    2015-01-01

    This paper aims to solve the automated feature selection problem in brain computer interfaces (BCI). In order to automate the feature selection process, we proposed a novel EEG feature selection method based on decision trees (DT). During the electroencephalogram (EEG) signal processing, a feature extraction method based on principal component analysis (PCA) was used, and the selection process based on the decision tree was performed by searching the feature space and automatically selecting optimal features. Considering that EEG signals are a series of non-linear signals, a generalized linear classifier named support vector machine (SVM) was chosen. In order to test the validity of the proposed method, we applied the EEG feature selection method based on decision trees to BCI Competition II dataset Ia, and the experiment showed encouraging results.
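
    A compact scikit-learn sketch of the pipeline the abstract describes (PCA feature extraction, decision-tree-driven feature selection, SVM classification), run on synthetic data rather than the BCI Competition II Ia recordings; component counts and hyperparameters are arbitrary:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for epoch-wise EEG features (not BCI Competition data)
X, y = make_classification(n_samples=400, n_features=60, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

pipeline = make_pipeline(
    PCA(n_components=20),                                        # feature extraction
    SelectFromModel(DecisionTreeClassifier(random_state=0)),     # decision-tree-based selection
    SVC(kernel="rbf", C=1.0),                                    # classification
)
pipeline.fit(X_tr, y_tr)
print(f"held-out accuracy: {pipeline.score(X_te, y_te):.2f}")
```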

  16. Earth Science Goes E-Commerce

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Software packages commercially marketed by Agri ImaGIS allow customers to analyze farm fields. Agri ImaGIS provides satellite images of farmland and agricultural views to US clients. The company approached NASA-MSU TechLink for access to technology that would improve the company's capabilities to deliver satellite images over the Internet. TechLink found that software with the desired functions had already been developed through NASA's Remote Sensing Database Program. Agri ImaGIS formed a partnership with the University of Minnesota group that allows the company to further develop the software to meet its Internet commerce needs.

  17. Adaptive Set-Based Methods for Association Testing.

    PubMed

    Su, Yu-Chen; Gauderman, William James; Berhane, Kiros; Lewinger, Juan Pablo

    2016-02-01

    With a typical sample size of a few thousand subjects, a single genome-wide association study (GWAS) using traditional one single nucleotide polymorphism (SNP)-at-a-time methods can only detect genetic variants conferring a sizable effect on disease risk. Set-based methods, which analyze sets of SNPs jointly, can detect variants with smaller effects acting within a gene, a pathway, or other biologically relevant sets. Although self-contained set-based methods (those that test sets of variants without regard to variants not in the set) are generally more powerful than competitive set-based approaches (those that rely on comparison of variants in the set of interest with variants not in the set), there is no consensus as to which self-contained methods are best. In particular, several self-contained set tests have been proposed to directly or indirectly "adapt" to the a priori unknown proportion and distribution of effects of the truly associated SNPs in the set, which is a major determinant of their power. A popular adaptive set-based test is the adaptive rank truncated product (ARTP), which seeks the set of SNPs that yields the best-combined evidence of association. We compared the standard ARTP, several ARTP variations we introduced, and other adaptive methods in a comprehensive simulation study to evaluate their performance. We used permutations to assess significance for all the methods and thus provide a level playing field for comparison. We found the standard ARTP test to have the highest power across our simulations followed closely by the global model of random effects (GMRE) and a least absolute shrinkage and selection operator (LASSO)-based test. © 2015 WILEY PERIODICALS, INC.

  18. Shrinkage regression-based methods for microarray missing value imputation.

    PubMed

    Wang, Hsiuying; Chiu, Chia-Chun; Wu, Yi-Ching; Wu, Wei-Sheng

    2013-01-01

    Missing values commonly occur in the microarray data, which usually contain more than 5% missing values with up to 90% of genes affected. Inaccurate missing value estimation results in reducing the power of downstream microarray data analyses. Many types of methods have been developed to estimate missing values. Among them, the regression-based methods are very popular and have been shown to perform better than the other types of methods in many testing microarray datasets. To further improve the performances of the regression-based methods, we propose shrinkage regression-based methods. Our methods take the advantage of the correlation structure in the microarray data and select similar genes for the target gene by Pearson correlation coefficients. Besides, our methods incorporate the least squares principle, utilize a shrinkage estimation approach to adjust the coefficients of the regression model, and then use the new coefficients to estimate missing values. Simulation results show that the proposed methods provide more accurate missing value estimation in six testing microarray datasets than the existing regression-based methods do. Imputation of missing values is a very important aspect of microarray data analyses because most of the downstream analyses require a complete dataset. Therefore, exploring accurate and efficient methods for estimating missing values has become an essential issue. Since our proposed shrinkage regression-based methods can provide accurate missing value estimation, they are competitive alternatives to the existing regression-based methods.
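
    A simplified sketch of the general scheme described above: pick the genes most correlated with the target gene, fit a least-squares regression on the samples where the target is observed, shrink the coefficients, and predict the missing entry. The fixed shrinkage factor used here is a crude stand-in for the paper's shrinkage estimator, and the expression matrix is random:

```python
import numpy as np

def impute_missing(expr, gene, sample, k=5, shrink=0.8):
    """Estimate expr[gene, sample] (assumed missing) from the k most correlated genes.
    'shrink' scales the least-squares coefficients toward zero (a simple stand-in for
    a proper shrinkage estimator)."""
    observed_cols = [j for j in range(expr.shape[1]) if j != sample]
    target = expr[gene, observed_cols]
    # Rank the other genes by absolute Pearson correlation with the target gene
    others = [g for g in range(expr.shape[0]) if g != gene]
    corrs = [abs(np.corrcoef(expr[g, observed_cols], target)[0, 1]) for g in others]
    neighbours = [others[i] for i in np.argsort(corrs)[::-1][:k]]
    # Least-squares fit on the observed samples, then shrink the slope coefficients
    A = np.column_stack([expr[g, observed_cols] for g in neighbours] + [np.ones(len(observed_cols))])
    beta, *_ = np.linalg.lstsq(A, target, rcond=None)
    beta[:-1] *= shrink
    x_new = np.append(expr[neighbours, sample], 1.0)
    return float(x_new @ beta)

rng = np.random.default_rng(5)
expr = rng.normal(size=(100, 20))            # hypothetical genes x samples matrix
print(impute_missing(expr, gene=0, sample=3))
```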

  19. Extension Online: Utilizing Technology to Enhance Educational Outreach

    ERIC Educational Resources Information Center

    Green, Stephen

    2012-01-01

    Extension Online is an Internet-based online course platform that enables the Texas AgriLife Extension Service's Family Development and Resource Management (FDRM) unit to reach tens of thousands of users across the U.S. annually with research-based information. This article introduces readers to Extension Online by describing the history of its…

  20. Ontology-Based Method for Fault Diagnosis of Loaders.

    PubMed

    Xu, Feixiang; Liu, Xinhui; Chen, Wei; Zhou, Chen; Cao, Bingwei

    2018-02-28

    This paper proposes an ontology-based fault diagnosis method which overcomes the difficulty of understanding complex fault diagnosis knowledge of loaders and offers a universal approach for fault diagnosis of all loaders. This method contains the following components: (1) An ontology-based fault diagnosis model is proposed to achieve the integrating, sharing and reusing of fault diagnosis knowledge for loaders; (2) combined with ontology, CBR (case-based reasoning) is introduced to realize effective and accurate fault diagnoses following four steps (feature selection, case retrieval, case matching and case updating); and (3) in order to address the shortcomings of the CBR method when relevant cases are lacking, ontology-based RBR (rule-based reasoning) is put forward through building SWRL (Semantic Web Rule Language) rules. An application program is also developed to implement the above methods to assist in finding the fault causes, fault locations and maintenance measures of loaders. In addition, the program is validated through analyzing a case study.
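
    The ontology and SWRL rule components do not lend themselves to a short excerpt; the sketch below only illustrates the case-retrieval/case-matching step of the CBR loop, ranking a tiny hypothetical loader fault case base by symptom-set similarity. The symptoms, causes and the Jaccard measure are all illustrative assumptions:

```python
# Hypothetical loader fault case base: observed symptoms -> diagnosed cause
CASE_BASE = [
    ({"low_hydraulic_pressure", "slow_boom_lift", "oil_leak"}, "worn hydraulic pump"),
    ({"engine_overheating", "coolant_loss"}, "radiator blockage"),
    ({"slow_boom_lift", "jerky_steering"}, "low hydraulic oil level"),
]

def retrieve(query_symptoms, case_base=CASE_BASE):
    """Case retrieval/matching: rank stored cases by Jaccard similarity of symptom sets."""
    def jaccard(a, b):
        return len(a & b) / len(a | b)
    ranked = sorted(case_base, key=lambda case: jaccard(query_symptoms, case[0]), reverse=True)
    best_symptoms, cause = ranked[0]
    return cause, jaccard(query_symptoms, best_symptoms)

print(retrieve({"slow_boom_lift", "low_hydraulic_pressure"}))
# A real system would follow with case updating: appending the confirmed new case to the base.
```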

  1. Ontology-Based Method for Fault Diagnosis of Loaders

    PubMed Central

    Liu, Xinhui; Chen, Wei; Zhou, Chen; Cao, Bingwei

    2018-01-01

    This paper proposes an ontology-based fault diagnosis method which overcomes the difficulty of understanding complex fault diagnosis knowledge of loaders and offers a universal approach for fault diagnosis of all loaders. This method contains the following components: (1) An ontology-based fault diagnosis model is proposed to achieve the integrating, sharing and reusing of fault diagnosis knowledge for loaders; (2) combined with ontology, CBR (case-based reasoning) is introduced to realize effective and accurate fault diagnoses following four steps (feature selection, case retrieval, case matching and case updating); and (3) in order to address the shortcomings of the CBR method when relevant cases are lacking, ontology-based RBR (rule-based reasoning) is put forward through building SWRL (Semantic Web Rule Language) rules. An application program is also developed to implement the above methods to assist in finding the fault causes, fault locations and maintenance measures of loaders. In addition, the program is validated through analyzing a case study. PMID:29495646

  2. Integrating Agricultural and Ecological Goals into the Management of Species-Rich Grasslands: Learning from the Flowering Meadows Competition in France

    NASA Astrophysics Data System (ADS)

    Magda, Danièle; de Sainte Marie, Christine; Plantureux, Sylvain; Agreil, Cyril; Amiaud, Bernard; Mestelan, Philippe; Mihout, Sarah

    2015-11-01

    Current agri-environmental schemes for reconciling agricultural production with biodiversity conservation are proving ineffective Europe-wide, increasing interest in results-based schemes (RBSs). We describe here the French "Flowering Meadows" competition, rewarding the "best agroecological balance" in semi-natural grasslands managed by livestock farmers. This competition, which was entered by about a thousand farmers in 50 regional nature parks between 2007 and 2014, explicitly promotes a new style of agri-environmental scheme focusing on an ability to reach the desired outcome rather than adherence to prescriptive management rules. Building on our experience in the design and monitoring of the competition, we argue that the cornerstone of successful RBSs is a collective learning process in which the reconciliation of agriculture and environment is reconsidered in terms of synergistic relationships between agricultural and ecological functioning. We present the interactive, iterative process by which we defined an original method for assessing species-rich grasslands in agroecological terms. This approach was based on the integration of new criteria, such as flexibility, feeding value, and consistency of use, into the assessment of forage production performance and the consideration of biodiversity conservation through its functional role within the grassland ecosystem, rather than simply noting the presence or abundance of species. We describe the adaptation of this methodology on the basis of competition feedback, to bring about a significant shift in the conventional working methods of agronomists and conservationists (including researchers). The potential and efficacy of RBSs for promoting ecologically sound livestock systems are discussed in the concluding remarks, in relation to the ecological intensification debate.

  3. Evaluation of medical students of teacher-based and student-based teaching methods in Infectious diseases course.

    PubMed

    Ghasemzadeh, I; Aghamolaei, T; Hosseini-Parandar, F

    2015-01-01

    Introduction: In recent years, medical education has changed dramatically and many medical schools around the world have been trying to expand modern training methods. The purpose of this research was to obtain medical students' appraisal of teacher-based and student-based teaching methods in the Infectious Diseases course at the Medical School of Hormozgan Medical Sciences University. Methods: In this interventional study, a total of 52 medical students enrolled in the Infectious Diseases course were included. About 50% of the course was presented by a teacher-based teaching method (lecture) and 50% by a student-based teaching method (problem-based learning). Student satisfaction with these methods was assessed by a questionnaire, and a test was used to measure their learning. Data were analyzed using SPSS 19 and paired t-tests. Results: Student satisfaction with the student-based teaching method (problem-based learning) was more positive than satisfaction with the teacher-based teaching method (lecture). The mean score of students under the teacher-based teaching method was 12.03 (SD=4.08) and under the student-based teaching method it was 15.50 (SD=4.26), a significant difference (p<0.001). Conclusion: The use of the student-based teaching method (problem-based learning), in comparison with the teacher-based teaching method (lecture), to present the Infectious Diseases course led to greater student satisfaction and provided additional learning opportunities.

  4. Comparison of biochemical parameters of rainbow trout (Oncorhynchus mykiss) reared in two different trout farms

    NASA Astrophysics Data System (ADS)

    Karatas, Tayfun

    2016-04-01

    The aim of this study was to compare biochemical parameters of cultured rainbow trout (Oncorhynchus mykiss, Walbaum, 1792) reared in two different trout farms (Agri and Erzurum). The average weights of the fish were 150±10 g for the first station (Agri) and 230±10 g for the second station (Erzurum). The fish used in the research were randomly caught from pools, and fifteen fish were used for each group. The fish were fed a commercial trout feed with 45-50% crude protein twice a day. The levels of AST, ALT, LDL, total cholesterol and triglyceride in the second station (Erzurum) were found to be higher (p<0.05) than those of the first station (Agri), whereas the levels of HDL in the second station (Erzurum) were found to be lower (p<0.05) than those of the first station (Agri). Differences in the levels of total cholesterol, AST, ALT, HDL, LDL and triglyceride may be associated with size, sex, sexual maturity and environmental conditions (temperature, pH, hardness and dissolved oxygen).

  5. OmniGen-AF supplementation modulated the physiological and acute phase responses of Brahman heifers to an endotoxin challenge

    USDA-ARS?s Scientific Manuscript database

    This study examined the effect of feeding OmniGen-AF (OG; Prince Agri Products) on the physiological and acute phase responses (APR) of newly-weaned heifers to an endotoxin (lipopolysaccharide; LPS) challenge. Brahman heifers (n=24; 183±5 kilograms) from the Texas AgriLife Research Center in Overton...

  6. Drug exposure in register-based research—An expert-opinion based evaluation of methods

    PubMed Central

    Taipale, Heidi; Koponen, Marjaana; Tolppanen, Anna-Maija; Hartikainen, Sirpa; Ahonen, Riitta; Tiihonen, Jari

    2017-01-01

    Background In register-based pharmacoepidemiological studies, construction of drug exposure periods from drug purchases is a major methodological challenge. Various methods have been applied but their validity is rarely evaluated. Our objective was to conduct an expert-opinion based evaluation of the correctness of drug use periods produced by different methods. Methods Drug use periods were calculated with three fixed methods: time windows, assumption of one Defined Daily Dose (DDD) per day and one tablet per day, and with PRE2DUP, which is based on modelling of individual drug purchasing behavior. Expert-opinion based evaluation was conducted with 200 randomly selected purchase histories of warfarin, bisoprolol, simvastatin, risperidone and mirtazapine in the MEDALZ-2005 cohort (28,093 persons with Alzheimer's disease). Two experts reviewed purchase histories and judged which methods had joined correct purchases and gave correct durations for each of 1000 drug exposure periods. Results The evaluated correctness of drug use periods was 70–94% for PRE2DUP and, depending on grace periods and time window lengths, 0–73% for tablet methods, 0–41% for DDD methods and 0–11% for time window methods. The highest rates of evaluated correct solutions within each method class were observed for 1 tablet per day with a 180-day grace period (TAB_1_180, 43–73%) and 1 DDD per day with a 180-day grace period (1–41%). Time window methods produced at most only 11% correct solutions. The best performing fixed method, TAB_1_180, reached its highest correctness for simvastatin at 73% (95% CI 65–81%), whereas 89% (95% CI 84–94%) of PRE2DUP periods were judged as correct. Conclusions This study shows the inaccuracy of fixed methods and the urgent need for new data-driven methods. In the expert-opinion based evaluation, the lowest error rates were observed with the data-driven method PRE2DUP. PMID:28886089
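
    As a rough illustration of the fixed comparator methods described above, the following minimal Python sketch joins purchases into drug-use periods under the "1 tablet per day with a grace period" rule (in the spirit of TAB_1_180); the input format and joining rule are assumptions, and PRE2DUP itself is a far more elaborate, data-driven model.

      from datetime import date, timedelta

      def exposure_periods(purchases, grace_days=180):
          """Join purchases into drug-use periods assuming 1 tablet per day.

          `purchases` is a date-sorted list of (purchase_date, n_tablets) tuples.
          A new period starts when a purchase falls after the previous supply
          end plus the grace period (sketch of a fixed method, not PRE2DUP)."""
          periods, start, end = [], None, None
          for day, tablets in purchases:
              supply_end = day + timedelta(days=int(tablets))      # 1 tablet/day
              if start is None or day > end + timedelta(days=grace_days):
                  if start is not None:
                      periods.append((start, end))
                  start, end = day, supply_end
              else:
                  end = max(end, supply_end)
          if start is not None:
              periods.append((start, end))
          return periods

      print(exposure_periods([(date(2010, 1, 1), 100),
                              (date(2010, 5, 1), 100),
                              (date(2011, 6, 1), 100)]))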

  7. Crack image segmentation based on improved DBC method

    NASA Astrophysics Data System (ADS)

    Cao, Ting; Yang, Nan; Wang, Fengping; Gao, Ting; Wang, Weixing

    2017-11-01

    With the development of computer vision technology, crack detection based on digital image segmentation has attracted wide attention among researchers and transportation ministries. Since cracks exhibit random shapes and complex textures, reliable crack detection remains a challenge. Therefore, a novel crack image segmentation method based on fractal DBC (differential box counting) is introduced in this paper. The proposed method estimates a fractal feature for every pixel based on neighborhood information, which considers the contribution from all possible directions in the related block. The block moves just one pixel at a time so that it covers all the pixels in the crack image. Unlike the classic DBC method, which only describes a fractal feature for the related region, this novel method can effectively achieve crack image segmentation according to the fractal feature of every pixel. Experiments show that the proposed method achieves satisfactory results in crack detection.
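
    As a rough illustration of the per-pixel differential box counting idea described above, the following minimal Python sketch (assuming NumPy) slides a window one pixel at a time and assigns the window's DBC fractal dimension to its centre pixel; the window size, box sizes and the final thresholding step are illustrative assumptions rather than the authors' exact settings.

      import numpy as np

      def dbc_dimension(window, sizes=(2, 3, 4, 6)):
          """Fractal dimension of a gray-level window by differential box counting."""
          M = window.shape[0]
          G = 256.0
          logN, logInvR = [], []
          for s in sizes:
              h = s * G / M                           # box height on the gray-level axis
              n_r = 0
              for i in range(0, M - s + 1, s):
                  for j in range(0, M - s + 1, s):
                      block = window[i:i + s, j:j + s]
                      n_r += int(block.max() // h) - int(block.min() // h) + 1
              logN.append(np.log(n_r))
              logInvR.append(np.log(M / s))           # 1/r = M/s
          slope, _ = np.polyfit(logInvR, logN, 1)     # slope estimates the dimension
          return slope

      def pixelwise_dbc(img, win=12):
          """Assign the DBC value of a sliding window to every pixel (one-pixel steps)."""
          pad = win // 2
          padded = np.pad(img.astype(float), pad, mode="reflect")
          out = np.zeros(img.shape, dtype=float)
          for y in range(img.shape[0]):
              for x in range(img.shape[1]):
                  out[y, x] = dbc_dimension(padded[y:y + win, x:x + win])
          return out   # thresholding this map would separate crack pixels from background

      rng = np.random.default_rng(1)
      patch = (rng.random((32, 32)) * 255).astype(np.uint8)   # toy texture patch
      print(pixelwise_dbc(patch, win=12).shape)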

  8. A shape-based inter-layer contours correspondence method for ICT-based reverse engineering

    PubMed Central

    Duan, Liming; Yang, Shangpeng; Zhang, Gui; Feng, Fei; Gu, Minghui

    2017-01-01

    The correspondence of a stack of planar contours in ICT (industrial computed tomography)-based reverse engineering, a key step in surface reconstruction, is difficult when the contours or topology of the object are complex. Given the regularity of industrial parts and similarity of the inter-layer contours, a specialized shape-based inter-layer contours correspondence method for ICT-based reverse engineering was presented to solve the above problem based on the vectorized contours. In this paper, the vectorized contours extracted from the slices consist of three graphical primitives: circles, arcs and segments. First, the correspondence of the inter-layer primitives is conducted based on the characteristics of the primitives. Second, based on the corresponded primitives, the inter-layer contours correspond with each other using the proximity rules and exhaustive search. The proposed method can make full use of the shape information to handle industrial parts with complex structures. The feasibility and superiority of this method have been demonstrated via the related experiments. This method can play an instructive role in practice and provide a reference for the related research. PMID:28489867

  9. A shape-based inter-layer contours correspondence method for ICT-based reverse engineering.

    PubMed

    Duan, Liming; Yang, Shangpeng; Zhang, Gui; Feng, Fei; Gu, Minghui

    2017-01-01

    The correspondence of a stack of planar contours in ICT (industrial computed tomography)-based reverse engineering, a key step in surface reconstruction, is difficult when the contours or topology of the object are complex. Given the regularity of industrial parts and similarity of the inter-layer contours, a specialized shape-based inter-layer contours correspondence method for ICT-based reverse engineering was presented to solve the above problem based on the vectorized contours. In this paper, the vectorized contours extracted from the slices consist of three graphical primitives: circles, arcs and segments. First, the correspondence of the inter-layer primitives is conducted based on the characteristics of the primitives. Second, based on the corresponded primitives, the inter-layer contours correspond with each other using the proximity rules and exhaustive search. The proposed method can make full use of the shape information to handle industrial parts with complex structures. The feasibility and superiority of this method have been demonstrated via the related experiments. This method can play an instructive role in practice and provide a reference for the related research.
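
    As a rough illustration of the primitive-correspondence step described in the two records above, the following minimal Python sketch matches circle primitives between two adjacent slices by centre proximity and radius similarity; the primitives, thresholds and greedy strategy are invented placeholders, and arcs, segments and the exhaustive contour-level search are not represented.

      import numpy as np

      # Hypothetical vectorised circle primitives per slice: (centre_x, centre_y, radius)
      layer_a = [(10.0, 12.0, 4.0), (40.0, 8.0, 2.5)]
      layer_b = [(40.5, 8.2, 2.4), (10.2, 11.7, 4.1)]

      def match_circles(upper, lower, max_centre_shift=2.0, max_radius_diff=0.5):
          """Greedy inter-layer correspondence of circles using proximity rules."""
          pairs, used = [], set()
          for i, (xa, ya, ra) in enumerate(upper):
              best, best_d = None, np.inf
              for j, (xb, yb, rb) in enumerate(lower):
                  if j in used:
                      continue
                  d = np.hypot(xa - xb, ya - yb)
                  if d < best_d and d <= max_centre_shift and abs(ra - rb) <= max_radius_diff:
                      best, best_d = j, d
              if best is not None:
                  pairs.append((i, best))
                  used.add(best)
          return pairs

      print(match_circles(layer_a, layer_b))   # -> [(0, 1), (1, 0)]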

  10. [Comparison of two algorithms for development of design space-overlapping method and probability-based method].

    PubMed

    Shao, Jing-Yuan; Qu, Hai-Bin; Gong, Xing-Chu

    2018-05-01

    In this work, two algorithms (the overlapping method and the probability-based method) for design space calculation were compared using data collected from the extraction process of Codonopsis Radix as an example. In the probability-based method, experimental error was simulated to calculate the probability of reaching the standard. The effects of several parameters on the calculated design space were studied, including the number of simulations, the step length, and the acceptable probability threshold. For the extraction process of Codonopsis Radix, 10,000 simulations and a calculation step length of 0.02 lead to a satisfactory design space. In general, the overlapping method is easy to understand and can be realized by several kinds of commercial software without writing programs, but it does not indicate the reliability of the process evaluation indexes when operating within the design space. The probability-based method is more complex to compute, but it provides this reliability by ensuring that the process indexes reach the standard with at least the acceptable probability. In addition, with the probability-based method there is no abrupt change in probability at the edge of the design space. Therefore, the probability-based method is recommended for design space calculation. Copyright© by the Chinese Pharmaceutical Association.
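
    As a rough illustration of the probability-based approach described above, the following minimal Python sketch simulates experimental error at each candidate operating point and keeps the points whose probability of reaching the standard exceeds the acceptable threshold; the response-surface model, error level, standard and grid are hypothetical, and a coarser grid with fewer simulations than the paper's 0.02 step and 10,000 runs is used to keep the toy example quick.

      import numpy as np

      rng = np.random.default_rng(42)

      def predicted_index(time_h, solvent_ratio):
          """Hypothetical response-surface model of an extraction quality index (%)."""
          return (40 + 12 * time_h + 8 * solvent_ratio
                  - 1.5 * time_h ** 2 - 0.4 * solvent_ratio ** 2)

      def prob_reaching_standard(time_h, solvent_ratio, standard=90.0,
                                 error_sd=2.0, n_sim=2_000):
          """Monte-Carlo probability that a simulated experiment reaches the standard."""
          sims = predicted_index(time_h, solvent_ratio) + rng.normal(0, error_sd, n_sim)
          return float((sims >= standard).mean())

      threshold = 0.90
      design_space = [(t, r)
                      for t in np.arange(1.0, 4.01, 0.25)
                      for r in np.arange(4.0, 10.01, 0.5)
                      if prob_reaching_standard(t, r) >= threshold]
      print(len(design_space), "grid points lie inside the design space")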

  11. Information fusion methods based on physical laws.

    PubMed

    Rao, Nageswara S V; Reister, David B; Barhen, Jacob

    2005-01-01

    We consider systems whose parameters satisfy certain easily computable physical laws. Each parameter is directly measured by a number of sensors, or estimated using measurements, or both. The measurement process may introduce both systematic and random errors which may then propagate into the estimates. Furthermore, the actual parameter values are not known since every parameter is measured or estimated, which makes the existing sample-based fusion methods inapplicable. We propose a fusion method for combining the measurements and estimators based on the least violation of physical laws that relate the parameters. Under fairly general smoothness and nonsmoothness conditions on the physical laws, we show the asymptotic convergence of our method and also derive distribution-free performance bounds based on finite samples. For suitable choices of the fuser classes, we show that for each parameter the fused estimate is probabilistically at least as good as its best measurement as well as best estimate. We illustrate the effectiveness of this method for a practical problem of fusing well-log data in methane hydrate exploration.
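
    As a rough illustration of fusing measurements by least violation of a physical law, the following minimal Python sketch (assuming NumPy and SciPy) adjusts independently measured voltage, current and resistance so that they stay close to the data while nearly satisfying Ohm's law; the law, the weights and the penalty formulation are illustrative choices, not the authors' fuser construction or performance bounds.

      import numpy as np
      from scipy.optimize import minimize

      # Independent measurements of voltage (V), current (A) and resistance (ohm)
      measured = np.array([12.6, 2.9, 4.7])
      weights = 1.0 / np.array([0.2, 0.1, 0.2]) ** 2       # inverse-variance weights

      def objective(x, lam=50.0):
          v, i, r = x
          fit = np.sum(weights * (x - measured) ** 2)      # stay close to the data
          law = (v - i * r) ** 2                           # violation of Ohm's law
          return fit + lam * law

      fused = minimize(objective, measured, method="Nelder-Mead").x
      print("fused V, I, R:", fused.round(3),
            " residual V - I*R:", round(fused[0] - fused[1] * fused[2], 4))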

  12. Modulation of the metabolic response to an endotoxin challenge in Brahman heifers through OmniGen-AF supplementation

    USDA-ARS?s Scientific Manuscript database

    This study examined the effect of feeding OmniGen-AF (OG; Prince Agri Products) on the metabolic response of newly-weaned heifers to an endotoxin (lipopolysaccharide; LPS) challenge. Brahman heifers (n=24; 183±5 kilograms) from the Texas AgriLife Research Center in Overton, TX, were separated into 2...

  13. Ultrasound body composition traits response to an endotoxin challenge in Brahman heifers supplemented with Omnigen-AF

    USDA-ARS?s Scientific Manuscript database

    This study examined the effect of feeding OmniGen-AF (OG; Prince Agri Products) on the body composition traits response of newly-weaned heifers to an endotoxin (lipopolysaccharide; LPS) challenge. Brahman heifers (n=24; 183 ± 5 kg) from the Texas AgriLife Research Center in Overton, TX, were separat...

  14. USSR Report, Agriculture

    DTIC Science & Technology

    1984-08-03

    range of foodstuffs. They improve the baking qualities of average grain. They are indispensable for the production of macaroni and confectionery ...million tons of marketable product per year. The material-technical base of USSR Goskomsel'khoztekhnika [State Committee of the Agricultural

  15. An XML-based method for astronomy software designing

    NASA Astrophysics Data System (ADS)

    Liao, Mingxue; Aili, Yusupu; Zhang, Jin

    An XML-based method for the standardization of software design is introduced and analyzed, and successfully applied to renovating the hardware and software of the digital clock at Urumqi Astronomical Station. The basic strategy for eliciting time information from the new digital clock FT206 in the antenna control program is introduced. With FT206, there is no need to compute how many centuries have passed since a certain day with sophisticated formulas, and it is no longer necessary to set the correct UT time on the computer controlling the antenna, because the year, month and day are all deduced from the Julian day held in FT206 rather than from the computer time. With an XML-based method and standard for software design, various existing design methods are unified, communication and collaboration between developers are facilitated, and an Internet-based mode of software development becomes possible. The trend of development of the XML-based design method is predicted.
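
    As an aside on the Julian-day point in the record above, the following minimal Python sketch converts a Julian day number (as read from a clock) to a UTC calendar date via the Unix-epoch Julian day 2440587.5; it is a generic illustration, not the FT206 interface.

      from datetime import datetime, timedelta, timezone

      UNIX_EPOCH_JD = 2440587.5    # Julian day of 1970-01-01 00:00 UT

      def julian_day_to_utc(jd):
          """Convert a Julian day number to a UTC datetime."""
          return datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(days=jd - UNIX_EPOCH_JD)

      print(julian_day_to_utc(2451545.0))   # J2000.0 -> 2000-01-01 12:00 UTC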

  16. The toxicity of glyphosate alone and glyphosate-surfactant mixtures to western toad (Anaxyrus boreas) tadpoles.

    PubMed

    Vincent, Kim; Davidson, Carlos

    2015-12-01

    Pesticide choice based on toxicity to nontarget wildlife is reliant on available toxicity data. Despite a number of recent studies examining the effects of glyphosate on amphibians, very few have aimed to understand the toxicological effects of glyphosate in combination with surfactants as it is commonly applied in the field. Land managers interested in making pesticide choices based on minimizing impacts to nontarget wildlife are hindered by a lack of published toxicity data. Short-term acute toxicity trials were conducted for glyphosate in the form of isopropylamine salt (IPA) alone and mixed with 2 surfactants: Agri-dex and Competitor with western toad (Anaxyrus [Bufo] boreas) tadpoles. Glyphosate IPA mixed with Competitor was 6 times more toxic than glyphosate IPA mixed with Agri-dex, and both mixtures were more toxic than glyphosate IPA alone. The median lethal concentrations reported for 24-h and 48-h exposures were 8279 mg/L (24 h) and 6392 mg/L (48 h) for glyphosate IPA alone; 5092 mg/L (24 h) and 4254 mg/L (48 h) for glyphosate IPA mixed with Agri-dex; and 853 mg/L (24 h) and 711 mg/L (48 h) for glyphosate IPA mixed with Competitor. The present study indicates that the toxicity of a tank mix may be greatly increased by the addition of surfactants and may vary widely depending on the specific surfactant. © 2015 SETAC.

  17. Agrileisure: Exploring the "fun" of local food

    Treesearch

    Ben Amsden; Jesse McEntee

    2012-01-01

    Farm-based agri-tourism, home-based hobby farming, rural/urban farmers markets, and community-supported agriculture (CSA) are all examples of agriculturally themed activities that, for many, include a measure of leisure and recreation. While people once saw the farm primarily as a space for work and production, a new generation is beginning to see farms as family-...

  18. Gel-based methods in redox proteomics.

    PubMed

    Charles, Rebecca; Jayawardhana, Tamani; Eaton, Philip

    2014-02-01

    The key to understanding the full significance of oxidants in health and disease is the development of tools and methods that allow the study of proteins that sense and transduce changes in cellular redox. Oxidant-reactive deprotonated thiols commonly operate as redox sensors in proteins and a variety of methods have been developed that allow us to monitor their oxidative modification. This outline review specifically focuses on gel-based methods used to detect, quantify and identify protein thiol oxidative modifications. The techniques we discuss fall into one of two broad categories. Firstly, methods that allow oxidation of thiols in specific proteins or the global cellular pool to be monitored are discussed. These typically utilise thiol-labelling reagents that add a reporter moiety (e.g. affinity tag, fluorophore, chromophore), in which loss of labelling signifies oxidation. Secondly, we outline methods that allow specific thiol oxidation states of proteins (e.g. S-sulfenylation, S-nitrosylation, S-thionylation and interprotein disulfide bond formation) to be investigated. A variety of different gel-based methods for identifying thiol proteins that are sensitive to oxidative modifications have been developed. These methods can aid the detection and quantification of thiol redox state, as well as identifying the sensor protein. By understanding how cellular redox is sensed and transduced to a functional effect by protein thiol redox sensors, this will help us better appreciate the role of oxidants in health and disease. This article is part of a Special Issue entitled Current methods to study reactive oxygen species - pros and cons and biophysics of membrane proteins. Guest Editor: Christine Winterbourn. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. Integrating a Microcomputer for Agriculture Data and Telecommunications into the Agriculture Classroom. Final Report.

    ERIC Educational Resources Information Center

    Northeast Arkansas Educational Cooperative, Strawberry.

    A program was conducted to integrate computer technology into agriculture classrooms in 25 secondary schools and 4 universities in Arkansas. State-of-the-art technology from the AgriData Network of Milwaukee was used. The AgriData Network includes the Ag Ed Network, which offers more than 800 "live textbook" lessons with current…

  20. Reconsidering Conceptualisations of Farm Conservation Activity: The Case of Conserving Hay Meadows

    ERIC Educational Resources Information Center

    Riley, Mark

    2006-01-01

    Changes to agri-environmental policy, with an emphasis on encouraging more environmentally friendly farming practices, have been paralleled in the last two decades by a body of research into agri-environment scheme adoption. To date much of this research has considered conservation behaviour as a static issue across whole farms, and viewed…

  1. Adaptive reconnection-based arbitrary Lagrangian Eulerian method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bo, Wurigen; Shashkov, Mikhail

    We present a new adaptive Arbitrary Lagrangian Eulerian (ALE) method. This method is based on the reconnection-based ALE (ReALE) methodology of Refs. [35], [34] and [6]. The main elements in a standard ReALE method are: an explicit Lagrangian phase on an arbitrary polygonal (in 2D) mesh in which the solution and positions of grid nodes are updated; a rezoning phase in which a new grid is defined by changing the connectivity (using Voronoi tessellation) but not the number of cells; and a remapping phase in which the Lagrangian solution is transferred onto the new grid. Furthermore, in the standard ReALE method, the rezoned mesh is smoothed by using one or several steps toward centroidal Voronoi tessellation, but it is not adapted to the solution in any way.

  2. Adaptive reconnection-based arbitrary Lagrangian Eulerian method

    DOE PAGES

    Bo, Wurigen; Shashkov, Mikhail

    2015-07-21

    We present a new adaptive Arbitrary Lagrangian Eulerian (ALE) method. This method is based on the reconnection-based ALE (ReALE) methodology of Refs. [35], [34] and [6]. The main elements in a standard ReALE method are: an explicit Lagrangian phase on an arbitrary polygonal (in 2D) mesh in which the solution and positions of grid nodes are updated; a rezoning phase in which a new grid is defined by changing the connectivity (using Voronoi tessellation) but not the number of cells; and a remapping phase in which the Lagrangian solution is transferred onto the new grid. Furthermore, in the standard ReALE method, the rezoned mesh is smoothed by using one or several steps toward centroidal Voronoi tessellation, but it is not adapted to the solution in any way.

  3. Lunar-base construction equipment and methods evaluation

    NASA Technical Reports Server (NTRS)

    Boles, Walter W.; Ashley, David B.; Tucker, Richard L.

    1993-01-01

    A process for evaluating lunar-base construction equipment and methods concepts is presented. The process is driven by the need for more quantitative, systematic, and logical methods for assessing further research and development requirements in an area where uncertainties are high, dependence upon terrestrial heuristics is questionable, and quantitative methods are seldom applied. Decision theory concepts are used in determining the value of accurate information and the process is structured as a construction-equipment-and-methods selection methodology. Total construction-related, earth-launch mass is the measure of merit chosen for mathematical modeling purposes. The work is based upon the scope of the lunar base as described in the National Aeronautics and Space Administration's Office of Exploration's 'Exploration Studies Technical Report, FY 1989 Status'. Nine sets of conceptually designed construction equipment are selected as alternative concepts. It is concluded that the evaluation process is well suited for assisting in the establishment of research agendas in an approach that is first broad, with a low level of detail, followed by more-detailed investigations into areas that are identified as critical due to high degrees of uncertainty and sensitivity.

  4. Monitoring of the risk of farmland abandonment as an efficient tool to assess the environmental and socio-economic impact of the Common Agriculture Policy

    NASA Astrophysics Data System (ADS)

    Milenov, Pavel; Vassilev, Vassil; Vassileva, Anna; Radkov, Radko; Samoungi, Vessela; Dimitrov, Zlatomir; Vichev, Nikola

    2014-10-01

    Farmland abandonment (FLA) can be defined as the cessation of agricultural activities on a given surface of land (Pointereau et al., 2008). FLA, often associated with social and economic problems in rural areas, has significant environmental consequences. During the 1990s, millions of hectares of farmland in the new EU Member States from Central and Eastern Europe were abandoned as a result of the transition process from a centralized, planned economy to a market economy. The policy tools adopted gradually within the Common Agricultural Policy of the European Union (EU CAP), as well as the EU environmental and structural policies, aimed to prevent further expansion of this phenomenon and to facilitate the revival of agricultural land that had been abandoned (ComReg 1122/2009). The Agri-Environment (AGRI-ENV) component of the Core Information Service (CIS), developed within the scope of the FP7-funded project "geoland2", was designed to support the agricultural user community at pan-European and national levels by contributing to more accurate and timely monitoring of the status of agricultural land use in Europe and its change. The purpose of the product 'Farmland abandonment', as part of the AGRI-ENV package, is to detect potentially abandoned agricultural land based on multi-annual SPOT data with several acquisitions per year. It provides essential independent information on the status of the agricultural land as recorded in the Land Parcel Identification System (LPIS), which is one of the core instruments of the implementation of the CAP. The production line is based on object-based image analysis and benefits from the extensive availability of biophysical parameters derived from the satellite data (geoland2). The method detects/tracks those land (or so-called reference) parcels in the LPIS holding a significant amount of agricultural land found to be potentially abandoned. Reference parcels with such change are flagged and reported, enabling the National

  5. Utility of Combining a Simulation-Based Method With a Lecture-Based Method for Fundoscopy Training in Neurology Residency.

    PubMed

    Gupta, Deepak K; Khandker, Namir; Stacy, Kristin; Tatsuoka, Curtis M; Preston, David C

    2017-10-01

    Fundoscopic examination is an essential component of the neurologic examination. Competence in its performance is mandated as a required clinical skill for neurology residents by the American Council of Graduate Medical Education. Government and private insurance agencies require its performance and documentation for moderate- and high-level neurologic evaluations. Traditionally, assessment and teaching of this key clinical examination technique have been difficult in neurology residency training. To evaluate the utility of a simulation-based method and the traditional lecture-based method for assessment and teaching of fundoscopy to neurology residents. This study was a prospective, single-blinded, education research study of 48 neurology residents recruited from July 1, 2015, through June 30, 2016, at a large neurology residency training program. Participants were equally divided into control and intervention groups after stratification by training year. Baseline and postintervention assessments were performed using questionnaire, survey, and fundoscopy simulators. After baseline assessment, both groups initially received lecture-based training, which covered fundamental knowledge on the components of fundoscopy and key neurologic findings observed on fundoscopic examination. The intervention group additionally received simulation-based training, which consisted of an instructor-led, hands-on workshop that covered practical skills of performing fundoscopic examination and identifying neurologically relevant findings on another fundoscopy simulator. The primary outcome measures were the postintervention changes in fundoscopy knowledge, skills, and total scores. A total of 30 men and 18 women were equally distributed between the 2 groups. The intervention group had significantly higher mean (SD) increases in skills (2.5 [2.3] vs 0.8 [1.8], P = .01) and total (9.3 [4.3] vs 5.3 [5.8], P = .02) scores compared with the control group. Knowledge scores (6.8 [3

  6. Full Spectrum Operations: A Running Start

    DTIC Science & Technology

    2009-03-31

    looking like nails. —MAJ Curt Taylor, S3 2-8 IN, Diwaniyah, Iraq, August 2006. To avoid the hammer and nails dynamic that may plague maneuver...Gasification System from Princeton Environmental Group; the AgriPower system, based on the "open" Brayton Cycle technology; and Thermogenics

  7. From Post-Productionism to Reflexive Governance: Contested Transitions in Securing More Sustainable Food Futures

    ERIC Educational Resources Information Center

    Marsden, Terry

    2013-01-01

    The paper critically assesses the more turbulent period in agri-food since 2007- by applying a transitions perspective to a range of empirical data collected from key private and public stakeholders in the UK during that period. It argues that increased volatility and a series of interdependent landscape pressures on the dominant agri-food regime…

  8. Challenges and issues concerning mycotoxins contamination in oil seeds and their edible oils: Updates from last decade.

    PubMed

    Bhat, Rajeev; Reddy, Kasa Ravindra Nadha

    2017-01-15

    Safety concerns pertaining to fungal occurrence and mycotoxin contamination in agri-food commodities have been an issue of high apprehension. With the increase in evidence-based research knowledge on the health effects posed by ingestion of mycotoxin-contaminated food and feed by humans and livestock, concerns have been raised about providing more insight through the screening of agri-food commodities to benefit consumers. Available reports indicate that the majority of edible oil-yielding seeds are contaminated by various fungi capable of producing mycotoxins. These mycotoxins can enter the human food chain via the use of edible oils or via animals fed with contaminated oil cake residues. In this review, we critically evaluate available data (from the past decade) pertaining to fungal occurrence and levels of mycotoxins in various oil seeds and their edible oils. This review can be of practical use in identifying the prevailing gaps, especially relevant to research on the presence of mycotoxins in edible plant-based oils. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Optimizing distance-based methods for large data sets

    NASA Astrophysics Data System (ADS)

    Scholl, Tobias; Brenner, Thomas

    2015-10-01

    Distance-based methods for measuring the spatial concentration of industries have received increasing popularity in the spatial econometrics community. However, a limiting factor for using these methods is their computational complexity, since both their memory requirements and running times are in O(n^2). In this paper, we present an algorithm with constant memory requirements and shorter running time, enabling distance-based methods to deal with large data sets. We discuss three recent distance-based methods in spatial econometrics: the D&O-Index by Duranton and Overman (Rev Econ Stud 72(4):1077-1106, 2005), the M-function by Marcon and Puech (J Econ Geogr 10(5):745-762, 2010) and the Cluster-Index by Scholl and Brenner (Reg Stud (ahead-of-print):1-15, 2014). Finally, we present an alternative calculation for the latter index that allows the use of data sets with millions of firms.
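
    As a rough illustration of the constant-memory idea described above, the following minimal Python sketch accumulates all pairwise inter-firm distances into a fixed-size histogram in a streaming pass, so memory does not grow with the number of firms; the running time here is still O(n^2), and the shorter running time claimed in the record is not reproduced in this toy.

      import numpy as np

      def distance_histogram(coords, bin_width=1.0, max_dist=50.0):
          """Bin all pairwise planar distances without storing the n(n-1)/2 list."""
          n_bins = int(max_dist / bin_width)
          hist = np.zeros(n_bins)
          for i in range(len(coords) - 1):
              d = np.hypot(*(coords[i + 1:] - coords[i]).T)   # distances to later firms
              idx = np.minimum((d / bin_width).astype(int), n_bins - 1)
              np.add.at(hist, idx, 1)
          return hist

      # Toy usage: 5,000 random firm locations on a 50 x 50 plane
      rng = np.random.default_rng(3)
      firms = rng.uniform(0, 50, size=(5000, 2))
      h = distance_histogram(firms)
      print(int(h.sum()), "pairs binned; modal distance bin:", int(h.argmax()))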

  10. A Channelization-Based DOA Estimation Method for Wideband Signals

    PubMed Central

    Guo, Rui; Zhang, Yue; Lin, Qianqiang; Chen, Zengping

    2016-01-01

    In this paper, we propose a novel direction of arrival (DOA) estimation method for wideband signals with sensor arrays. The proposed method splits the wideband array output into multiple frequency sub-channels and estimates the signal parameters using a digital channelization receiver. Based on the output sub-channels, a channelization-based incoherent signal subspace method (Channelization-ISM) and a channelization-based test of orthogonality of projected subspaces method (Channelization-TOPS) are proposed. Channelization-ISM applies narrowband signal subspace methods on each sub-channel independently; the arithmetic or geometric mean of the DOAs estimated from each sub-channel then gives the final result. Channelization-TOPS measures the orthogonality between the signal and the noise subspaces of the output sub-channels to estimate DOAs. The proposed channelization-based method reasonably isolates signals in different bands and improves the output SNR. It outperforms the conventional ISM and TOPS methods in estimation accuracy and dynamic range, especially in real environments. In addition, the parallel processing architecture makes it easy to implement in hardware. A wideband digital array radar (DAR) using direct wideband radio frequency (RF) digitization is presented. Experiments carried out in a microwave anechoic chamber with the wideband DAR are presented to demonstrate the performance. The results verify the effectiveness of the proposed method. PMID:27384566
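
    As a rough illustration of the incoherent channelization idea (in the spirit of Channelization-ISM), the following minimal Python sketch runs narrowband MUSIC on each simulated sub-channel of a uniform linear array and sums the pseudo-spectra before picking the peak; the array geometry, sub-channel frequencies, single-source assumption and the spectrum-summing step (rather than averaging per-channel DOA estimates) are illustrative choices, not the authors' receiver.

      import numpy as np

      c = 3e8
      N = 8                                    # sensors in a uniform linear array
      freqs = np.linspace(1.0e9, 1.4e9, 5)     # sub-channel centre frequencies (Hz)
      d = c / (2 * freqs.max())                # half-wavelength spacing at the top band
      true_doa = np.deg2rad(20.0)
      snapshots = 200
      rng = np.random.default_rng(7)

      def steering(theta, f):
          return np.exp(-2j * np.pi * f * d * np.sin(theta) * np.arange(N) / c)

      grid = np.deg2rad(np.arange(-90, 90.5, 0.5))
      spectrum = np.zeros(grid.size)

      for f in freqs:                          # one narrowband problem per sub-channel
          s = (rng.normal(size=snapshots) + 1j * rng.normal(size=snapshots)) / np.sqrt(2)
          noise = 0.1 * (rng.normal(size=(N, snapshots)) + 1j * rng.normal(size=(N, snapshots)))
          X = np.outer(steering(true_doa, f), s) + noise
          R = X @ X.conj().T / snapshots       # sample covariance of the sub-channel
          _, eigvec = np.linalg.eigh(R)
          En = eigvec[:, :-1]                  # noise subspace (one source assumed)
          for k, th in enumerate(grid):        # accumulate MUSIC pseudo-spectra
              a = steering(th, f)
              spectrum[k] += 1.0 / np.abs(a.conj() @ En @ En.conj().T @ a)

      print("estimated DOA:", np.rad2deg(grid[spectrum.argmax()]), "deg")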

  11. Fine Increment Soil Collector (FISC): A new device to support high resolution soil and sediment sampling for agri-environmental assessments

    NASA Astrophysics Data System (ADS)

    Mabit, Lionel; Meusburger, Katrin; Iurian, Andra-Rada; Owens, Philip N.; Toloza, Arsenio; Alewell, Christine

    2014-05-01

    Soil and sediment related research for terrestrial agri-environmental assessments requires accurate depth-incremental sampling of soil and exposed sediment profiles. Existing coring equipment does not allow collecting soil/sediment increments at millimetre resolution. Therefore, the authors have designed an economic, portable, hand-operated surface soil/sediment sampler - the Fine Increment Soil Collector (FISC) - which allows extensive control of the soil/sediment sampling process and easy recovery of the collected material using a simple screw-thread extraction system. In comparison with existing sampling tools, the FISC has the following advantages and benefits: (i) it permits sampling of soil/sediment at the top of the profile; (ii) it is easy to adjust so as to collect soil/sediment at mm resolution; (iii) it is simple to operate by a single person; (iv) incremental samples can be taken in the field or at the laboratory; (v) it permits precise evaluation of bulk density at millimetre vertical resolution; and (vi) sample size can be tailored to analytical requirements. To illustrate the usefulness of the FISC in sampling soil and sediments for 7Be - a well-known cosmogenic soil tracer and fingerprinting tool - measurements, the sampler was tested in a forested soil located 45 km southeast of Vienna in Austria. The fine-resolution increments of 7Be (i.e. 2.5 mm) directly affect the measurement of the 7Be total inventory but, above all, impact the shape of the 7Be exponential profile, which is needed to assess soil movement rates. The FISC can improve the determination of the depth distributions of other Fallout Radionuclides (FRN) - such as 137Cs, 210Pbex and 239+240Pu - which are frequently used for soil erosion and sediment transport studies and/or sediment fingerprinting. Such a device also offers great potential to investigate FRN depth distributions associated with fallout events such as those associated with nuclear emergencies. Furthermore, prior

  12. Propensity Score-Based Methods versus MTE-Based Methods in Causal Inference: Identification, Estimation, and Application

    ERIC Educational Resources Information Center

    Zhou, Xiang; Xie, Yu

    2016-01-01

    Since the seminal introduction of the propensity score (PS) by Rosenbaum and Rubin, PS-based methods have been widely used for drawing causal inferences in the behavioral and social sciences. However, the PS approach depends on the ignorability assumption: there are no unobserved confounders once observed covariates are taken into account. For…

  13. Registration of sorghum germplasm Tx3408 and Tx3409 with tolerance to sugarcane aphid [Melanaphis sacchari (Zehntner)]

    USDA-ARS?s Scientific Manuscript database

    The sorghum (Sorghum bicolor L. Moench) germplasm lines Tx3408 and Tx3409 were developed and released from Texas A&M AgriLife Research and the USDA-ARS in 2015. Both of these lines were developed from intentional crosses using the pedigree method of plant breeding. The breeding crosses for these l...

  14. A flower image retrieval method based on ROI feature.

    PubMed

    Hong, An-Xiang; Chen, Gang; Li, Jun-Li; Chi, Zhe-Ru; Zhang, Dan

    2004-07-01

    Flower image retrieval is a very important step for computer-aided plant species recognition. In this paper, we propose an efficient segmentation method based on color clustering and domain knowledge to extract flower regions from flower images. For flower retrieval, we use the color histogram of a flower region to characterize the color features of the flower and two shape-based feature sets, Centroid-Contour Distance (CCD) and Angle Code Histogram (ACH), to characterize the shape features of a flower contour. Experimental results showed that our flower region extraction method based on color clustering and domain knowledge can produce accurate flower regions. Flower retrieval results on a database of 885 flower images collected from 14 plant species showed that our Region-of-Interest (ROI) based retrieval approach using both color and shape features can perform better than a method based on the global color histogram proposed by Swain and Ballard (1991) and a method based on domain knowledge-driven segmentation and color names proposed by Das et al. (1999).
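
    As a rough illustration of the Centroid-Contour Distance feature mentioned above, the following minimal Python sketch computes a binned, scale-normalised CCD signature from a closed contour; the binning and normalisation details are illustrative assumptions rather than the authors' exact descriptor, and the colour histogram and ACH features are not shown.

      import numpy as np

      def centroid_contour_distance(contour, n_bins=36):
          """Binned Centroid-Contour Distance (CCD) signature of a closed contour.

          `contour` is an (n, 2) array of boundary points; distances from the
          centroid are averaged per angular bin and scale-normalised."""
          centroid = contour.mean(axis=0)
          rel = contour - centroid
          dist = np.hypot(rel[:, 0], rel[:, 1])
          angle = np.arctan2(rel[:, 1], rel[:, 0])           # -pi .. pi
          bins = ((angle + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
          ccd = np.zeros(n_bins)
          for b in range(n_bins):
              if np.any(bins == b):
                  ccd[b] = dist[bins == b].mean()
          return ccd / ccd.max()                             # scale invariance

      # Toy usage: a five-petalled flower-like contour
      t = np.linspace(0, 2 * np.pi, 360, endpoint=False)
      r = 1.0 + 0.3 * np.cos(5 * t)
      contour = np.c_[r * np.cos(t), r * np.sin(t)]
      print(centroid_contour_distance(contour).round(2))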

  15. Thyrotoxicosis induced by iodine contamination of food--a common unrecognised condition?

    PubMed Central

    Stewart, J C; Vidor, G I

    1976-01-01

    The incidence of thyrotoxicosis in northern Tasmania rose significantly in 1964, two years before an epidemic of iodine-induced thyrotoxicosis was precipitated by the addition of iodate to bread to prevent goitre. Each time older patients accounted for most of the increase. The 1964 increase was probably iodine-induced, as the use of iodophor disinfectants on dairy farms, which causes iodine residues in milk, began in 1963, and a fall in the prevalence of goitre in young children suggested an increase in dietary iodine at about that time. A further small increase in thyrotoxicosis in 1971 may also have been iodine-induced, as it followed an extension of the use of iodophors. Dietary iodine is rising substantially in many places because of high iodine levels in milk and the use of iodine compounds in automated bread making, and this may be causing unsuspected iodine-induced thyrotoxicosis. Dietary iodine should be monitored regularly and clinicians alerted to any rise. Contamination of common foods with iodine should be more strictly controlled. PMID:946162

  16. Risk/Benefit Communication about Food-A Systematic Review of the Literature.

    PubMed

    Frewer, L J; Fischer, A R H; Brennan, M; Bánáti, D; Lion, R; Meertens, R M; Rowe, G; Siegrist, M; Verbeke, W; Vereijken, C M J L

    2016-07-26

    A systematic review relevant to the following research questions was conducted: (1) the extent to which different theoretical frameworks have been applied to food risk/benefit communication and (2) the impact such food risk/benefit communication interventions have had on related risk/benefit attitudes and behaviors. Fifty-four papers were identified. The analysis revealed that (primarily European or US) research interest has been relatively recent. Certain food issues were of greater interest to researchers than others, perhaps reflecting the occurrence of a crisis, or policy concern. Three broad themes relevant to the development of best practice in risk (benefit) communication were identified: the characteristics of the target population; the contents of the information; and the characteristics of the information sources. Within these themes, independent and dependent variables differed considerably. Overall, acute risk (benefit) communication will require advances in the communication process, whereas chronic communication needs to identify audience requirements. Both citizens' risk/benefit perceptions and (if relevant) related behaviors need to be taken into account, and recommendations for behavioral change need to be concrete and actionable. The application of theoretical frameworks to the study of risk (benefit) communication was infrequent, and developing predictive models of effective risk (benefit) communication may be contingent on improved theoretical perspectives.

  17. Consumer acceptance of and willingness to pay for food nanotechnology: a systematic review

    NASA Astrophysics Data System (ADS)

    Giles, Emma L.; Kuznesof, Sharron; Clark, Beth; Hubbard, Carmen; Frewer, Lynn J.

    2015-12-01

    Consumers' attitudes to, and acceptance of, emerging technologies and their applications are important determinants of their successful implementation and commercialisation. Understanding the range of socio-psychological, cultural and affective factors which may influence consumer responses to applications of nanotechnology will help "fine-tune" the development of consumer products in line with their expectations and preferences. This is particularly true of applications in the food area, where consumer concerns about technologies applied to food production may be elevated. This research applied systematic review methodology to synthesise current knowledge regarding societal acceptance or rejection of nanotechnology applied to agri-food production. The objective was to aggregate knowledge derived from different research areas to gain an overall picture of consumer responses to nanotechnology applied to food production. Relevant electronic databases of peer-reviewed literature were searched from the earliest date available, for peer-reviewed papers which reported primary empirical data on consumer and expert acceptance of agri-food nanotechnology, using a formal systematic review protocol. Inclusion criteria for papers to be included in the review were: empirical peer-reviewed papers written in English; a population sample of adults aged 18 years and over used in the research; a research focus on consumer and expert acceptance of agri-food nanotechnology; and research on attitudes towards, and willingness to pay for, different applications of agri-food nanotechnology. Two researchers independently appraised the papers using NVivo 10 QSR software. Studies examining consumer and expert acceptance were thematically analysed, and key information was collated. The results were synthesised in order to identify trends in information relevant to consumer acceptance of nanotechnology applied to food production. Eight key themes were identified from the 32 papers which were

  18. Consumer acceptance of and willingness to pay for food nanotechnology: a systematic review.

    PubMed

    Giles, Emma L; Kuznesof, Sharron; Clark, Beth; Hubbard, Carmen; Frewer, Lynn J

    Consumers' attitudes to, and acceptance of, emerging technologies and their applications are important determinants of their successful implementation and commercialisation. Understanding the range of socio-psychological, cultural and affective factors which may influence consumer responses to applications of nanotechnology will help "fine-tune" the development of consumer products in line with their expectations and preferences. This is particularly true of applications in the food area, where consumer concerns about technologies applied to food production may be elevated. This research applied systematic review methodology to synthesise current knowledge regarding societal acceptance or rejection of nanotechnology applied to agri-food production. The objective was to aggregate knowledge derived from different research areas to gain an overall picture of consumer responses to nanotechnology applied to food production. Relevant electronic databases of peer-reviewed literature were searched from the earliest date available, for peer-reviewed papers which reported primary empirical data on consumer and expert acceptance of agri-food nanotechnology, using a formal systematic review protocol. Inclusion criteria for papers to be included in the review were: empirical peer-reviewed papers written in English; a population sample of adults aged 18 years and over used in the research; a research focus on consumer and expert acceptance of agri-food nanotechnology; and research on attitudes towards, and willingness to pay for, different applications of agri-food nanotechnology. Two researchers independently appraised the papers using NVivo 10 QSR software. Studies examining consumer and expert acceptance were thematically analysed, and key information was collated. The results were synthesised in order to identify trends in information relevant to consumer acceptance of nanotechnology applied to food production. Eight key themes were identified from the 32 papers which were

  19. DNA-Based Methods in the Immunohematology Reference Laboratory

    PubMed Central

    Denomme, Gregory A

    2010-01-01

    Although hemagglutination serves the immunohematology reference laboratory well, when used alone, it has limited capability to resolve complex problems. This overview discusses how molecular approaches can be used in the immunohematology reference laboratory. In order to apply molecular approaches to immunohematology, knowledge of genes, DNA-based methods, and the molecular bases of blood groups are required. When applied correctly, DNA-based methods can predict blood groups to resolve ABO/Rh discrepancies, identify variant alleles, and screen donors for antigen-negative units. DNA-based testing in immunohematology is a valuable tool used to resolve blood group incompatibilities and to support patients in their transfusion needs. PMID:21257350

  20. From Words to Deeds: Enforcing Farmers' Conservation Cost-Sharing Commitments

    ERIC Educational Resources Information Center

    Marshall, Graham R.

    2004-01-01

    Compliance with many agri-environmental programs fails to meet expectations due to enforcement difficulties. One response has been devolution of enforcement rights and responsibilities to industry organisations. This kind of response is based on a belief that farmers cooperate more with their industry organisations than with government in ensuring…

  1. Optimization of the gypsum-based materials by the sequential simplex method

    NASA Astrophysics Data System (ADS)

    Doleželová, Magdalena; Vimmrová, Alena

    2017-11-01

    The application of the sequential simplex optimization method to the design of gypsum-based materials is described. The principles of the simplex method are explained and several examples of its use for the optimization of lightweight gypsum and ternary gypsum-based materials are given. By this method, lightweight gypsum-based materials with desired properties and a ternary gypsum-based material with higher strength (16 MPa) were successfully developed. The simplex method is a useful tool for optimizing gypsum-based materials, but the objective of the optimization has to be formulated appropriately.
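
    The sequential simplex method in the record above is an experimental optimization strategy; a closely related numerical analogue is the Nelder-Mead simplex search, sketched below in Python (assuming NumPy and SciPy) for a purely hypothetical mix-proportion objective. The property model, variables and starting point are invented for illustration and are not the authors' gypsum formulations.

      import numpy as np
      from scipy.optimize import minimize

      def predicted_strength(x):
          """Hypothetical model of compressive strength (MPa) of a gypsum-based
          mix as a function of two additive fractions (illustrative only)."""
          a, b = x
          return 10 + 30 * a + 18 * b - 45 * a ** 2 - 20 * b ** 2 - 15 * a * b

      # The simplex search moves a triangle of trial mixes through composition space,
      # replacing the worst vertex at each step (reflection/expansion/contraction).
      res = minimize(lambda x: -predicted_strength(x), x0=[0.1, 0.1], method="Nelder-Mead")
      print("best mix:", res.x.round(3), " predicted strength:", round(-res.fun, 1), "MPa")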

  2. Deterministic and fuzzy-based methods to evaluate community resilience

    NASA Astrophysics Data System (ADS)

    Kammouh, Omar; Noori, Ali Zamani; Taurino, Veronica; Mahin, Stephen A.; Cimellaro, Gian Paolo

    2018-04-01

    Community resilience is becoming a growing concern for authorities and decision makers. This paper introduces two indicator-based methods to evaluate the resilience of communities based on the PEOPLES framework. PEOPLES is a multi-layered framework that defines community resilience using seven dimensions. Each of the dimensions is described through a set of resilience indicators collected from the literature, and each indicator is linked to a measure allowing the analytical computation of its performance. The first method proposed in this paper requires data on previous disasters as input and returns as output a performance function for each indicator and a performance function for the whole community. The second method exploits knowledge-based fuzzy modeling for its implementation. This method allows a quantitative evaluation of the PEOPLES indicators using descriptive knowledge rather than deterministic data, while accounting for the uncertainty involved in the analysis. The output of the fuzzy-based method is a resilience index for each indicator as well as a resilience index for the community. The paper also introduces an open-source online tool in which the first method is implemented. A case study illustrating the application of the first method and the usage of the tool is also provided in the paper.
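
    As a rough illustration of the knowledge-based (fuzzy) variant described above, the following minimal Python sketch maps linguistic indicator ratings to triangular fuzzy numbers, aggregates them with weights and defuzzifies the result into a single index; the ratings, weights and membership shapes are hypothetical and do not reproduce the PEOPLES indicator set or the paper's fuzzy model.

      # Triangular fuzzy numbers (low, mode, high) for linguistic ratings on a 0-100 scale
      FUZZY_RATING = {
          "poor":      (0, 15, 35),
          "moderate":  (25, 50, 75),
          "good":      (60, 80, 95),
          "excellent": (85, 95, 100),
      }

      # Hypothetical expert judgements for a few indicators, with weights summing to 1
      assessment = [("hospital capacity", "good",      0.3),
                    ("lifeline networks", "moderate",  0.4),
                    ("social services",   "excellent", 0.3)]

      def weighted_fuzzy_index(items):
          """Weighted sum of triangular fuzzy numbers, then centroid defuzzification."""
          lo = sum(w * FUZZY_RATING[r][0] for _, r, w in items)
          md = sum(w * FUZZY_RATING[r][1] for _, r, w in items)
          hi = sum(w * FUZZY_RATING[r][2] for _, r, w in items)
          return (lo + md + hi) / 3.0          # centroid of a triangular fuzzy number

      print("community resilience index:", round(weighted_fuzzy_index(assessment), 1))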

  3. Evaluation of a physically based quasi-linear and a conceptually based nonlinear Muskingum methods

    NASA Astrophysics Data System (ADS)

    Perumal, Muthiah; Tayfur, Gokmen; Rao, C. Madhusudana; Gurarslan, Gurhan

    2017-03-01

    Two variants of the Muskingum flood routing method formulated to account for the nonlinearity of the channel routing process are investigated in this study. These variant methods are: (1) the three-parameter conceptual Nonlinear Muskingum (NLM) method advocated by Gill in 1978, and (2) the Variable Parameter McCarthy-Muskingum (VPMM) method recently proposed by Perumal and Price in 2013. The VPMM method does not require rigorous calibration and validation procedures as required in the case of the NLM method, due to established relationships of its parameters with flow and channel characteristics based on hydrodynamic principles. The parameters of the conceptual nonlinear storage equation used in the NLM method were calibrated using Artificial Intelligence Application (AIA) techniques, such as the Genetic Algorithm (GA), Differential Evolution (DE), Particle Swarm Optimization (PSO) and Harmony Search (HS). The calibration was carried out on a given set of hypothetical flood events obtained by routing a given inflow hydrograph in a set of 40 km long prismatic channel reaches using the Saint-Venant (SV) equations. The validation of the calibrated NLM method was investigated using a different set of hypothetical flood hydrographs obtained in the same set of channel reaches used for the calibration studies. Both sets of solutions obtained in the calibration and validation cases using the NLM method were compared with the corresponding solutions of the VPMM method based on some pertinent evaluation measures. The results of the study reveal that the physically based VPMM method is capable of accounting for the nonlinear characteristics of flood wave movement better than the conceptually based NLM method, which requires the use of tedious calibration and validation procedures.
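
    As a rough illustration of the three-parameter nonlinear Muskingum storage relation S = K[xI + (1-x)O]^m used by the NLM method above, the following minimal Python sketch routes a toy hydrograph with one common explicit state-space scheme; the parameter values and the simple Euler update are illustrative, not the calibrated values or the VPMM formulation from the record.

      import numpy as np

      def route_nlm(inflow, K, x, m, dt=1.0):
          """Route an inflow hydrograph with S = K*[x*I + (1-x)*O]**m (sketch)."""
          outflow = np.zeros_like(inflow, dtype=float)
          outflow[0] = inflow[0]                                # initial steady state
          S = K * (x * inflow[0] + (1 - x) * outflow[0]) ** m   # initial storage
          for t in range(len(inflow) - 1):
              outflow[t] = ((S / K) ** (1.0 / m) - x * inflow[t]) / (1 - x)
              S += dt * (inflow[t] - outflow[t])                # continuity equation
          outflow[-1] = ((S / K) ** (1.0 / m) - x * inflow[-1]) / (1 - x)
          return outflow

      # Toy triangular flood wave (m3/s), hourly time step, illustrative parameters
      inflow = np.concatenate([np.linspace(20, 200, 10), np.linspace(200, 20, 20)])
      print(route_nlm(inflow, K=0.9, x=0.25, m=1.2).round(1))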

  4. Web-Based Training Methods for Behavioral Health Providers: A Systematic Review.

    PubMed

    Jackson, Carrie B; Quetsch, Lauren B; Brabson, Laurel A; Herschell, Amy D

    2018-07-01

    There has been an increase in the use of web-based training methods to train behavioral health providers in evidence-based practices. This systematic review focuses solely on the efficacy of web-based training methods for training behavioral health providers. A literature search yielded 45 articles meeting inclusion criteria. Results indicated that the serial instruction training method was the most commonly studied web-based training method. While the current review has several notable limitations, findings indicate that participating in a web-based training may result in greater post-training knowledge and skill, in comparison to baseline scores. Implications and recommendations for future research on web-based training methods are discussed.

  5. Qualitative Assessment of Inquiry-Based Teaching Methods

    ERIC Educational Resources Information Center

    Briggs, Michael; Long, George; Owens, Katrina

    2011-01-01

    A new approach to teaching method assessment using student focused qualitative studies and the theoretical framework of mental models is proposed. The methodology is considered specifically for the advantages it offers when applied to the assessment of inquiry-based teaching methods. The theoretical foundation of mental models is discussed, and…

  6. Object Recognition using Feature- and Color-Based Methods

    NASA Technical Reports Server (NTRS)

    Duong, Tuan; Duong, Vu; Stubberud, Allen

    2008-01-01

    An improved adaptive method of processing image data in an artificial neural network has been developed to enable automated, real-time recognition of possibly moving objects under changing (including suddenly changing) conditions of illumination and perspective. The method involves a combination of two prior object-recognition methods (one based on adaptive detection of shape features and one based on adaptive color segmentation) to enable recognition in situations in which either prior method by itself may be inadequate. The chosen prior feature-based method is known as adaptive principal-component analysis (APCA); the chosen prior color-based method is known as adaptive color segmentation (ACOSE). These methods are made to interact with each other in a closed-loop system to obtain an optimal solution of the object-recognition problem in a dynamic environment. One of the results of the interaction is to increase, beyond what would otherwise be possible, the accuracy of the determination of a region of interest (containing an object that one seeks to recognize) within an image. Another result is to provide a minimized adaptive step that can be used to update the results obtained by the two component methods when changes of color and apparent shape occur. The net effect is to enable the neural network to update its recognition output and improve its recognition capability via an adaptive learning sequence. In principle, the improved method could readily be implemented in integrated circuitry to make a compact, low-power, real-time object-recognition system. It has been proposed to demonstrate the feasibility of such a system by integrating a 256-by-256 active-pixel sensor with APCA, ACOSE, and neural processing circuitry on a single chip. It has been estimated that such a system on a chip would have a volume no larger than a few cubic centimeters, could operate at a rate as high as 1,000 frames per second, and would consume on the order of milliwatts of power.

  7. From daily to sub-daily time steps - Creating a high temporal and spatial resolution climate reference data set for hydrological modeling and bias-correction of RCM data

    NASA Astrophysics Data System (ADS)

    Willkofer, Florian; Wood, Raul R.; Schmid, Josef; von Trentini, Fabian; Ludwig, Ralf

    2016-04-01

    The ClimEx project (Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec) focuses on the effects of climate change on hydro-meteorological extreme events and their implications for water management in Bavaria and Québec. It builds on the conjoint analysis of a large ensemble of the CRCM5, driven by 50 members of the CanESM2, and the latest information provided through the CORDEX-initiative, to better assess the influence of natural climate variability and climatic change on the dynamics of extreme events. A critical point in the entire project is the preparation of a meteorological reference dataset with the required temporal (1-6h) and spatial (500m) resolution to be able to better evaluate hydrological extreme events in mesoscale river basins. For Bavaria a first reference data set (daily, 1km) used for bias-correction of RCM data was created by combining raster based data (E-OBS [1], HYRAS [2], MARS [3]) and interpolated station data using the meteorological interpolation schemes of the hydrological model WaSiM [4]. Apart from the coarse temporal and spatial resolution, this mosaic of different data sources is considered rather inconsistent and hence, not applicable for modeling of hydrological extreme events. Thus, the objective is to create a dataset with hourly data of temperature, precipitation, radiation, relative humidity and wind speed, which is then used for bias-correction of the RCM data being used as driver for hydrological modeling in the river basins. Therefore, daily data is disaggregated to hourly time steps using the 'Method of fragments' approach [5], based on available training stations. The disaggregation chooses fragments of daily values from observed hourly datasets, based on similarities in magnitude and behavior of previous and subsequent events. The choice of a certain reference station (hourly data, provision of fragments) for disaggregating daily station data (application
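
    A minimal Python sketch of the "method of fragments" idea, assuming hourly observations are available at a training station: the hourly pattern of the most similar observed day (here judged by daily magnitude only, ignoring the preceding/subsequent-event behaviour used in the project) is rescaled so that it sums to the daily value being disaggregated. All data and names are illustrative.

        import numpy as np

        def disaggregate_daily(daily_value, candidate_days_hourly, candidate_daily_totals):
            """Pick the hourly 'fragments' of the most similar observed day
            (by daily magnitude) and rescale them to match the daily value."""
            candidate_daily_totals = np.asarray(candidate_daily_totals, dtype=float)
            idx = np.argmin(np.abs(candidate_daily_totals - daily_value))
            fragments = np.asarray(candidate_days_hourly[idx], dtype=float)  # 24 hourly values
            total = fragments.sum()
            if total == 0:
                return np.full(24, daily_value / 24.0)   # fall back to a uniform split
            return daily_value * fragments / total       # preserves the daily sum

        # Example: observed hourly precipitation for two training days
        obs_hourly = [np.r_[np.zeros(20), [1.0, 2.0, 1.0, 0.0]],
                      np.r_[np.zeros(10), [0.5] * 14]]
        obs_daily = [h.sum() for h in obs_hourly]
        print(disaggregate_daily(3.5, obs_hourly, obs_daily))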

  8. International Agri-Awareness.

    ERIC Educational Resources Information Center

    Indiana State Dept. of Education, Indianapolis.

    Technological progress in communication and transportation within the past 30 years has made many areas of the world accessible, as well as interdependent. Failure to understand the concept of interdependency greatly diminishes the potential of all nations and each citizen to appreciate this world community. This agricultural awareness program…

  9. A Quantum-Based Similarity Method in Virtual Screening.

    PubMed

    Al-Dabbagh, Mohammed Mumtaz; Salim, Naomie; Himmat, Mubarak; Ahmed, Ali; Saeed, Faisal

    2015-10-02

    One of the most widely used techniques for ligand-based virtual screening is similarity searching. This study adopted concepts from quantum mechanics to present a state-of-the-art molecular similarity method inspired by quantum theory. The representation of molecular compounds in a mathematical quantum space plays a vital role in the development of the quantum-based similarity approach. One of the key concepts of quantum theory is the use of complex numbers; hence, this study proposed three techniques to embed and re-represent molecular compounds in complex-number format. The quantum-based similarity method developed in this study relies on a complex pure Hilbert space of molecules and is called Standard Quantum-Based (SQB). The recall of retrieved active molecules was measured at the top 1% and top 5%, and significance tests were used to evaluate the proposed methods. The MDL Drug Data Report (MDDR), Maximum Unbiased Validation (MUV) and Directory of Useful Decoys (DUD) data sets were used for the experiments and were represented by 2D fingerprints. Simulated virtual screening experiments show that the effectiveness of the SQB method was significantly increased, owing to the representational power of molecular compounds in complex-number form, compared with the Tanimoto benchmark similarity measure.
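
    For context, the Tanimoto benchmark mentioned in the record can be sketched in a few lines of Python for binary 2D fingerprints; the fingerprints below are illustrative, and this is the baseline measure, not the proposed SQB method.

        def tanimoto(fp_a, fp_b):
            """Tanimoto similarity between two binary fingerprints (0/1 sequences)."""
            a = sum(fp_a)
            b = sum(fp_b)
            c = sum(x & y for x, y in zip(fp_a, fp_b))
            return c / (a + b - c) if (a + b - c) else 0.0

        query     = [1, 0, 1, 1, 0, 0, 1, 0]
        candidate = [1, 0, 0, 1, 0, 1, 1, 0]
        print(tanimoto(query, candidate))  # 0.6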

  10. New mobile methods for dietary assessment: review of image-assisted and image-based dietary assessment methods.

    PubMed

    Boushey, C J; Spoden, M; Zhu, F M; Delp, E J; Kerr, D A

    2017-08-01

    For nutrition practitioners and researchers, assessing dietary intake of children and adults with a high level of accuracy continues to be a challenge. Developments in mobile technologies have created a role for images in the assessment of dietary intake. The objective of this review was to examine peer-reviewed published papers covering development, evaluation and/or validation of image-assisted or image-based dietary assessment methods from December 2013 to January 2016. Images taken with handheld devices or wearable cameras have been used to assist traditional dietary assessment methods for portion size estimations made by dietitians (image-assisted methods). Image-assisted approaches can supplement either dietary records or 24-h dietary recalls. In recent years, image-based approaches integrating application technology for mobile devices have been developed (image-based methods). Image-based approaches aim at capturing all eating occasions by images as the primary record of dietary intake, and therefore follow the methodology of food records. The present paper reviews several image-assisted and image-based methods, their benefits and challenges; followed by details on an image-based mobile food record. Mobile technology offers a wide range of feasible options for dietary assessment, which are easier to incorporate into daily routines. The presented studies illustrate that image-assisted methods can improve the accuracy of conventional dietary assessment methods by adding eating occasion detail via pictures captured by an individual (dynamic images). All of the studies reduced underreporting with the help of images compared with results with traditional assessment methods. Studies with larger sample sizes are needed to better delineate attributes with regards to age of user, degree of error and cost.

  11. Filmless versus film-based systems in radiographic examination costs: an activity-based costing method

    PubMed Central

    2011-01-01

    Background Since the shift from a radiographic film-based system to that of a filmless system, the change in radiographic examination costs and cost structure has been undetermined. The activity-based costing (ABC) method measures the cost and performance of activities, resources, and cost objects. The purpose of this study is to identify the cost structure of a radiographic examination comparing a filmless system to that of a film-based system using the ABC method. Methods We calculated the costs of radiographic examinations for both a filmless and a film-based system, and assessed the costs or cost components by simulating radiographic examinations in a health clinic. The cost objects of the radiographic examinations included lumbar (six views), knee (three views), wrist (two views), and other. Indirect costs were allocated to cost objects using the ABC method. Results The costs of a radiographic examination using a filmless system are as follows: lumbar 2,085 yen; knee 1,599 yen; wrist 1,165 yen; and other 1,641 yen. The costs for a film-based system are: lumbar 3,407 yen; knee 2,257 yen; wrist 1,602 yen; and other 2,521 yen. The primary activities were "calling patient," "explanation of scan," "take photographs," and "aftercare" for both filmless and film-based systems. The cost of these activities represented 36.0% of the total cost for a filmless system and 23.6% of a film-based system. Conclusions The costs of radiographic examinations using a filmless system and a film-based system were calculated using the ABC method. Our results provide clear evidence that the filmless system is more effective than the film-based system in providing greater value services directly to patients. PMID:21961846
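
    A minimal sketch of the ABC allocation step, assuming indirect activity costs are assigned to cost objects (examination types) in proportion to the activity time they consume; all figures are illustrative and are not taken from the study.

        # Indirect cost pools per activity (illustrative values, in yen)
        activity_costs = {"calling patient": 300, "explanation of scan": 450,
                          "take photographs": 900, "aftercare": 350}

        # Minutes of each activity consumed by each cost object (examination type)
        driver_minutes = {
            "lumbar (6 views)": {"calling patient": 2, "explanation of scan": 3,
                                 "take photographs": 12, "aftercare": 3},
            "wrist (2 views)":  {"calling patient": 2, "explanation of scan": 2,
                                 "take photographs": 4,  "aftercare": 2},
        }

        def allocate(activity_costs, driver_minutes):
            """Split each activity's cost across exams in proportion to minutes used."""
            totals = {act: sum(m[act] for m in driver_minutes.values()) for act in activity_costs}
            return {exam: sum(activity_costs[act] * mins[act] / totals[act]
                              for act in activity_costs)
                    for exam, mins in driver_minutes.items()}

        print(allocate(activity_costs, driver_minutes))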

  12. Evolutionary game theory using agent-based methods.

    PubMed

    Adami, Christoph; Schossau, Jory; Hintze, Arend

    2016-12-01

    Evolutionary game theory is a successful mathematical framework geared towards understanding the selective pressures that affect the evolution of the strategies of agents engaged in interactions with potential conflicts. While a mathematical treatment of the costs and benefits of decisions can predict the optimal strategy in simple settings, more realistic settings such as finite populations, non-vanishing mutation rates, stochastic decisions, communication between agents, and spatial interactions require agent-based methods where each agent is modeled as an individual, carries its own genes that determine its decisions, and where the evolutionary outcome can only be ascertained by evolving the population of agents forward in time. While highlighting standard mathematical results, we compare those to agent-based methods that can go beyond the limitations of equations and simulate the complexity of heterogeneous populations and an ever-changing set of interactors. We conclude that agent-based methods can predict evolutionary outcomes where purely mathematical treatments cannot tread (for example in the weak selection-strong mutation limit), but that mathematics is crucial to validate the computational simulations. Copyright © 2016 Elsevier B.V. All rights reserved.
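
    A minimal agent-based sketch in the spirit of the methods described: a well-mixed population playing the Prisoner's Dilemma with fitness-proportional reproduction and a non-vanishing mutation rate. Payoffs, population size and parameters are illustrative assumptions, not the authors' model.

        import random

        # Prisoner's Dilemma payoffs: (my strategy, opponent strategy) -> my payoff
        PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

        def step(pop, mu=0.01, rounds=20):
            """One generation: play, then reproduce proportionally to payoff, with mutation."""
            scores = [0.0] * len(pop)
            for _ in range(rounds):                      # random pairwise interactions
                i, j = random.sample(range(len(pop)), 2)
                scores[i] += PAYOFF[(pop[i], pop[j])]
                scores[j] += PAYOFF[(pop[j], pop[i])]
            weights = [s + 1e-9 for s in scores]         # avoid a zero total weight
            parents = random.choices(pop, weights=weights, k=len(pop))
            return [("C" if p == "D" else "D") if random.random() < mu else p
                    for p in parents]

        pop = ["C"] * 50 + ["D"] * 50
        for _ in range(200):
            pop = step(pop)
        print("fraction of cooperators:", pop.count("C") / len(pop))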

  13. Local coding based matching kernel method for image classification.

    PubMed

    Song, Yan; McLoughlin, Ian Vince; Dai, Li-Rong

    2014-01-01

    This paper mainly focuses on how to effectively and efficiently measure visual similarity for local feature based representation. Among existing methods, metrics based on Bag of Visual Word (BoV) techniques are efficient and conceptually simple, at the expense of effectiveness. By contrast, kernel based metrics are more effective, but at the cost of greater computational complexity and increased storage requirements. We show that a unified visual matching framework can be developed to encompass both BoV and kernel based metrics, in which local kernel plays an important role between feature pairs or between features and their reconstruction. Generally, local kernels are defined using Euclidean distance or its derivatives, based either explicitly or implicitly on an assumption of Gaussian noise. However, local features such as SIFT and HoG often follow a heavy-tailed distribution which tends to undermine the motivation behind Euclidean metrics. Motivated by recent advances in feature coding techniques, a novel efficient local coding based matching kernel (LCMK) method is proposed. This exploits the manifold structures in Hilbert space derived from local kernels. The proposed method combines advantages of both BoV and kernel based metrics, and achieves a linear computational complexity. This enables efficient and scalable visual matching to be performed on large scale image sets. To evaluate the effectiveness of the proposed LCMK method, we conduct extensive experiments with widely used benchmark datasets, including 15-Scenes, Caltech101/256, PASCAL VOC 2007 and 2011 datasets. Experimental results confirm the effectiveness of the relatively efficient LCMK method.

  14. Measuring geographic access to health care: raster and network-based methods

    PubMed Central

    2012-01-01

    Background Inequalities in geographic access to health care result from the configuration of facilities, population distribution, and the transportation infrastructure. In recent accessibility studies, the traditional distance measure (Euclidean) has been replaced with more plausible measures such as travel distance or time. Both network and raster-based methods are often utilized for estimating travel time in a Geographic Information System. Therefore, exploring the differences in the underlying data models and associated methods and their impact on geographic accessibility estimates is warranted. Methods We examine the assumptions present in population-based travel time models. Conceptual and practical differences between raster and network data models are reviewed, along with methodological implications for service area estimates. Our case study investigates Limited Access Areas defined by Michigan’s Certificate of Need (CON) Program. Geographic accessibility is calculated by identifying the number of people residing more than 30 minutes from an acute care hospital. Both network and raster-based methods are implemented and their results are compared. We also examine sensitivity to changes in travel speed settings and population assignment. Results In both methods, the areas identified as having limited accessibility were similar in their location, configuration, and shape. However, the number of people identified as having limited accessibility varied substantially between methods. Over all permutations, the raster-based method identified more area and people with limited accessibility. The raster-based method was more sensitive to travel speed settings, while the network-based method was more sensitive to the specific population assignment method employed in Michigan. Conclusions Differences between the underlying data models help to explain the variation in results between raster and network-based methods. Considering that the choice of data model/method may

  15. A constraint optimization based virtual network mapping method

    NASA Astrophysics Data System (ADS)

    Li, Xiaoling; Guo, Changguo; Wang, Huaimin; Li, Zhendong; Yang, Zhiwen

    2013-03-01

    The virtual network mapping problem, which maps different virtual networks onto a substrate network, is extremely challenging. This paper proposes a constraint-optimization-based mapping method for solving it. The method divides the problem into two phases, a node mapping phase and a link mapping phase, both of which are NP-hard. A node mapping algorithm and a link mapping algorithm are proposed for the two phases, respectively. The node mapping algorithm follows a greedy strategy and mainly considers two factors: the available resources supplied by the nodes and the distance between the nodes. The link mapping algorithm builds on the result of the node mapping phase and adopts a distributed constraint optimization approach, which guarantees an optimal mapping with the minimum network cost. Finally, simulation experiments are used to validate the method, and the results show that it performs very well.
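
    A minimal sketch of a greedy node-mapping heuristic of the kind described, scoring substrate nodes by available resources minus a distance penalty to nodes already chosen; the data structures and the scoring rule are illustrative simplifications, not the paper's algorithm.

        def greedy_node_mapping(virtual_demand, substrate_cpu, dist, alpha=1.0):
            """Map each virtual node to the substrate node with the best score:
            available CPU minus a penalty for distance to nodes chosen so far."""
            cpu = dict(substrate_cpu)          # remaining capacity per substrate node
            mapping, used = {}, []
            for vnode, demand in sorted(virtual_demand.items(),
                                        key=lambda kv: -kv[1]):   # hardest first
                candidates = [s for s in cpu if cpu[s] >= demand and s not in mapping.values()]
                if not candidates:
                    return None                # mapping request rejected
                best = max(candidates,
                           key=lambda s: cpu[s] - alpha * sum(dist[s][u] for u in used))
                mapping[vnode] = best
                cpu[best] -= demand
                used.append(best)
            return mapping

        substrate_cpu = {"A": 10, "B": 8, "C": 6}
        dist = {"A": {"A": 0, "B": 1, "C": 2},
                "B": {"A": 1, "B": 0, "C": 1},
                "C": {"A": 2, "B": 1, "C": 0}}
        print(greedy_node_mapping({"v1": 5, "v2": 4}, substrate_cpu, dist))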

  16. Effects of a Format-based Second Language Teaching Method in Kindergarten.

    ERIC Educational Resources Information Center

    Uilenburg, Noelle; Plooij, Frans X.; de Glopper, Kees; Damhuis, Resi

    2001-01-01

    Focuses on second language teaching with a format-based method. The differences between a format-based teaching method and a standard approach used as treatments in a quasi-experimental, non-equivalent control group are described in detail. Examines whether the effects of a format-based teaching method and a standard foreign language method differ…

  17. Fast Reduction Method in Dominance-Based Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yan; Zhou, Qinghua; Wen, Yongchuan

    2018-01-01

    In real-world applications, there are often data with continuous or preference-ordered values. Rough sets based on dominance relations can effectively deal with these kinds of data. Attribute reduction can be done in the framework of the dominance-relation based approach to better extract decision rules. However, the computational cost of the dominance classes greatly affects the efficiency of attribute reduction and rule extraction. This paper presents an efficient method of computing dominance classes, and further compares it with the traditional method as the numbers of attributes and samples increase. Experiments on UCI data sets show that the proposed algorithm clearly improves on the efficiency of the traditional method, especially for large-scale data.
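
    A minimal sketch of the quantity whose computation the paper accelerates, the dominance class of an object under preference-ordered attributes (here, larger values are assumed to be preferred); the data are illustrative.

        def dominates(x, y):
            """x dominates y if x is at least as good as y on every criterion."""
            return all(a >= b for a, b in zip(x, y))

        def dominating_set(objects, i):
            """Indices of objects that dominate object i (its upward dominance class)."""
            return [j for j, x in enumerate(objects) if dominates(x, objects[i])]

        # Rows are objects, columns are preference-ordered criteria (larger is better)
        data = [(3, 2, 5),
                (4, 2, 5),
                (1, 1, 2),
                (4, 3, 5)]
        for i in range(len(data)):
            print(i, dominating_set(data, i))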

  18. Labeling the good: alternative visions and organic branding in Sweden in the late twentieth century.

    PubMed

    Broberg, Oskar

    2010-01-01

    The past decade's rapid expansion of a global market for organic food has set powerful economic and political forces in motion. The most important dividing line is whether organic food production should be an alternative to or a niche within a capitalist mode of production. To explore this conflict the article analyzes the formation of a market for eco-labeled milk in Sweden. The analysis draws on three aspects: the strategy of agri-business, the role of eco-labeling, and the importance of inter-organizational dynamics. Based on archival studies, daily press, and interviews, three processes are emphasized: the formative years of the alternative movement in the 1970s, the founding of an independent eco-label (KRAV) in the 1980s, and a discursive shift from alternative visions to organic branding in the early 1990s following the entry of agri-business.

  19. Imaging the Shallow Crust in the Epicentral Area of the 1857 M7 Agri Valley Earthquake (Southern Italy) by Combined Traveltime and Full-Waveform Tomography

    NASA Astrophysics Data System (ADS)

    Improta, L.; Operto, S.; Piromallo, C.; Valoroso, L.

    2008-12-01

    The Agri Valley is a Quaternary extensional basin located in the Southern Apennines range. This basin was struck by a M7 earthquake in 1857. In spite of extensive morphotectonic surveys and hydrocarbon exploration, major unsolved questions remain about the upper crustal structure, the recent tectonic evolution and seismotectonics of the area. Most authors consider a SW-dipping normal-fault system bordering the basin to the East as the major seismogenic source. Alternatively, some authors ascribe the high seismogenic potential of the region to NE-dipping normal faults identified by morphotectonic surveys along the ridge bounding the basin to the West. These uncertainties mainly derive from the poor performance of commercial reflection profiling that suffers from an extreme structural complexity and unfavorable near-surface conditions. To overcome these drawbacks, ENI and Shell Italia carried out a non-conventional wide-aperture survey with densely spaced sources (60 m) and receivers (90 m). The 18-km-long wide-aperture profile crosses the basin, yielding a unique opportunity to get new insights into the crustal structure by using advanced imaging techniques. Here, we apply a two-step imaging procedure. We start determining multi-scale Vp images down to 2.5 km depth by using a non-linear traveltime tomographic technique able to cope with strongly heterogeneous media. Assessment of an accurate reference Vp model is indeed crucial for the subsequent application of a frequency-domain full-waveform inversion aimed at improving spatial resolution of the velocity images. Frequency components of the data are then iteratively inverted from low to high frequency values in order to progressively incorporate smaller wavelength components into the model. Inversion results accurately image the shallow crust, yielding valuable constraints for a better understanding of the recent basin evolution and of the surrounding normal-fault systems.

  20. Chapter 11. Community analysis-based methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Y.; Wu, C.H.; Andersen, G.L.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  1. Therapy Decision Support Based on Recommender System Methods

    PubMed Central

    Gräßer, Felix; Beckert, Stefanie; Küster, Denise; Schmitt, Jochen; Abraham, Susanne; Malberg, Hagen

    2017-01-01

    We present a system for data-driven therapy decision support based on techniques from the field of recommender systems. Two methods for therapy recommendation, namely, Collaborative Recommender and Demographic-based Recommender, are proposed. Both algorithms aim to predict the individual response to different therapy options using diverse patient data and recommend the therapy which is assumed to provide the best outcome for a specific patient and time, that is, consultation. The proposed methods are evaluated using a clinical database incorporating patients suffering from the autoimmune skin disease psoriasis. The Collaborative Recommender proves to generate both better outcome predictions and recommendation quality. However, due to sparsity in the data, this approach cannot provide recommendations for the entire database. In contrast, the Demographic-based Recommender performs worse on average but covers more consultations. Consequently, both methods profit from a combination into an overall recommender system. PMID:29065657
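
    A minimal sketch of a neighborhood-style collaborative prediction consistent with the idea described: the outcome of a therapy for a new patient is estimated as a similarity-weighted average of outcomes observed for comparable patients. The feature encoding, similarity measure and data are illustrative assumptions, not the authors' implementation.

        import numpy as np

        def predict_outcome(target_features, target_therapy, patients):
            """Similarity-weighted average of observed outcomes for one therapy.
            `patients` is a list of (features, therapy, outcome) tuples."""
            num, den = 0.0, 0.0
            for feats, therapy, outcome in patients:
                if therapy != target_therapy:
                    continue
                f, t = np.asarray(feats, float), np.asarray(target_features, float)
                sim = float(f @ t / (np.linalg.norm(f) * np.linalg.norm(t) + 1e-12))
                num += sim * outcome
                den += abs(sim)
            return num / den if den else None   # None: no comparable patients (sparsity)

        history = [([1, 0, 4], "therapy_A", 0.8),
                   ([1, 1, 3], "therapy_A", 0.6),
                   ([0, 1, 5], "therapy_B", 0.4)]
        print(predict_outcome([1, 0, 5], "therapy_A", history))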

  2. MR Imaging-based Semi-quantitative Methods for Knee Osteoarthritis

    PubMed Central

    JARRAYA, Mohamed; HAYASHI, Daichi; ROEMER, Frank Wolfgang; GUERMAZI, Ali

    2016-01-01

    Magnetic resonance imaging (MRI)-based semi-quantitative (SQ) methods applied to knee osteoarthritis (OA) have been introduced during the last decade and have fundamentally changed our understanding of knee OA pathology since then. Several epidemiological studies and clinical trials have used MRI-based SQ methods to evaluate different outcome measures. Interest in MRI-based SQ scoring systems has led to their continuous update and refinement. This article reviews the different SQ approaches for MRI-based whole-organ assessment of knee OA and also discusses practical aspects of whole-joint assessment. PMID:26632537

  3. The Latitudinal Analysis of Secondary School Students' Motivations towards Science Course

    ERIC Educational Resources Information Center

    Aydin, Suleyman; Keles, Pinar Ural

    2017-01-01

    The aim of this research was to compare the motivations of different categories of secondary school students for science lessons. The case study method was used latitudinally, and the research was carried out in schools in the center of Agri in the 2015-2016 academic year. The sample of the study was composed of a total of 649…

  4. The Efficiency of Computer-Aided Instruction and Creative Drama on Academic Achievement in Teaching of Integers to Seventh Grade Students

    ERIC Educational Resources Information Center

    Kaplan, Abdullah; Özturk, Mesut; Ertör, Eren

    2013-01-01

    This study aims to compare computer-aided instruction, creative drama and traditional teaching methods in teaching integers to seventh grade students. The study was conducted in a primary school with eighty-seven students (N=87) in a county of Agri, in the spring term of the 2011-2012 academic year. A non-equivalent control group quasi-experimental…

  5. Design method of ARM based embedded iris recognition system

    NASA Astrophysics Data System (ADS)

    Wang, Yuanbo; He, Yuqing; Hou, Yushi; Liu, Ting

    2008-03-01

    With the advantages of non-invasiveness, uniqueness, stability and a low false recognition rate, iris recognition has been successfully applied in many fields. Up to now, most iris recognition systems have been PC-based. However, a PC is not portable and consumes more power. In this paper, we propose an embedded iris recognition system based on ARM. Considering the requirements of iris image acquisition and the recognition algorithm, we analyzed the design of the iris image acquisition module, designed the ARM processing module and its peripherals, studied the Linux platform and the recognition algorithm on this platform, and finally realized the design of an ARM-based iris imaging and recognition system. Experimental results show that the ARM platform we used is fast enough to run the iris recognition algorithm, and that the data stream flows smoothly between the camera and the ARM chip on the embedded Linux system. This is an effective way of using ARM to realize a portable embedded iris recognition system.

  6. An image mosaic method based on corner

    NASA Astrophysics Data System (ADS)

    Jiang, Zetao; Nie, Heting

    2015-08-01

    In view of the shortcomings of traditional image mosaicking, this paper describes a new image mosaic algorithm based on Harris corners. Firstly, a Harris operator, combined with a low-pass smoothing filter constructed from spline functions and a circular window search, is applied to detect image corners; this gives better localisation performance and effectively avoids corner clustering. Secondly, correlation-based feature registration is used to find matching pairs, and false matches are removed using random sample consensus (RANSAC). Finally, a weighted trigonometric function combined with interpolation is used for image fusion. Experiments show that this method can effectively remove splicing ghosting and improve the accuracy of the image mosaic.
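
    A hedged sketch of a corner/feature-based mosaic pipeline using OpenCV, with two substitutions: ORB features stand in for the paper's spline-filtered Harris detector with correlation matching, and a simple paste replaces the weighted trigonometric fusion; the RANSAC-based removal of false matches is kept. File names are illustrative.

        import cv2
        import numpy as np

        # Illustrative file names; any overlapping image pair will do.
        img1 = cv2.imread("left.jpg")
        img2 = cv2.imread("right.jpg")

        orb = cv2.ORB_create(2000)
        k1, d1 = orb.detectAndCompute(cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY), None)
        k2, d2 = orb.detectAndCompute(cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY), None)

        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:200]

        src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # drop false matches

        # Warp the second image into the first image's frame and paste the first on top
        # (naive overwrite instead of the paper's weighted fusion).
        h1, w1 = img1.shape[:2]
        pano = cv2.warpPerspective(img2, H, (w1 * 2, h1))
        pano[0:h1, 0:w1] = img1
        cv2.imwrite("mosaic.jpg", pano)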

  7. An AIS-Based E-mail Classification Method

    NASA Astrophysics Data System (ADS)

    Qing, Jinjian; Mao, Ruilong; Bie, Rongfang; Gao, Xiao-Zhi

    This paper proposes a new e-mail classification method based on the Artificial Immune System (AIS), which is endowed with good diversity and self-adaptive ability through immune learning, immune memory, and immune recognition. In our method, the features of spam and non-spam extracted from the training sets are combined, and the number of false positives (non-spam messages that are incorrectly classified as spam) can be reduced. The experimental results demonstrate that this method is effective in reducing the false rate.

  8. Functional connectivity analysis of the neural bases of emotion regulation: A comparison of independent component method with density-based k-means clustering method.

    PubMed

    Zou, Ling; Guo, Qian; Xu, Yi; Yang, Biao; Jiao, Zhuqing; Xiang, Jianbo

    2016-04-29

    Functional magnetic resonance imaging (fMRI) is an important tool in neuroscience for assessing connectivity and interactions between distant areas of the brain. To find and characterize the coherent patterns of brain activity as a means of identifying brain systems engaged by the cognitive reappraisal of emotion task, both density-based k-means clustering and independent component analysis (ICA) methods can be applied to characterize the interactions between brain regions involved in cognitive reappraisal of emotion. Our results reveal that, compared with the ICA method, the density-based k-means clustering method provides higher clustering (aggregation) sensitivity. In addition, it is more sensitive to regions with relatively weak functional connections. The study therefore concludes that, during the reception of emotional stimuli, the most clearly activated areas are mainly distributed in the frontal lobe, the cingulum and near the hypothalamus. Furthermore, the density-based k-means clustering method provides a more reliable approach for follow-up studies of brain functional connectivity.

  9. Color image definition evaluation method based on deep learning method

    NASA Astrophysics Data System (ADS)

    Liu, Di; Li, YingChun

    2018-01-01

    In order to evaluate different blurring levels of color images and to improve image definition evaluation, this paper proposes a no-reference color image clarity evaluation method based on a deep learning framework and a BP neural network classification model. Firstly, VGG16 is used as the feature extractor to extract 4,096-dimensional features from the images; the extracted features and image labels are then used to train a BP neural network, which finally performs the color image definition evaluation. The method is evaluated using images from the CSIQ database, blurred at different levels, giving 4,000 images after processing. The 4,000 images are divided into three categories, each category representing a blur level. Of the high-dimensional feature samples, 300 out of 400 are used for training with the VGG16 features and the BP neural network, and the remaining 100 samples are used for testing. The experimental results show that the method takes full advantage of the learning and representation capability of deep learning. Unlike the major existing image clarity evaluation methods, which manually design and extract features, the method in this paper extracts image features automatically and achieves excellent image quality classification accuracy on the test data set, with an accuracy rate of 96%. Moreover, the predicted quality levels of the original color images are similar to the perception of the human visual system.
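
    A hedged sketch of the feature-extraction and classification pipeline described, using torchvision's pretrained VGG16 to produce 4,096-dimensional features and a small fully connected network standing in for the BP classifier; a random tensor stands in for a preprocessed CSIQ image, and the weights API shown assumes a recent torchvision release (older versions use pretrained=True).

        import torch
        import torch.nn as nn
        from torchvision import models

        # Pretrained VGG16; keep everything up to the first 4096-d fully connected layer.
        vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT)
        vgg.eval()
        feature_extractor = nn.Sequential(vgg.features, vgg.avgpool, nn.Flatten(),
                                          *list(vgg.classifier.children())[:2])  # Linear + ReLU, 4096-d output

        # A small fully connected network standing in for the BP classifier (3 blur levels).
        classifier = nn.Sequential(nn.Linear(4096, 256), nn.ReLU(), nn.Linear(256, 3))

        x = torch.randn(1, 3, 224, 224)          # stand-in for a preprocessed CSIQ image
        with torch.no_grad():
            feats = feature_extractor(x)         # shape: (1, 4096)
        logits = classifier(feats)
        print(feats.shape, logits.argmax(dim=1))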

  10. Methods for preparing colloidal nanocrystal-based thin films

    DOEpatents

    Kagan, Cherie R.; Fafarman, Aaron T.; Choi, Ji-Hyuk; Koh, Weon-kyu; Kim, David K.; Oh, Soong Ju; Lai, Yuming; Hong, Sung-Hoon; Saudari, Sangameshwar Rao; Murray, Christopher B.

    2016-05-10

    Methods of exchanging ligands to form colloidal nanocrystals (NCs) with chalcogenocyanate (xCN)-based ligands and apparatuses using the same are disclosed. The ligands may be exchanged by assembling NCs into a thin film and immersing the thin film in a solution containing xCN-based ligands. The ligands may also be exchanged by mixing a xCN-based solution with a dispersion of NCs, flocculating the mixture, centrifuging the mixture, discarding the supernatant, adding a solvent to the pellet, and dispersing the solvent and pellet to form dispersed NCs with exchanged xCN-ligands. The NCs with xCN-based ligands may be used to form thin film devices and/or other electronic, optoelectronic, and photonic devices. Devices comprising nanocrystal-based thin films and methods for forming such devices are also disclosed. These devices may be constructed by depositing NCs on to a substrate to form an NC thin film and then doping the thin film by evaporation and thermal diffusion.

  11. A LiDAR data-based camera self-calibration method

    NASA Astrophysics Data System (ADS)

    Xu, Lijun; Feng, Jing; Li, Xiaolu; Chen, Jianjun

    2018-07-01

    To find the intrinsic parameters of a camera, a LiDAR data-based camera self-calibration method is presented here. The parameters are estimated using particle swarm optimization (PSO), which searches for the optimal solution of a multivariate cost function. The main procedure of camera intrinsic parameter estimation has three parts: extraction and fine matching of interest points in the images, establishment of a cost function based on the Kruppa equations, and PSO optimization using LiDAR data as the initialization input. To improve the precision of the matching pairs, a new method combining the maximal information coefficient (MIC) and the maximum asymmetry score (MAS) was used to remove false matching pairs, based on the RANSAC algorithm. Highly precise matching pairs were used to calculate the fundamental matrix, so that the new cost function (deduced from the Kruppa equations in terms of the fundamental matrix) was more accurate. The cost function involving four intrinsic parameters was minimized by PSO for the optimal solution. To overcome the issue of the optimization being pushed to a local optimum, LiDAR data were used to determine the scope of the initialization, based on the solution to the P4P problem for the camera focal length. To verify the accuracy and robustness of the proposed method, simulations and experiments were carried out and compared with two typical methods. Simulation results indicated that the intrinsic parameters estimated by the proposed method had absolute errors of less than 1.0 pixel and relative errors smaller than 0.01%. Based on ground truth obtained from a meter ruler, the distance inversion accuracy in the experiments was better than 1.0 cm. Experimental and simulated results demonstrate that the proposed method is highly accurate and robust.
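
    A minimal, generic particle swarm optimizer of the kind used for the cost-function minimization described; the toy quadratic cost standing in for the Kruppa-equation objective, and all parameter values, are illustrative assumptions.

        import numpy as np

        def pso_minimize(cost, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
            """Minimal particle swarm optimizer for a vector-valued cost function."""
            rng = np.random.default_rng(seed)
            lo, hi = np.array(bounds, dtype=float).T
            x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
            v = np.zeros_like(x)
            pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
            g = pbest[pbest_val.argmin()].copy()
            for _ in range(iters):
                r1, r2 = rng.random(x.shape), rng.random(x.shape)
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = np.clip(x + v, lo, hi)
                vals = np.array([cost(p) for p in x])
                better = vals < pbest_val
                pbest[better], pbest_val[better] = x[better], vals[better]
                g = pbest[pbest_val.argmin()].copy()
            return g, pbest_val.min()

        # Toy stand-in for the Kruppa-equation cost over (fx, fy, cx, cy)
        cost = lambda p: np.sum((p - np.array([1200.0, 1200.0, 640.0, 360.0])) ** 2)
        print(pso_minimize(cost, [(500, 2000), (500, 2000), (0, 1280), (0, 720)])[0])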

  12. A New Intrusion Detection Method Based on Antibody Concentration

    NASA Astrophysics Data System (ADS)

    Zeng, Jie; Li, Tao; Li, Guiyang; Li, Haibo

    Antibody is one kind of protein that fights against harmful antigens in the human immune system. In modern medical examination, the health status of a human body can be diagnosed by detecting the intrusion intensity of a specific antigen and the concentration indicator of the corresponding antibody in the body's serum. In this paper, inspired by the principle of antigen-antibody reactions, we present a New Intrusion Detection Method Based on Antibody Concentration (NIDMBAC) to reduce the false alarm rate without affecting the detection rate. In our proposed method, the basic definitions of self, nonself, antigen and detector in the intrusion detection domain are given. Then, according to the antigen intrusion intensity, the change in antibody number is recorded from the process of clone proliferation for detectors, based on classified antigen recognition. Finally, building upon the above work, a probabilistic calculation method for intrusion alarm production is proposed, based on the correlation between the antigen intrusion intensity and the antibody concentration. Our theoretical analysis and experimental results show that the proposed method performs better than traditional methods.

  13. A Review on Human Activity Recognition Using Vision-Based Method.

    PubMed

    Zhang, Shugang; Wei, Zhiqiang; Nie, Jie; Huang, Lei; Wang, Shuang; Li, Zhen

    2017-01-01

    Human activity recognition (HAR) aims to recognize activities from a series of observations on the actions of subjects and the environmental conditions. The vision-based HAR research is the basis of many applications including video surveillance, health care, and human-computer interaction (HCI). This review highlights the advances of state-of-the-art activity recognition approaches, especially for the activity representation and classification methods. For the representation methods, we sort out a chronological research trajectory from global representations to local representations, and recent depth-based representations. For the classification methods, we conform to the categorization of template-based methods, discriminative models, and generative models and review several prevalent methods. Next, representative and available datasets are introduced. Aiming to provide an overview of those methods and a convenient way of comparing them, we classify existing literatures with a detailed taxonomy including representation and classification methods, as well as the datasets they used. Finally, we investigate the directions for future research.

  14. Adaptive target binarization method based on a dual-camera system

    NASA Astrophysics Data System (ADS)

    Lei, Jing; Zhang, Ping; Xu, Jiangtao; Gao, Zhiyuan; Gao, Jing

    2018-01-01

    An adaptive target binarization method based on a dual-camera system that contains two dynamic vision sensors was proposed. First, a preprocessing procedure of denoising is introduced to remove the noise events generated by the sensors. Then, the complete edge of the target is retrieved and represented by events based on an event mosaicking method. Third, the region of the target is confirmed by an event-to-event method. Finally, a postprocessing procedure of image open and close operations of morphology methods is adopted to remove the artifacts caused by event-to-event mismatching. The proposed binarization method has been extensively tested on numerous degraded images with nonuniform illumination, low contrast, noise, or light spots and successfully compared with other well-known binarization methods. The experimental results, which are based on visual and misclassification error criteria, show that the proposed method performs well and has better robustness on the binarization of degraded images.

  15. A Review on Human Activity Recognition Using Vision-Based Method

    PubMed Central

    Nie, Jie

    2017-01-01

    Human activity recognition (HAR) aims to recognize activities from a series of observations on the actions of subjects and the environmental conditions. The vision-based HAR research is the basis of many applications including video surveillance, health care, and human-computer interaction (HCI). This review highlights the advances of state-of-the-art activity recognition approaches, especially for the activity representation and classification methods. For the representation methods, we sort out a chronological research trajectory from global representations to local representations, and recent depth-based representations. For the classification methods, we conform to the categorization of template-based methods, discriminative models, and generative models and review several prevalent methods. Next, representative and available datasets are introduced. Aiming to provide an overview of those methods and a convenient way of comparing them, we classify existing literatures with a detailed taxonomy including representation and classification methods, as well as the datasets they used. Finally, we investigate the directions for future research. PMID:29065585

  16. Biogas slurry pricing method based on nutrient content

    NASA Astrophysics Data System (ADS)

    Zhang, Chang-ai; Guo, Honghai; Yang, Zhengtao; Xin, Shurong

    2017-11-01

    In order to promote the commercialization of biogas slurry, a method was put forward to value biogas slurry based on its nutrient content. First, the element contents of the biogas slurry were measured; second, each element was valued at its market price; finally, transport cost, application cost and market effects were taken into account to obtain the pricing method for biogas slurry. This method can be useful in practical production. Taking biogas slurry from cattle-manure feedstock and from corn-stalk feedstock as examples, their prices were 38.50 yuan RMB per ton and 28.80 yuan RMB per ton, respectively. This work should help in recognizing the value of biogas projects, keeping biogas projects running, and guiding the cyclic utilization of biomass resources in China.
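
    A minimal sketch of the pricing rule described: the nutrient value of a ton of slurry is the sum of element contents times market prices, discounted for market effect and reduced by transport and application costs. All figures and the exact form of the adjustment are illustrative assumptions.

        def slurry_price(nutrient_content, nutrient_price, transport_cost=5.0,
                         application_cost=3.0, market_factor=0.8):
            """Price per ton = market-discounted nutrient value minus transport
            and application costs (all figures per ton; values are illustrative)."""
            nutrient_value = sum(nutrient_content[n] * nutrient_price[n]
                                 for n in nutrient_content)
            return max(market_factor * nutrient_value - transport_cost - application_cost, 0.0)

        # kg of nutrient per ton of slurry (illustrative), and yuan per kg of nutrient
        content = {"N": 2.1, "P2O5": 0.9, "K2O": 3.0}
        price = {"N": 8.0, "P2O5": 9.0, "K2O": 6.0}
        print(round(slurry_price(content, price), 2), "yuan RMB per ton")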

  17. Method of coating an iron-based article

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magdefrau, Neal; Beals, James T.; Sun, Ellen Y.

    A method of coating an iron-based article includes a first heating step of heating a substrate that includes an iron-based material in the presence of an aluminum source material and halide diffusion activator. The heating is conducted in a substantially non-oxidizing environment, to cause the formation of an aluminum-rich layer in the iron-based material. In a second heating step, the substrate that has the aluminum-rich layer is heated in an oxidizing environment to oxidize the aluminum in the aluminum-rich layer.

  18. Comparison of Different Recruitment Methods for Sexual and Reproductive Health Research: Social Media-Based Versus Conventional Methods.

    PubMed

    Motoki, Yoko; Miyagi, Etsuko; Taguri, Masataka; Asai-Sato, Mikiko; Enomoto, Takayuki; Wark, John Dennis; Garland, Suzanne Marie

    2017-03-10

    Prior research about the sexual and reproductive health of young women has relied mostly on self-reported survey studies. Thus, participant recruitment using Web-based methods can improve sexual and reproductive health research about cervical cancer prevention. In our prior study, we reported that Facebook is a promising way to reach young women for sexual and reproductive health research. However, it remains unknown whether Web-based or other conventional recruitment methods (ie, face-to-face or flyer distribution) yield comparable survey responses from similar participants. We conducted a survey to determine whether there was a difference in the sexual and reproductive health survey responses of young Japanese women based on recruitment methods: social media-based and conventional methods. From July 2012 to March 2013 (9 months), we invited women of ages 16-35 years in Kanagawa, Japan, to complete a Web-based questionnaire. They were recruited through either a social media-based (social networking site, SNS, group) or by conventional methods (conventional group). All participants enrolled were required to fill out and submit their responses through a Web-based questionnaire about their sexual and reproductive health for cervical cancer prevention. Of the 243 participants, 52.3% (127/243) were recruited by SNS, whereas 47.7% (116/243) were recruited by conventional methods. We found no differences between recruitment methods in responses to behaviors and attitudes to sexual and reproductive health survey, although more participants from the conventional group (15%, 14/95) chose not to answer the age of first intercourse compared with those from the SNS group (5.2%, 6/116; P=.03). No differences were found between recruitment methods in the responses of young Japanese women to a Web-based sexual and reproductive health survey. ©Yoko Motoki, Etsuko Miyagi, Masataka Taguri, Mikiko Asai-Sato, Takayuki Enomoto, John Dennis Wark, Suzanne Marie Garland. Originally

  19. External Aiding Methods for IMU-Based Navigation

    DTIC Science & Technology

    2016-11-26

    Carlo simulation and particle filtering. This approach allows for the utilization of highly complex systems in a black box configuration with minimal...alternative method, which has the advantage of being less computationally demanding, is to use a Kalman filtering-based approach. The particular...Kalman filtering-based approach used here is known as linear covariance analysis. In linear covariance analysis, the nonlinear systems describing the

  20. Metamodel-based inverse method for parameter identification: elastic-plastic damage model

    NASA Astrophysics Data System (ADS)

    Huang, Changwu; El Hami, Abdelkhalak; Radi, Bouchaïb

    2017-04-01

    This article proposes a metamodel-based inverse method for material parameter identification and applies it to the identification of elastic-plastic damage model parameters. An elastic-plastic damage model is presented and implemented in numerical simulation. The metamodel-based inverse method is proposed in order to overcome the computational cost disadvantage of the conventional inverse method. In the metamodel-based inverse method, a Kriging metamodel is constructed from an experimental design in order to model the relationship between the material parameters and the objective function values of the inverse problem, and the optimization procedure is then executed on the metamodel. The application of the presented material model and the proposed parameter identification method to the standard A 2017-T4 tensile test shows that the elastic-plastic damage model adequately describes the material's mechanical behaviour and that the proposed metamodel-based inverse method not only improves the efficiency of parameter identification but also gives reliable results.
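
    A minimal sketch of the metamodel-based idea under stated assumptions: a Kriging (Gaussian process) surrogate is fitted to objective values sampled over the parameter space, and the cheap surrogate is then minimized instead of the expensive simulation. The stand-in objective, parameter ranges and libraries (scikit-learn, SciPy) are illustrative, not the authors' setup.

        import numpy as np
        from scipy.optimize import minimize
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        def expensive_objective(params):
            """Stand-in for the simulation-based misfit between model and test data."""
            yield_stress, damage_exp = params
            return (yield_stress - 250.0) ** 2 / 1e4 + (damage_exp - 1.5) ** 2

        # Design of experiments: sample the parameter space and evaluate the objective.
        rng = np.random.default_rng(0)
        X = rng.uniform([100.0, 0.5], [400.0, 3.0], size=(30, 2))
        y = np.array([expensive_objective(x) for x in X])

        # Kriging surrogate of the objective function.
        gp = GaussianProcessRegressor(ConstantKernel() * RBF(length_scale=[100.0, 1.0]),
                                      normalize_y=True).fit(X, y)

        # Optimize the cheap surrogate instead of the expensive simulation.
        res = minimize(lambda x: gp.predict(x.reshape(1, -1))[0],
                       x0=[200.0, 1.0], bounds=[(100.0, 400.0), (0.5, 3.0)])
        print("identified parameters:", res.x)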

  1. Filmless versus film-based systems in radiographic examination costs: an activity-based costing method.

    PubMed

    Muto, Hiroshi; Tani, Yuji; Suzuki, Shigemasa; Yokooka, Yuki; Abe, Tamotsu; Sase, Yuji; Terashita, Takayoshi; Ogasawara, Katsuhiko

    2011-09-30

    Since the shift from a radiographic film-based system to that of a filmless system, the change in radiographic examination costs and cost structure has been undetermined. The activity-based costing (ABC) method measures the cost and performance of activities, resources, and cost objects. The purpose of this study is to identify the cost structure of a radiographic examination comparing a filmless system to that of a film-based system using the ABC method. We calculated the costs of radiographic examinations for both a filmless and a film-based system, and assessed the costs or cost components by simulating radiographic examinations in a health clinic. The cost objects of the radiographic examinations included lumbar (six views), knee (three views), wrist (two views), and other. Indirect costs were allocated to cost objects using the ABC method. The costs of a radiographic examination using a filmless system are as follows: lumbar 2,085 yen; knee 1,599 yen; wrist 1,165 yen; and other 1,641 yen. The costs for a film-based system are: lumbar 3,407 yen; knee 2,257 yen; wrist 1,602 yen; and other 2,521 yen. The primary activities were "calling patient," "explanation of scan," "take photographs," and "aftercare" for both filmless and film-based systems. The cost of these activities represented 36.0% of the total cost for a filmless system and 23.6% of a film-based system. The costs of radiographic examinations using a filmless system and a film-based system were calculated using the ABC method. Our results provide clear evidence that the filmless system is more effective than the film-based system in providing greater value services directly to patients.

  2. A marker-based watershed method for X-ray image segmentation.

    PubMed

    Zhang, Xiaodong; Jia, Fucang; Luo, Suhuai; Liu, Guiying; Hu, Qingmao

    2014-03-01

    Digital X-ray images are the most frequent modality for both screening and diagnosis in hospitals. To facilitate subsequent analysis such as quantification and computer-aided diagnosis (CAD), it is desirable to exclude the image background. A marker-based watershed segmentation method was proposed to segment the background of X-ray images. The method consists of six modules: image preprocessing, gradient computation, marker extraction, watershed segmentation from markers, region merging and background extraction. One hundred clinical direct radiograph X-ray images were used to validate the method. Manual thresholding and a multiscale gradient based watershed method were implemented for comparison. The proposed method yielded a Dice coefficient of 0.964±0.069, which was better than that of manual thresholding (0.937±0.119) and that of the multiscale gradient based watershed method (0.942±0.098). Special measures were adopted to decrease the computational cost, including discarding the few pixels with the highest grayscale values via a percentile threshold, calculating the gradient magnitude through simple operations, decreasing the number of markers by appropriate thresholding, and merging regions based on simple grayscale statistics. As a result, the processing time was at most 6 s even for a 3072×3072 image on a Pentium 4 PC with a 2.4 GHz CPU (4 cores) and 2 GB RAM, which was more than twice as fast as the multiscale gradient based watershed method. The proposed method could be a potential tool for diagnosis and quantification of X-ray images. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
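
    A hedged sketch of a generic marker-based watershed segmentation with OpenCV, not the paper's six-module pipeline: coarse foreground and background markers are derived from an Otsu threshold and a distance transform, and the watershed grows regions from them. The file name and thresholds are illustrative.

        import cv2
        import numpy as np

        img = cv2.imread("xray.png")                       # illustrative file name
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

        # Coarse foreground/background markers from a global (Otsu) threshold.
        _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        sure_bg = cv2.dilate(binary, np.ones((5, 5), np.uint8), iterations=3)
        dist = cv2.distanceTransform(binary, cv2.DIST_L2, 5)
        _, sure_fg = cv2.threshold(dist, 0.5 * dist.max(), 255, 0)
        sure_fg = sure_fg.astype(np.uint8)
        unknown = cv2.subtract(sure_bg, sure_fg)

        # Label the markers and let the watershed grow regions from them.
        _, markers = cv2.connectedComponents(sure_fg)
        markers = markers + 1
        markers[unknown == 255] = 0
        markers = cv2.watershed(img, markers)

        mask = (markers > 1).astype(np.uint8) * 255        # everything except background/edges
        cv2.imwrite("foreground_mask.png", mask)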

  3. A multivariate quadrature based moment method for LES based modeling of supersonic combustion

    NASA Astrophysics Data System (ADS)

    Donde, Pratik; Koo, Heeseok; Raman, Venkat

    2012-07-01

    The transported probability density function (PDF) approach is a powerful technique for large eddy simulation (LES) based modeling of scramjet combustors. In this approach, a high-dimensional transport equation for the joint composition-enthalpy PDF needs to be solved. Quadrature based approaches provide deterministic Eulerian methods for solving the joint-PDF transport equation. In this work, it is first demonstrated that the numerical errors associated with LES require special care in the development of PDF solution algorithms. The direct quadrature method of moments (DQMOM) is one quadrature-based approach developed for supersonic combustion modeling. This approach is shown to generate inconsistent evolution of the scalar moments. Further, gradient-based source terms that appear in the DQMOM transport equations are severely underpredicted in LES leading to artificial mixing of fuel and oxidizer. To overcome these numerical issues, a semi-discrete quadrature method of moments (SeQMOM) is formulated. The performance of the new technique is compared with the DQMOM approach in canonical flow configurations as well as a three-dimensional supersonic cavity stabilized flame configuration. The SeQMOM approach is shown to predict subfilter statistics accurately compared to the DQMOM approach.

  4. A MUSIC-based method for SSVEP signal processing.

    PubMed

    Chen, Kun; Liu, Quan; Ai, Qingsong; Zhou, Zude; Xie, Sheng Quan; Meng, Wei

    2016-03-01

    The research on brain computer interfaces (BCIs) has become a hotspot in recent years because it offers benefit to disabled people to communicate with the outside world. Steady state visual evoked potential (SSVEP)-based BCIs are more widely used because of higher signal to noise ratio and greater information transfer rate compared with other BCI techniques. In this paper, a multiple signal classification based method was proposed for multi-dimensional SSVEP feature extraction. 2-second data epochs from four electrodes achieved excellent accuracy rates including idle state detection. In some asynchronous mode experiments, the recognition accuracy reached up to 100%. The experimental results showed that the proposed method attained good frequency resolution. In most situations, the recognition accuracy was higher than canonical correlation analysis, which is a typical method for multi-channel SSVEP signal processing. Also, a virtual keyboard was successfully controlled by different subjects in an unshielded environment, which proved the feasibility of the proposed method for multi-dimensional SSVEP signal processing in practical applications.

  5. Frame synchronization methods based on channel symbol measurements

    NASA Technical Reports Server (NTRS)

    Dolinar, S.; Cheung, K.-M.

    1989-01-01

    The current DSN frame synchronization procedure is based on monitoring the decoded bit stream for the appearance of a sync marker sequence that is transmitted once every data frame. The possibility of obtaining frame synchronization by processing the raw received channel symbols rather than the decoded bits is explored. Performance results are derived for three channel symbol sync methods, and these are compared with results for decoded bit sync methods reported elsewhere. It is shown that each class of methods has advantages or disadvantages under different assumptions on the frame length, the global acquisition strategy, and the desired measure of acquisition timeliness. It is shown that the sync statistics based on decoded bits are superior to the statistics based on channel symbols, if the desired operating region utilizes a probability of miss many orders of magnitude higher than the probability of false alarm. This operating point is applicable for very large frame lengths and minimal frame-to-frame verification strategy. On the other hand, the statistics based on channel symbols are superior if the desired operating point has a miss probability only a few orders of magnitude greater than the false alarm probability. This happens for small frames or when frame-to-frame verifications are required.

  6. A velocity-correction projection method based immersed boundary method for incompressible flows

    NASA Astrophysics Data System (ADS)

    Cai, Shanggui

    2014-11-01

    In the present work we propose a novel direct-forcing immersed boundary method based on the velocity-correction projection method of [J.L. Guermond, J. Shen, Velocity-correction projection methods for incompressible flows, SIAM J. Numer. Anal., 41 (1) (2003) 112]. The principal idea of the immersed boundary method is to correct the velocity in the vicinity of the immersed object by using an artificial force to mimic the presence of the physical boundaries. Therefore, the velocity-correction projection method is preferred to its pressure-correction counterpart in the present work. Since the velocity-correction projection method can be considered a dual of the pressure-correction method, the proposed method can also be interpreted as follows: first the pressure is predicted by treating the viscous term explicitly without consideration of the immersed boundary, and the solenoidal velocity is used to determine the volume force on the Lagrangian points; then the no-slip boundary condition is enforced by correcting the velocity with the implicit viscous term. To demonstrate the efficiency and accuracy of the proposed method, several numerical simulations are performed and compared with results in the literature. China Scholarship Council.

  7. Recommendation advertising method based on behavior retargeting

    NASA Astrophysics Data System (ADS)

    Zhao, Yao; YIN, Xin-Chun; CHEN, Zhi-Min

    2011-10-01

    Online advertising has become an important business in e-commerce. Ad recommendation algorithms are the most critical part of recommendation systems. We propose a recommendation advertising method based on behavior retargeting, which can avoid the loss of ad clicks due to objective reasons and can track changes in the user's interests over time. Experiments show that the new method achieves a significant improvement and can be further applied to online systems.

  8. Method for rapid base sequencing in DNA and RNA with two base labeling

    DOEpatents

    Jett, James H.; Keller, Richard A.; Martin, John C.; Posner, Richard G.; Marrone, Babetta L.; Hammond, Mark L.; Simpson, Daniel J.

    1995-01-01

    Method for rapid-base sequencing in DNA and RNA with two-base labeling and employing fluorescent detection of single molecules at two wavelengths. Bases modified to accept fluorescent labels are used to replicate a single DNA or RNA strand to be sequenced. The bases are then sequentially cleaved from the replicated strand, excited with a chosen spectrum of electromagnetic radiation, and the fluorescence from individual, tagged bases detected in the order of cleavage from the strand.

  9. A new image segmentation method based on multifractal detrended moving average analysis

    NASA Astrophysics Data System (ADS)

    Shi, Wen; Zou, Rui-biao; Wang, Fang; Su, Le

    2015-08-01

    In order to segment and delineate regions of interest in an image, we propose a novel algorithm based on multifractal detrended moving average analysis (MF-DMA). In this method, the generalized Hurst exponent h(q) is first calculated for every pixel and considered as the local feature of a surface. A multifractal detrended moving average spectrum (MF-DMS) D(h(q)) is then defined following the idea of the box-counting dimension method. Therefore, we call the new image segmentation method the MF-DMS-based algorithm. The performance of the MF-DMS-based method is tested in two image segmentation experiments on rapeseed leaf images of potassium deficiency and magnesium deficiency under three cases, namely backward (θ = 0), centered (θ = 0.5) and forward (θ = 1), with different q values. Comparison experiments are conducted between the MF-DMS method and two other multifractal segmentation methods, namely the popular MFS-based and the latest MF-DFS-based methods. The results show that our MF-DMS-based method is superior to the latter two. The best segmentation result for the rapeseed leaf images of potassium deficiency and magnesium deficiency comes from the same parameter combination of θ = 0.5 and D(h(-10)) when using the MF-DMS-based method. An interesting finding is that D(h(-10)) outperforms other parameter choices both for the MF-DMS-based method with the centered case and for the MF-DFS-based algorithm. By comparing the multifractal nature of nutrient-deficient and non-deficient areas determined by the segmentation results, an important finding is that the fluctuation of gray values in the nutrient-deficient area is much more severe than that in the non-deficient area.

  10. Determining the Views and Adequacy of the Preschool Teachers Related to Science Activities

    ERIC Educational Resources Information Center

    Akçay, Nilüfer Okur

    2016-01-01

    In this study, it is aimed to determine the views and adequacy of the preschool teachers related to science activities. The study is based on descriptive survey model. The sample of the study consists of 47 preschool teachers working in Agri city center. Personal information form, semi-structured interview form to determine the views of the…

  11. A geometrically based method for automated radiosurgery planning.

    PubMed

    Wagner, T H; Yi, T; Meeks, S L; Bova, F J; Brechner, B L; Chen, Y; Buatti, J M; Friedman, W A; Foote, K D; Bouchet, L G

    2000-12-01

    A geometrically based method of multiple isocenter linear accelerator radiosurgery treatment planning optimization was developed, based on a target's solid shape. Our method uses an edge detection process to determine the optimal sphere packing arrangement with which to cover the planning target. The sphere packing arrangement is converted into a radiosurgery treatment plan by substituting the isocenter locations and collimator sizes for the spheres. This method is demonstrated on a set of 5 irregularly shaped phantom targets, as well as a set of 10 clinical example cases ranging from simple to very complex in planning difficulty. Using a prototype implementation of the method and standard dosimetric radiosurgery treatment planning tools, feasible treatment plans were developed for each target. The treatment plans generated for the phantom targets showed excellent dose conformity and acceptable dose homogeneity within the target volume. The algorithm was able to generate a radiosurgery plan conforming to the Radiation Therapy Oncology Group (RTOG) guidelines on radiosurgery for every clinical and phantom target examined. This automated planning method can serve as a valuable tool to assist treatment planners in rapidly and consistently designing conformal multiple isocenter radiosurgery treatment plans.
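
    For illustration, a simplified greedy sphere-packing sketch over a binary target mask is given below; it uses a distance transform rather than the paper's edge-detection process, and the collimator radii, voxel size and coverage threshold are assumed values.

    ```python
    # Illustrative greedy sphere packing over a binary target mask (a simplification,
    # not the authors' edge-detection-based algorithm).
    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def pack_spheres(target_mask, collimator_radii_mm, voxel_mm=1.0, min_cover=0.95):
        """Greedily place spheres (isocenter, radius) until the target is covered."""
        remaining = target_mask.copy()
        total = target_mask.sum()
        spheres = []
        grid = np.indices(target_mask.shape)
        while remaining.sum() > (1.0 - min_cover) * total:
            # deepest point of the uncovered region = best next isocenter
            dist = distance_transform_edt(remaining) * voxel_mm
            center = np.unravel_index(np.argmax(dist), dist.shape)
            # largest available collimator that still fits inside the target
            fitting = [r for r in collimator_radii_mm if r <= dist[center]]
            radius = max(fitting) if fitting else min(collimator_radii_mm)
            spheres.append((center, radius))
            # mark voxels inside this sphere as covered
            d2 = sum((g - c) ** 2 for g, c in zip(grid, center)) * voxel_mm ** 2
            remaining[d2 <= radius ** 2] = False
        return spheres

    # Example: a box-shaped phantom target and three collimator radii (mm)
    mask = np.zeros((60, 60, 60), dtype=bool)
    mask[10:50, 15:45, 20:40] = True
    print(pack_spheres(mask, collimator_radii_mm=[4, 7, 10]))
    ```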

  12. Effects of inoculation with organic-phosphorus-mineralizing bacteria on soybean (Glycine max) growth and indigenous bacterial community diversity.

    PubMed

    Sun, Wei; Qian, Xun; Gu, Jie; Wang, Xiao-Juan; Li, Yang; Duan, Man-Li

    2017-05-01

    Three different organic-phosphorus-mineralizing bacteria (OPMB) strains were inoculated to soil planted with soybean (Glycine max), and their effects on soybean growth and indigenous bacterial community diversity were investigated. Inoculation with Pseudomonas fluorescens Z4-1 and Brevibacillus agri L7-1 increased organic phosphorus degradation by 22% and 30%, respectively, compared with the control at the mature stage. Strains P. fluorescens Z4-1 and B. agri L7-1 significantly improved the soil alkaline phosphatase activity, average well color development, and the soybean root activity. Terminal restriction fragment length polymorphism analysis demonstrated that P. fluorescens Z4-1 and B. agri L7-1 could persist in the soil at relative abundances of 2.0%-6.4% throughout soybean growth. Thus, P. fluorescens Z4-1 and B. agri L7-1 could potentially be used in organic-phosphorus-mineralizing biofertilizers. OPMB inoculation altered the genetic structure of the soil bacterial communities but had no apparent influence on the carbon source utilization profiles of the soil bacterial communities. Principal components analysis showed that the changes in the carbon source utilization profiles of bacterial community depended mainly on the plant growth stages rather than inoculation with OPMB. The results help to understand the evolution of the soil bacterial community after OPMB inoculation.

  13. A threshold selection method based on edge preserving

    NASA Astrophysics Data System (ADS)

    Lou, Liantang; Dan, Wei; Chen, Jiaqi

    2015-12-01

    A method of automatic threshold selection for image segmentation is presented. An optimal threshold is selected so that the edges of the image are well preserved during segmentation. The shortcoming of Otsu's method, which is based on gray-level histograms, is analyzed. The edge energy function of a bivariate continuous function is expressed as a line integral, and the edge energy function of an image is approximated by discretizing this integral. An optimal threshold selection method that maximizes the edge energy function is given. Several experimental results are also presented and compared with Otsu's method.
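
    A minimal sketch of the idea is given below; the edge-energy definition used here (mean gradient magnitude of the image along the boundary of the thresholded region) is an assumption standing in for the paper's discretized line integral.

    ```python
    # Sketch: pick the threshold whose binary boundary lies on the strongest edges.
    import numpy as np

    def edge_energy(image, t):
        mask = image >= t
        # boundary pixels of the thresholded region
        boundary = np.zeros_like(mask)
        boundary[:-1, :] |= mask[:-1, :] != mask[1:, :]
        boundary[:, :-1] |= mask[:, :-1] != mask[:, 1:]
        if boundary.sum() == 0:
            return 0.0
        gy, gx = np.gradient(image.astype(float))
        grad_mag = np.hypot(gx, gy)
        return grad_mag[boundary].mean()   # average edge strength along the cut

    def select_threshold(image):
        candidates = range(int(image.min()) + 1, int(image.max()))
        return max(candidates, key=lambda t: edge_energy(image, t))

    # Example on a synthetic two-region image with noise
    rng = np.random.default_rng(1)
    img = np.full((64, 64), 60.0)
    img[:, 32:] = 160.0
    img += rng.normal(0, 10, img.shape)
    print("selected threshold:", select_threshold(img))
    ```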

  14. Fluence-based and microdosimetric event-based methods for radiation protection in space

    NASA Technical Reports Server (NTRS)

    Curtis, Stanley B.; Meinhold, C. B. (Principal Investigator)

    2002-01-01

    The National Council on Radiation Protection and Measurements (NCRP) has recently published a report (Report #137) that discusses various aspects of the concepts used in radiation protection and the difficulties in measuring the radiation environment in spacecraft for the estimation of radiation risk to space travelers. Two novel dosimetric methodologies, fluence-based and microdosimetric event-based methods, are discussed and evaluated, along with the more conventional quality factor/LET method. It was concluded that, for the present, there is no compelling reason to switch to a new methodology, but that because of certain drawbacks in the conventional method currently in use, these alternative methodologies should be kept in mind. As new data become available and dosimetric techniques become more refined, the question should be revisited, and significant improvement might be realized in the future. In addition, concepts such as equivalent dose and organ dose equivalent are discussed, and various problems regarding the measurement and estimation of these quantities are presented.

  15. Method for rapid base sequencing in DNA and RNA with two base labeling

    DOEpatents

    Jett, J.H.; Keller, R.A.; Martin, J.C.; Posner, R.G.; Marrone, B.L.; Hammond, M.L.; Simpson, D.J.

    1995-04-11

    A method is described for rapid-base sequencing in DNA and RNA with two-base labeling and employing fluorescent detection of single molecules at two wavelengths. Bases modified to accept fluorescent labels are used to replicate a single DNA or RNA strand to be sequenced. The bases are then sequentially cleaved from the replicated strand, excited with a chosen spectrum of electromagnetic radiation, and the fluorescence from individual, tagged bases detected in the order of cleavage from the strand. 4 figures.

  16. Iron-based amorphous alloys and methods of synthesizing iron-based amorphous alloys

    DOEpatents

    Saw, Cheng Kiong; Bauer, William A.; Choi, Jor-Shan; Day, Dan; Farmer, Joseph C.

    2016-05-03

    A method according to one embodiment includes combining an amorphous iron-based alloy and at least one metal selected from a group consisting of molybdenum, chromium, tungsten, boron, gadolinium, nickel phosphorous, yttrium, and alloys thereof to form a mixture, wherein the at least one metal is present in the mixture from about 5 atomic percent (at %) to about 55 at %; and ball milling the mixture at least until an amorphous alloy of the iron-based alloy and the at least one metal is formed. Several amorphous iron-based metal alloys are also presented, including corrosion-resistant amorphous iron-based metal alloys and radiation-shielding amorphous iron-based metal alloys.

  17. Connecting clinical and actuarial prediction with rule-based methods.

    PubMed

    Fokkema, Marjolein; Smits, Niels; Kelderman, Henk; Penninx, Brenda W J H

    2015-06-01

    Meta-analyses comparing the accuracy of clinical versus actuarial prediction have shown actuarial methods to outperform clinical methods, on average. However, actuarial methods are still not widely used in clinical practice, and there has been a call for the development of actuarial prediction methods for clinical practice. We argue that rule-based methods may be more useful than the linear main effect models usually employed in prediction studies, from a data and decision analytic as well as a practical perspective. In addition, decision rules derived with rule-based methods can be represented as fast and frugal trees, which, unlike main effects models, can be used in a sequential fashion, reducing the number of cues that have to be evaluated before making a prediction. We illustrate the usability of rule-based methods by applying RuleFit, an algorithm for deriving decision rules for classification and regression problems, to a dataset on prediction of the course of depressive and anxiety disorders from Penninx et al. (2011). The RuleFit algorithm provided a model consisting of 2 simple decision rules, requiring evaluation of only 2 to 4 cues. Predictive accuracy of the 2-rule model was very similar to that of a logistic regression model incorporating 20 predictor variables, originally applied to the dataset. In addition, the 2-rule model required, on average, evaluation of only 3 cues. Therefore, the RuleFit algorithm appears to be a promising method for creating decision tools that are less time consuming and easier to apply in psychological practice, and with accuracy comparable to traditional actuarial methods. (c) 2015 APA, all rights reserved.

  18. A trust-based recommendation method using network diffusion processes

    NASA Astrophysics Data System (ADS)

    Chen, Ling-Jiao; Gao, Jian

    2018-09-01

    A variety of rating-based recommendation methods have been extensively studied, including the well-known collaborative filtering approaches and some network diffusion-based methods; however, social trust relations are not sufficiently considered when making recommendations. In this paper, we contribute to the literature by proposing a trust-based recommendation method, named CosRA+T, which integrates the information of trust relations into the resource-redistribution process. Specifically, a tunable parameter is used to scale the resources received by trusted users before they are redistributed back to the objects. Interestingly, we find an optimal scaling parameter for the proposed CosRA+T method that achieves its best recommendation accuracy, and the optimal value appears to be universal across several evaluation metrics and different datasets. Moreover, results of extensive experiments on two real-world rating datasets with trust relations, Epinions and FriendFeed, suggest that CosRA+T yields a remarkable improvement in overall accuracy, diversity and novelty. Our work takes a step towards designing better recommendation algorithms by employing multiple sources of social network information.
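
    The following toy sketch illustrates a mass-diffusion style resource redistribution with a trust-scaling step; it is inspired by the abstract rather than the authors' exact CosRA+T formulation, and the scaling rule, the parameter lam and the data are assumptions.

    ```python
    # Simplified trust-scaled mass diffusion on a user-object rating network.
    import numpy as np

    def trust_diffusion_scores(R, trusted_by_target, lam=0.5, target=0):
        """R: binary user-object rating matrix; returns object scores for `target`."""
        n_users, n_objects = R.shape
        k_user = R.sum(axis=1)              # user degrees
        k_obj = R.sum(axis=0)               # object degrees
        # step 1: each object collected by the target sends resource to its users
        resource_obj = R[target].astype(float)
        with np.errstate(divide="ignore", invalid="ignore"):
            resource_user = (R * np.where(k_obj > 0, resource_obj / k_obj, 0)).sum(axis=1)
        # step 2: scale the resource held by users the target trusts
        scale = np.ones(n_users)
        scale[list(trusted_by_target)] *= (1.0 + lam)
        resource_user *= scale
        # step 3: redistribute from users back to objects
        with np.errstate(divide="ignore", invalid="ignore"):
            share = np.where(k_user[:, None] > 0, R / k_user[:, None], 0)
        scores = (share * resource_user[:, None]).sum(axis=0)
        scores[R[target] > 0] = 0           # do not re-recommend collected objects
        return scores

    # Toy example: 4 users x 5 objects, user 0 trusts user 2
    R = np.array([[1, 1, 0, 0, 0],
                  [1, 0, 1, 0, 0],
                  [0, 1, 1, 1, 0],
                  [0, 0, 0, 1, 1]])
    print(trust_diffusion_scores(R, trusted_by_target={2}, lam=0.5, target=0))
    ```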

  19. Video Extrapolation Method Based on Time-Varying Energy Optimization and CIP.

    PubMed

    Sakaino, Hidetomo

    2016-09-01

    Video extrapolation/prediction methods are often used to synthesize new videos from images. For fluid-like images and dynamic textures as well as moving rigid objects, most state-of-the-art video extrapolation methods use non-physics-based models that learn orthogonal bases from a number of images, but at high computation cost. Unfortunately, data truncation can cause image degradation, i.e., blur, artifacts, and insufficient motion changes. To extrapolate videos that more strictly follow physical rules, this paper proposes a physics-based method that needs only a few images and is truncation-free. We utilize physics-based equations with image intensity and velocity: optical flow, Navier-Stokes, continuity, and advection equations. These allow us to use partial difference equations to deal with local image feature changes. Image degradation during extrapolation is minimized by updating model parameters with a novel time-varying energy balancer model that uses energy-based image features, i.e., texture, velocity, and edge. Moreover, the advection equation is discretized by a high-order constrained interpolation profile for lower quantization error than can be achieved by the previous finite difference method in long-term videos. Experiments show that the proposed energy-based video extrapolation method outperforms state-of-the-art video extrapolation methods in terms of image quality and computation cost.

  20. A method of extracting impervious surface based on rule algorithm

    NASA Astrophysics Data System (ADS)

    Peng, Shuangyun; Hong, Liang; Xu, Quanli

    2018-02-01

    The impervious surface has become an important index for evaluating urban environmental quality and measuring the level of urbanization, and remote sensing has become the main way to extract it. In this paper, a method for extracting impervious surface based on a rule algorithm is proposed. The main idea is to use rule-based criteria to separate impervious surface from the other three object types (water, soil and vegetation) using their characteristics and differences in the seven original bands, NDWI and NDVI. The procedure has three steps: 1) vegetation is extracted according to the principle that vegetation reflects more strongly in the near-infrared band than in the other bands; 2) water is extracted according to the characteristic that water has the highest NDWI and the lowest NDVI; 3) impervious surface is extracted based on the fact that it has a higher NDWI value and a lower NDVI value than soil. In order to test the accuracy of the rule algorithm, this paper applies the linear spectral mixture decomposition algorithm, the CART algorithm and the NDII index algorithm to extract impervious surface from six remote sensing images of the Dianchi Lake Basin from 1999 to 2014, and compares their accuracy with that of the rule algorithm using overall classification accuracy. The extraction method based on the rule algorithm is found to be clearly more accurate than the other three methods.
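
    A rough sketch of such band-ratio rules is given below; the thresholds, band names and the soil/impervious separation rule are illustrative assumptions rather than the paper's calibrated values.

    ```python
    # Hedged sketch of rule-based classification from NDVI/NDWI and raw bands.
    import numpy as np

    def classify(bands):
        """bands: dict of 2-D float arrays with keys 'green', 'red', 'nir', 'swir'."""
        green, red, nir, swir = bands["green"], bands["red"], bands["nir"], bands["swir"]
        ndvi = (nir - red) / (nir + red + 1e-6)
        ndwi = (green - nir) / (green + nir + 1e-6)

        label = np.zeros(red.shape, dtype=np.uint8)      # 0 = soil / other
        # rule 1: vegetation reflects most strongly in the NIR band
        vegetation = (nir > red) & (nir > green) & (nir > swir) & (ndvi > 0.3)
        # rule 2: water has the highest NDWI and the lowest NDVI
        water = ~vegetation & (ndwi > 0.2) & (ndvi < 0.0)
        # rule 3: impervious surface has higher NDWI and lower NDVI than soil
        impervious = ~vegetation & ~water & (ndwi > -0.1) & (ndvi < 0.2)
        label[vegetation], label[water], label[impervious] = 1, 2, 3
        return label   # 1 = vegetation, 2 = water, 3 = impervious

    # Toy example with random reflectances
    rng = np.random.default_rng(0)
    toy = {k: rng.uniform(0.0, 0.6, (4, 4)) for k in ("green", "red", "nir", "swir")}
    print(classify(toy))
    ```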

  1. Real-time biscuit tile image segmentation method based on edge detection.

    PubMed

    Matić, Tomislav; Aleksi, Ivan; Hocenski, Željko; Kraus, Dieter

    2018-05-01

    In this paper we propose a novel real-time Biscuit Tile Segmentation (BTS) method for images from a ceramic tile production line. The BTS method is based on signal change detection and contour tracing, with the main goal of separating tile pixels from the background in images captured on the production line. Usually, human operators visually inspect and classify produced ceramic tiles. Computer vision and image processing techniques can automate the visual inspection process if they fulfill real-time requirements, and an important step in this process is real-time segmentation of tile pixels. The BTS method is implemented for parallel execution on a GPU device to satisfy the real-time constraints of the tile production line. The BTS method outperforms 2D threshold-based methods, 1D edge detection methods and contour-based methods, and it is in use in the biscuit tile production line. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  2. Bearing diagnostics: A method based on differential geometry

    NASA Astrophysics Data System (ADS)

    Tian, Ye; Wang, Zili; Lu, Chen; Wang, Zhipeng

    2016-12-01

    The structures around bearings are complex and their working environment is variable. These conditions cause the collected vibration signals to exhibit nonlinear, non-stationary and chaotic characteristics, which make noise reduction, feature extraction, fault diagnosis, and health assessment significantly challenging. Thus, a set of differential geometry-based methods with advantages in nonlinear analysis is presented in this study. For noise reduction, the Local Projection method is modified by both selecting the neighborhood radius based on empirical mode decomposition and determining the noise subspace constrained by neighborhood distribution information. For feature extraction, Hessian locally linear embedding is introduced to acquire manifold features from the manifold topological structures, and singular values of eigenmatrices as well as several specific frequency amplitudes in spectrograms are subsequently extracted to reduce the complexity of the manifold features. For fault diagnosis, an information geometry-based support vector machine is applied to classify the fault states. For health assessment, the manifold distance is employed to represent the health information, and the Gaussian mixture model is utilized to calculate confidence values, which directly reflect the health status. Case studies on Lorenz signals and vibration datasets of bearings demonstrate the effectiveness of the proposed methods.

  3. Deviation-based spam-filtering method via stochastic approach

    NASA Astrophysics Data System (ADS)

    Lee, Daekyung; Lee, Mi Jin; Kim, Beom Jun

    2018-03-01

    In the presence of a huge number of possible purchase choices, ranks or ratings of items by others often play very important roles for a buyer making a final purchase decision. Perfectly objective rating is impossible to achieve, and we often use an average rating built on how previous buyers estimated the quality of the product. The problem with using a simple average rating is that it can easily be polluted by careless users whose evaluations of products cannot be trusted, and by malicious spammers who try to bias the rating result on purpose. In this letter we suggest how the trustworthiness of individual users can be systematically and quantitatively reflected to build a more reliable rating system. We compute a suitably defined reliability of each user based on the user's rating pattern over all products she evaluated. We call our proposed method the deviation-based ranking, since the statistical significance of each user's rating pattern with respect to the average rating pattern is the key ingredient. We find that our deviation-based ranking method outperforms existing methods in filtering out careless random evaluators as well as malicious spammers.
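
    A minimal sketch of deviation-based reliability weighting is shown below; the specific reliability formula (a logistic weight on each user's z-scored average deviation) is an assumption standing in for the paper's statistical-significance measure.

    ```python
    # Sketch: downweight users whose rating pattern deviates strongly from the average.
    import numpy as np

    def reliable_average(ratings):
        """ratings: 2-D array (users x items), np.nan where a user gave no rating."""
        item_mean = np.nanmean(ratings, axis=0)           # plain average per item
        dev = np.abs(ratings - item_mean)                 # each user's deviation pattern
        user_dev = np.nanmean(dev, axis=1)                # average deviation per user
        z = (user_dev - user_dev.mean()) / (user_dev.std() + 1e-9)
        reliability = 1.0 / (1.0 + np.exp(z))             # large deviation -> low weight
        w = np.where(np.isnan(ratings), 0.0, reliability[:, None])
        r = np.where(np.isnan(ratings), 0.0, ratings)
        return (w * r).sum(axis=0) / (w.sum(axis=0) + 1e-9)

    # Toy example: the last user is a spammer who rates everything 5
    R = np.array([[4.0, 2.0, 3.0, np.nan],
                  [4.5, 2.5, 3.5, 1.0],
                  [4.0, 2.0, np.nan, 1.5],
                  [5.0, 5.0, 5.0, 5.0]])
    print("simple average  :", np.nanmean(R, axis=0))
    print("weighted average:", reliable_average(R))
    ```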

  4. Error simulation of paired-comparison-based scaling methods

    NASA Astrophysics Data System (ADS)

    Cui, Chengwu

    2000-12-01

    Subjective image quality measurement usually resorts to psychophysical scaling. However, it is difficult to evaluate the inherent precision of these scaling methods, and without knowing the potential errors of the measurement, subsequent use of the data can be misleading. In this paper, the errors in scaled values derived from paired-comparison-based scaling methods are simulated by randomly introducing a proportion of choice errors that follow the binomial distribution. Simulation results are given for various combinations of the number of stimuli and the sampling size. The errors are presented in the form of the average standard deviation of the scaled values and can be fitted reasonably well with an empirical equation that can be used for scaling error estimation and measurement design. The simulation shows that paired-comparison-based scaling methods can have large errors in the derived scaled values when the sampling size and the number of stimuli are small. Examples are also given to show the potential errors in actually scaled values of color image prints as measured by the method of paired comparison.
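
    The following Monte Carlo sketch illustrates the kind of simulation described; Thurstone Case V scale recovery is assumed as the scaling method, and the true scale values, error proportion and sample sizes are illustrative.

    ```python
    # Monte Carlo sketch of scaling error for paired comparisons with choice errors.
    import numpy as np
    from scipy.stats import norm

    def simulate_scaling_error(true_scale, n_judgements=30, p_error=0.1,
                               n_runs=200, seed=0):
        rng = np.random.default_rng(seed)
        k = len(true_scale)
        sds = []
        for _ in range(n_runs):
            wins = np.zeros((k, k))
            for i in range(k):
                for j in range(k):
                    if i == j:
                        continue
                    # probability that stimulus i is preferred over j (Case V)
                    p_ij = norm.cdf(true_scale[i] - true_scale[j])
                    # careless/erroneous choices flip the preference with p_error
                    p_obs = (1 - p_error) * p_ij + p_error * (1 - p_ij)
                    wins[i, j] = rng.binomial(n_judgements, p_obs) / n_judgements
            # recover scale values from the z-transformed choice proportions
            z = norm.ppf(np.clip(wins, 0.01, 0.99))
            np.fill_diagonal(z, 0.0)
            est = z.mean(axis=1)
            est -= est.mean()
            sds.append(np.std(est - (true_scale - np.mean(true_scale))))
        return float(np.mean(sds))   # average standard deviation of the scale error

    print(simulate_scaling_error(np.array([0.0, 0.3, 0.8, 1.5, 2.2])))
    ```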

  5. OWL-based reasoning methods for validating archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: reference model and archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role in achieving semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This makes it possible to combine the two levels of the dual model-based architecture in one modeling framework which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, which are the two largest publicly available ones, have been analyzed with our validation method. For this purpose, we have implemented a software tool called Archeck. Our results show that around 1/5 of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals that different patterns of errors are found in both repositories. This result reinforces the need for serious efforts to improve archetype design processes. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. Alternative methods of flexible base compaction acceptance.

    DOT National Transportation Integrated Search

    2012-05-01

    In the Texas Department of Transportation, flexible base construction is governed by a series of stockpile : and field tests. A series of concerns with these existing methods, along with some premature failures in the : field, led to this project inv...

  7. Method for extruding pitch based foam

    DOEpatents

    Klett, James W.

    2002-01-01

    A method and apparatus for extruding pitch-based foam is disclosed. The method includes the steps of: forming a viscous pitch foam; passing the precursor through an extrusion tube; and subjecting the precursor in said extrusion tube to a temperature gradient which varies along the length of the extrusion tube to form an extruded carbon foam. The apparatus includes an extrusion tube having a passageway communicatively connected to a chamber in which the viscous pitch foam is formed, the foam passing through the extrusion tube, and a heating mechanism in thermal communication with the tube for heating the viscous pitch foam along the length of the tube in accordance with a predetermined temperature gradient.

  8. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  9. Do Examinees Understand Score Reports for Alternate Methods of Scoring Computer Based Tests?

    ERIC Educational Resources Information Center

    Whittaker, Tiffany A.; Williams, Natasha J.; Dodd, Barbara G.

    2011-01-01

    This study assessed the interpretability of scaled scores based on either number correct (NC) scoring for a paper-and-pencil test or one of two methods of scoring computer-based tests: an item pattern (IP) scoring method and a method based on equated NC scoring. The equated NC scoring method for computer-based tests was proposed as an alternative…

  10. A dynamic access control method based on QoS requirement

    NASA Astrophysics Data System (ADS)

    Li, Chunquan; Wang, Yanwei; Yang, Baoye; Hu, Chunyang

    2013-03-01

    A dynamic access control method is put forward to ensure the security of shared services in cloud manufacturing, according to the application characteristics of collaborative cloud manufacturing tasks. In this method, the role-based access control (RBAC) model is extended to reflect the characteristics of cloud manufacturing. On top of traditional static authorization, constraints derived from the QoS requirements of the task context are taken into account in access control. Fuzzy policy rules are established over the weighted interval values of permissions, and users' access rights to executable services are dynamically adjusted through fuzzy reasoning based on the QoS requirements of the task. The main elements of the model are described, and the fuzzy reasoning algorithm over QoS-based weighted interval values is studied. An effective method is thus provided for access control in cloud manufacturing.

  11. Reconstruction of metabolic pathways by combining probabilistic graphical model-based and knowledge-based methods

    PubMed Central

    2014-01-01

    Automatic reconstruction of metabolic pathways for an organism from genomics and transcriptomics data has been a challenging and important problem in bioinformatics. Traditionally, known reference pathways can be mapped into organism-specific ones based on genome annotation and protein homology. However, this simple knowledge-based mapping method might produce incomplete pathways and generally cannot predict unknown new relations and reactions. In contrast, ab initio metabolic network construction methods can predict novel reactions and interactions, but their accuracy tends to be low, leading to many false positives. Here we combine existing pathway knowledge and a new ab initio Bayesian probabilistic graphical model in a novel fashion to improve automatic reconstruction of metabolic networks. Specifically, we built a knowledge database containing known, individual gene/protein interactions and metabolic reactions extracted from existing reference pathways. Known reactions and interactions were then used as constraints for Bayesian network learning methods to predict metabolic pathways. Using individual reactions and interactions extracted from different pathways of many organisms to guide pathway construction is new and improves both the coverage and accuracy of metabolic pathway construction. We applied this probabilistic knowledge-based approach to construct metabolic networks from yeast gene expression data and compared its results with 62 known metabolic networks in the KEGG database. The experiment showed that the method improved the coverage of metabolic network construction over the traditional reference pathway mapping method and was more accurate than pure ab initio methods. PMID:25374614

  12. Comparison of two DSC-based methods to predict drug-polymer solubility.

    PubMed

    Rask, Malte Bille; Knopp, Matthias Manne; Olesen, Niels Erik; Holm, René; Rades, Thomas

    2018-04-05

    The aim of the present study was to compare two DSC-based methods to predict drug-polymer solubility (melting point depression method and recrystallization method) and propose a guideline for selecting the most suitable method based on physicochemical properties of both the drug and the polymer. Using the two methods, the solubilities of celecoxib, indomethacin, carbamazepine, and ritonavir in polyvinylpyrrolidone, hydroxypropyl methylcellulose, and Soluplus® were determined at elevated temperatures and extrapolated to room temperature using the Flory-Huggins model. For the melting point depression method, it was observed that a well-defined drug melting point was required in order to predict drug-polymer solubility, since the method is based on the depression of the melting point as a function of polymer content. In contrast to previous findings, it was possible to measure melting point depression up to 20 °C below the glass transition temperature (Tg) of the polymer for some systems. Nevertheless, in general it was possible to obtain solubility measurements at lower temperatures using polymers with a low Tg. Finally, for the recrystallization method it was found that the experimental composition dependence of the Tg must be differentiable for compositions ranging from 50 to 90% drug (w/w) so that one Tg corresponds to only one composition. Based on these findings, a guideline for selecting the most suitable thermal method to predict drug-polymer solubility based on the physicochemical properties of the drug and polymer is suggested in the form of a decision tree. Copyright © 2018 Elsevier B.V. All rights reserved.
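
    For reference, one common form of the Flory-Huggins melting-point-depression relation used with such DSC data is sketched below (notation assumed; the paper's exact formulation may differ):

    ```latex
    % Hedged sketch: T_m^{mix}/T_m^{pure} are the drug melting points with and
    % without polymer, \Delta H_{fus} the heat of fusion of the drug, \phi the
    % volume fractions, m the polymer/drug molar-volume ratio, and \chi the
    % interaction parameter fitted from the measured depression.
    \[
      \frac{1}{T_m^{\mathrm{mix}}} - \frac{1}{T_m^{\mathrm{pure}}}
      = -\frac{R}{\Delta H_{\mathrm{fus}}}
        \left[ \ln \phi_{\mathrm{drug}}
             + \left(1 - \tfrac{1}{m}\right) \phi_{\mathrm{polymer}}
             + \chi\, \phi_{\mathrm{polymer}}^{2} \right]
    \]
    ```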

  13. A rule-based named-entity recognition method for knowledge extraction of evidence-based dietary recommendations

    PubMed Central

    2017-01-01

    Evidence-based dietary information represented as unstructured text is crucial information that needs to be accessed in order to help dietitians follow the new knowledge that arrives daily in newly published scientific reports. Different named-entity recognition (NER) methods have been introduced previously to extract useful information from the biomedical literature. They are focused on, for example, extracting gene mentions, protein mentions, relationships between genes and proteins, chemical concepts, and relationships between drugs and diseases. In this paper, we present a novel NER method, called drNER, for knowledge extraction of evidence-based dietary information. To the best of our knowledge this is the first attempt at extracting dietary concepts. DrNER is a rule-based NER that consists of two phases. The first involves the detection and determination of entity mentions, and the second involves the selection and extraction of the entities. We evaluate the method by using text corpora from heterogeneous sources, including text from several scientifically validated web sites and text from scientific publications. The evaluation showed that drNER gives good results and can be used for knowledge extraction of evidence-based dietary recommendations. PMID:28644863

  14. Evaluating team-based, lecture-based, and hybrid learning methods for neurology clerkship in China: a method-comparison study

    PubMed Central

    2014-01-01

    Background Neurology is complex, abstract, and difficult for students to learn. However, a good learning method for neurology clerkship training is required to help students quickly develop strong clinical thinking as well as problem-solving skills. Both the traditional lecture-based learning (LBL) and the relatively new team-based learning (TBL) methods have inherent strengths and weaknesses when applied to neurology clerkship education. However, the strengths of each method may complement the weaknesses of the other. Combining TBL with LBL may produce better learning outcomes than TBL or LBL alone. We propose a hybrid method (TBL + LBL) and designed an experiment to compare the learning outcomes with those of pure LBL and pure TBL. Methods One hundred twenty-seven fourth-year medical students attended a two-week neurology clerkship program organized by the Department of Neurology, Sun Yat-Sen Memorial Hospital. All of the students were from Grade 2007, Department of Clinical Medicine, Zhongshan School of Medicine, Sun Yat-Sen University. These students were assigned to one of three groups randomly: Group A (TBL + LBL, with 41 students), Group B (LBL, with 43 students), and Group C (TBL, with 43 students). The learning outcomes were evaluated by a questionnaire and two tests covering basic knowledge of neurology and clinical practice. Results The practice test scores of Group A were similar to those of Group B, but significantly higher than those of Group C. The theoretical test scores and the total scores of Group A were significantly higher than those of Groups B and C. In addition, 100% of the students in Group A were satisfied with the combination of TBL + LBL. Conclusions Our results support our proposal that the combination of TBL + LBL is acceptable to students and produces better learning outcomes than either method alone in neurology clerkships. In addition, the proposed hybrid method may also be suited for other medical clerkships that

  15. Hyperspectral image compressing using wavelet-based method

    NASA Astrophysics Data System (ADS)

    Yu, Hui; Zhang, Zhi-jie; Lei, Bo; Wang, Chen-sheng

    2017-10-01

    Hyperspectral imaging sensors can acquire images in hundreds of continuous narrow spectral bands, so each object present in the image can be identified from its spectral response. However, this kind of imaging produces a huge amount of data, which requires transmission, processing, and storage resources for both airborne and spaceborne imaging. Due to the high volume of hyperspectral image data, the exploration of compression strategies has received a lot of attention in recent years. Compression of hyperspectral data cubes is an effective solution to these problems. Lossless compression of hyperspectral data usually results in a low compression ratio, which may not meet the available resources; on the other hand, lossy compression may give the desired ratio, but with a significant degradation effect on the object identification performance of the hyperspectral data. Moreover, most hyperspectral data compression techniques exploit the similarities in the spectral dimension, which requires band reordering or regrouping to make use of the spectral redundancy. In this paper, we explore the spectral cross correlation between different bands and propose an adaptive band selection method to obtain the spectral bands which contain most of the information in the acquired hyperspectral data cube. The proposed method mainly consists of three steps: first, the algorithm decomposes the original hyperspectral imagery into a series of subspaces based on the correlation matrix of the hyperspectral images between different bands; then a wavelet-based algorithm is applied to each subspace; finally, PCA is applied to the wavelet coefficients to produce the chosen number of components. The performance of the proposed method was tested by using the ISODATA classification method.

  16. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.

  17. A content-based image retrieval method for optical colonoscopy images based on image recognition techniques

    NASA Astrophysics Data System (ADS)

    Nosato, Hirokazu; Sakanashi, Hidenori; Takahashi, Eiichi; Murakawa, Masahiro

    2015-03-01

    This paper proposes a content-based image retrieval method for optical colonoscopy images that can find images similar to ones being diagnosed. Optical colonoscopy is a method of direct observation for colons and rectums to diagnose bowel diseases. It is the most common procedure for screening, surveillance and treatment. However, diagnostic accuracy for intractable inflammatory bowel diseases, such as ulcerative colitis (UC), is highly dependent on the experience and knowledge of the medical doctor, because there is considerable variety in the appearances of colonic mucosa within inflammations with UC. In order to solve this issue, this paper proposes a content-based image retrieval method based on image recognition techniques. The proposed retrieval method can find similar images from a database of images diagnosed as UC, and can potentially furnish the medical records associated with the retrieved images to assist the UC diagnosis. Within the proposed method, color histogram features and higher order local auto-correlation (HLAC) features are adopted to represent the color information and geometrical information of optical colonoscopy images, respectively. Moreover, considering various characteristics of UC colonoscopy images, such as vascular patterns and the roughness of the colonic mucosa, we also propose an image enhancement method to highlight the appearances of colonic mucosa in UC. In an experiment using 161 UC images from 32 patients, we demonstrate that our method improves the accuracy of retrieving similar UC images.
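
    A toy sketch of color-histogram-based retrieval with histogram intersection is given below; the HLAC geometric features and the mucosa-enhancement step described above are omitted, and the images are random stand-ins.

    ```python
    # Sketch: rank database images by color-histogram similarity to a query image.
    import numpy as np

    def color_histogram(image, bins=8):
        """image: H x W x 3 uint8 array; returns a normalized joint RGB histogram."""
        hist, _ = np.histogramdd(image.reshape(-1, 3),
                                 bins=(bins, bins, bins),
                                 range=((0, 256), (0, 256), (0, 256)))
        return hist.ravel() / hist.sum()

    def retrieve(query_img, database_imgs, top_k=3):
        q = color_histogram(query_img)
        # histogram intersection: higher means more similar
        scores = [np.minimum(q, color_histogram(img)).sum() for img in database_imgs]
        return np.argsort(scores)[::-1][:top_k]

    # Example with random stand-in images
    rng = np.random.default_rng(0)
    db = [rng.integers(0, 256, (64, 64, 3), dtype=np.uint8) for _ in range(10)]
    print(retrieve(db[0], db))
    ```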

  18. Particle-based and meshless methods with Aboria

    NASA Astrophysics Data System (ADS)

    Robinson, Martin; Bruna, Maria

    Aboria is a powerful and flexible C++ library for the implementation of particle-based numerical methods. The particles in such methods can represent actual particles (e.g. Molecular Dynamics) or abstract particles used to discretise a continuous function over a domain (e.g. Radial Basis Functions). Aboria provides a particle container, compatible with the Standard Template Library, spatial search data structures, and a Domain Specific Language to specify non-linear operators on the particle set. This paper gives an overview of Aboria's design, an example of use, and a performance benchmark.

  19. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods for qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it might not be useful to regard evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  20. Fast and robust standard-deviation-based method for bulk motion compensation in phase-based functional OCT.

    PubMed

    Wei, Xiang; Camino, Acner; Pi, Shaohua; Cepurna, William; Huang, David; Morrison, John C; Jia, Yali

    2018-05-01

    Phase-based optical coherence tomography (OCT), such as OCT angiography (OCTA) and Doppler OCT, is sensitive to the confounding phase shift introduced by subject bulk motion. Traditional bulk motion compensation methods are limited by their accuracy and computing cost-effectiveness. In this Letter, to the best of our knowledge, we present a novel bulk motion compensation method for phase-based functional OCT. Bulk motion associated phase shift can be directly derived by solving its equation using a standard deviation of phase-based OCTA and Doppler OCT flow signals. This method was evaluated on rodent retinal images acquired by a prototype visible light OCT and human retinal images acquired by a commercial system. The image quality and computational speed were significantly improved, compared to two conventional phase compensation methods.

  1. Guided filter-based fusion method for multiexposure images

    NASA Astrophysics Data System (ADS)

    Hou, Xinglin; Luo, Haibo; Qi, Feng; Zhou, Peipei

    2016-11-01

    It is challenging to capture a high-dynamic range (HDR) scene using a low-dynamic range camera. A weighted-sum-based image fusion (IF) algorithm is proposed so as to express an HDR scene with a single high-quality image. This method mainly includes three parts. First, two image features, i.e., gradient and well-exposedness, are measured to estimate the initial weight maps. Second, the initial weight maps are refined by a guided filter, in which the source image is considered as the guidance image; this process reduces the noise in the initial weight maps and preserves more texture consistent with the original images. Finally, the fused image is constructed by a weighted sum of the source images in the spatial domain. The main contributions of this method are the estimation of the initial weight maps and the appropriate use of guided-filter-based weight map refinement, which provides accurate weight maps for IF. Compared to traditional IF methods, this algorithm avoids image segmentation, combination, and camera response curve calibration. Furthermore, experimental results demonstrate the superiority of the proposed method in both subjective and objective evaluations.
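
    A compact grayscale sketch of the pipeline is given below; the inline box-filter guided filter, the window size, eps and the exposure weighting are illustrative choices, not the paper's exact settings.

    ```python
    # Sketch: exposure fusion with guided-filter-refined weight maps.
    import numpy as np

    def box(img, r):
        """Mean filter with window (2r+1)^2 via integral images, edge-padded."""
        pad = np.pad(img, r, mode="edge")
        c = pad.cumsum(0).cumsum(1)
        c = np.pad(c, ((1, 0), (1, 0)))
        s = (c[2*r+1:, 2*r+1:] - c[:-2*r-1, 2*r+1:]
             - c[2*r+1:, :-2*r-1] + c[:-2*r-1, :-2*r-1])
        return s / (2*r + 1) ** 2

    def guided_filter(I, p, r=8, eps=1e-3):
        mI, mp = box(I, r), box(p, r)
        cov, var = box(I * p, r) - mI * mp, box(I * I, r) - mI * mI
        a = cov / (var + eps)
        b = mp - a * mI
        return box(a, r) * I + box(b, r)

    def fuse(images):
        """images: list of grayscale exposures in [0, 1]."""
        weights = []
        for img in images:
            gy, gx = np.gradient(img)
            contrast = np.hypot(gx, gy)
            exposedness = np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2))
            w = contrast * exposedness + 1e-12
            weights.append(guided_filter(img, w))     # refine weights, guided by source
        W = np.clip(np.stack(weights), 1e-12, None)
        W /= W.sum(axis=0)
        return (W * np.stack(images)).sum(axis=0)     # weighted sum in spatial domain

    # Toy example: two synthetic exposures of the same scene
    base = np.linspace(0, 1, 128)[None, :].repeat(128, axis=0)
    print(fuse([np.clip(base * 1.6, 0, 1), np.clip(base * 0.6, 0, 1)]).shape)
    ```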

  2. Surrogate Based Uni/Multi-Objective Optimization and Distribution Estimation Methods

    NASA Astrophysics Data System (ADS)

    Gong, W.; Duan, Q.; Huo, X.

    2017-12-01

    Parameter calibration has been demonstrated as an effective way to improve the performance of dynamic models, such as hydrological models, land surface models, and weather and climate models. Traditional optimization algorithms usually require a huge number of model evaluations, making dynamic model calibration very difficult, or even computationally prohibitive. With the help of a series of recently developed adaptive surrogate-modelling based optimization methods (the uni-objective optimization method ASMO, the multi-objective optimization method MO-ASMO, and the probability distribution estimation method ASMO-PODE), the number of model evaluations can be significantly reduced to several hundred, making it possible to calibrate very expensive dynamic models, such as regional high-resolution land surface models, weather forecast models such as WRF, and intermediate complexity earth system models such as LOVECLIM. This presentation provides a brief introduction to the common framework of the adaptive surrogate-based optimization algorithms ASMO, MO-ASMO and ASMO-PODE, a case study of Common Land Model (CoLM) calibration in the Heihe river basin in Northwest China, and an outlook on potential applications of surrogate-based optimization methods.

  3. Method of plasma etching Ga-based compound semiconductors

    DOEpatents

    Qiu, Weibin; Goddard, Lynford L.

    2012-12-25

    A method of plasma etching Ga-based compound semiconductors includes providing a process chamber and a source electrode adjacent to the process chamber. The process chamber contains a sample comprising a Ga-based compound semiconductor. The sample is in contact with a platen which is electrically connected to a first power supply, and the source electrode is electrically connected to a second power supply. The method includes flowing SiCl₄ gas into the chamber, flowing Ar gas into the chamber, and flowing H₂ gas into the chamber. RF power is supplied independently to the source electrode and the platen. A plasma is generated based on the gases in the process chamber, and regions of a surface of the sample adjacent to one or more masked portions of the surface are etched to create a substantially smooth etched surface including features having substantially vertical walls beneath the masked portions.

  4. A Coarse-Alignment Method Based on the Optimal-REQUEST Algorithm

    PubMed Central

    Zhu, Yongyun

    2018-01-01

    In this paper, we proposed a coarse-alignment method for strapdown inertial navigation systems based on attitude determination. The observation vectors, which can be obtained by inertial sensors, usually contain various types of noise, which affects the convergence rate and the accuracy of the coarse alignment. Given this drawback, we studied an attitude-determination method named optimal-REQUEST, which is an optimal method for attitude determination that is based on observation vectors. Compared to the traditional attitude-determination method, the filtering gain of the proposed method is tuned autonomously; thus, the convergence rate of the attitude determination is faster than in the traditional method. Within the proposed method, we developed an iterative method for determining the attitude quaternion. We carried out simulation and turntable tests, which we used to validate the proposed method’s performance. The experiment’s results showed that the convergence rate of the proposed optimal-REQUEST algorithm is faster and that the coarse alignment’s stability is higher. In summary, the proposed method has a high applicability to practical systems. PMID:29337895

  5. PDEs on moving surfaces via the closest point method and a modified grid based particle method

    NASA Astrophysics Data System (ADS)

    Petras, A.; Ruuth, S. J.

    2016-05-01

    Partial differential equations (PDEs) on surfaces arise in a wide range of applications. The closest point method (Ruuth and Merriman (2008) [20]) is a recent embedding method that has been used to solve a variety of PDEs on smooth surfaces using a closest point representation of the surface and standard Cartesian grid methods in the embedding space. The original closest point method (CPM) was designed for problems posed on static surfaces, however the solution of PDEs on moving surfaces is of considerable interest as well. Here we propose solving PDEs on moving surfaces using a combination of the CPM and a modification of the grid based particle method (Leung and Zhao (2009) [12]). The grid based particle method (GBPM) represents and tracks surfaces using meshless particles and an Eulerian reference grid. Our modification of the GBPM introduces a reconstruction step into the original method to ensure that all the grid points within a computational tube surrounding the surface are active. We present a number of examples to illustrate the numerical convergence properties of our combined method. Experiments for advection-diffusion equations that are strongly coupled to the velocity of the surface are also presented.

  6. An efficient temporal database design method based on EER

    NASA Astrophysics Data System (ADS)

    Liu, Zhi; Huang, Jiping; Miao, Hua

    2007-12-01

    Many existing methods of modeling temporal information are based on the logical model, which makes relational schema optimization more difficult and more complicated. In this paper, based on the conventional EER model, the author attempts to analyse and abstract temporal information in the conceptual modelling phase according to the concrete requirements for historical information. A temporal data model named BTEER is then presented. BTEER retains all the design ideas and methods of EER, which gives it good upward compatibility, and at the same time effectively supports the modelling of both valid time and transaction time. In addition, BTEER can be transformed to EER easily and automatically. Practice shows that this method can model temporal information well.

  7. A method for radiological characterization based on fluence conversion coefficients

    NASA Astrophysics Data System (ADS)

    Froeschl, Robert

    2018-06-01

    Radiological characterization of components in accelerator environments is often required to ensure adequate radiation protection during maintenance, transport and handling, as well as for the selection of the proper disposal pathway. The relevant quantities are typically weighted sums of specific activities with radionuclide-specific weighting coefficients. Traditional methods based on Monte Carlo simulations either score radionuclide creation events, or score the particle fluences in the regions of interest and weight them off-line with radionuclide production cross sections. The presented method bases the radiological characterization on a set of fluence conversion coefficients. For a given irradiation profile and cool-down time, radionuclide production cross sections, material composition and radionuclide-specific weighting coefficients, a set of particle-type- and energy-dependent fluence conversion coefficients is computed. These fluence conversion coefficients can then be used in a Monte Carlo transport code to perform on-line weighting and directly obtain the desired radiological characterization, either by using built-in multiplier features, as in the PHITS code, or by writing a dedicated user routine, as for the FLUKA code. The presented method has been validated against the standard event-based methods directly available in Monte Carlo transport codes.
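
    As a rough illustration of the idea (notation assumed, not taken from the paper), the weighted activity sum and its fluence-folded form can be written as:

    ```latex
    % Hedged sketch: D is the desired characterization quantity, w_i the
    % radionuclide-specific weighting coefficients, a_i the specific activities,
    % \Phi_p(E) the fluence of particle type p, and c_p(E) the precomputed fluence
    % conversion coefficient that folds production cross sections, material
    % composition, irradiation profile and cool-down time into one response.
    \[
      D \;=\; \sum_i w_i\, a_i
        \;=\; \sum_p \int \Phi_p(E)\, c_p(E)\, \mathrm{d}E .
    \]
    ```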

  8. On evaluating the robustness of spatial-proximity-based regionalization methods

    NASA Astrophysics Data System (ADS)

    Lebecherel, Laure; Andréassian, Vazken; Perrin, Charles

    2016-08-01

    In the absence of streamflow data to calibrate a hydrological model, its parameters must be inferred by a regionalization method. In this technical note, we discuss a specific class of regionalization methods, those based on spatial proximity, which transfer hydrological information (typically calibrated parameter sets) from neighboring gauged stations to the target ungauged station. The efficiency of any spatial-proximity-based regionalization method will depend on the density of the available streamgauging network, and the purpose of this note is to discuss how to assess the robustness of the regionalization method (i.e., its resilience to an increasingly sparse hydrometric network). We compare two options: (i) the random hydrometrical reduction (HRand) method, which consists in sub-sampling the existing gauging network around the target ungauged station, and (ii) the hydrometrical desert (HDes) method, which consists in ignoring the closest gauged stations. Our tests suggest that the HDes method should be preferred, because it provides a more realistic view of regionalization performance.
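
    A toy sketch of the two network-degradation tests is given below; the distances, donor count and desert radius are made-up values.

    ```python
    # Sketch: selecting donor catchments under HRand and HDes degradation tests.
    import numpy as np

    def donors_hrand(dists, n_keep, rng):
        """HRand: randomly sub-sample the gauged network, then take nearest donors."""
        kept = rng.choice(len(dists), size=n_keep, replace=False)
        return kept[np.argsort(dists[kept])][:3]          # 3 nearest kept stations

    def donors_hdes(dists, desert_radius_km):
        """HDes: ignore every gauged station closer than the 'desert' radius."""
        allowed = np.where(dists >= desert_radius_km)[0]
        return allowed[np.argsort(dists[allowed])][:3]

    # Distances (km) from the target ungauged catchment to 10 gauged stations
    rng = np.random.default_rng(0)
    d = rng.uniform(5, 200, 10)
    print("HRand donors:", donors_hrand(d, n_keep=6, rng=rng))
    print("HDes donors :", donors_hdes(d, desert_radius_km=50))
    ```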

  9. Blind compressed sensing image reconstruction based on alternating direction method

    NASA Astrophysics Data System (ADS)

    Liu, Qinan; Guo, Shuxu

    2018-04-01

    In order to solve the problem of reconstructing an original image when the sparse basis is unknown, this paper proposes an image reconstruction method based on a blind compressed sensing model. In this model, the image signal is regarded as the product of a sparse coefficient matrix and a dictionary matrix. Based on existing blind compressed sensing theory, the optimal solution is obtained by an alternating minimization method. The proposed method addresses the difficulty of representing the sparse basis in compressed sensing, suppresses noise and improves the quality of the reconstructed image. It ensures that the blind compressed sensing problem has a unique solution and can recover the original image signal from a complex environment with stronger self-adaptability. The experimental results show that the image reconstruction algorithm based on blind compressed sensing proposed in this paper can recover high-quality image signals under under-sampling conditions.

  10. An Improved Image Matching Method Based on Surf Algorithm

    NASA Astrophysics Data System (ADS)

    Chen, S. J.; Zheng, S. Z.; Xu, Z. G.; Guo, C. C.; Ma, X. L.

    2018-04-01

    Many state-of-the-art image matching methods based on feature matching have been widely studied in the remote sensing field. These feature matching methods achieve high operating efficiency but suffer from low accuracy and robustness. This paper proposes an improved image matching method based on the SURF algorithm. The proposed method introduces a color invariant transformation, information entropy theory and a series of constraint conditions to increase feature point detection and matching accuracy. First, the color invariant transformation is applied to the two images to be matched, so that more color information is available during the matching process, and information entropy theory is used to obtain the maximum information from the two images. Then the SURF algorithm is applied to detect and describe points in the images. Finally, constraint conditions, including Delaunay triangulation construction, a similarity function and a projective invariant, are employed to eliminate mismatches and improve matching precision. The proposed method has been validated on remote sensing images, and the results show its high precision and robustness.

  11. Lagrangian based methods for coherent structure detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allshouse, Michael R., E-mail: mallshouse@chaos.utexas.edu; Peacock, Thomas, E-mail: tomp@mit.edu

    There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.

  12. Lagrangian based methods for coherent structure detection

    NASA Astrophysics Data System (ADS)

    Allshouse, Michael R.; Peacock, Thomas

    2015-09-01

    There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.

  13. Image Mosaic Method Based on SIFT Features of Line Segment

    PubMed Central

    Zhu, Jun; Ren, Mingwu

    2014-01-01

    This paper proposes a novel image mosaic method based on the SIFT (Scale Invariant Feature Transform) feature of line segments, aiming to handle scaling, rotation, changes in lighting conditions, and so on between two images in the panoramic image mosaic process. This method first uses the Harris corner detection operator to detect key points. Second, it constructs directed line segments, describes them with the SIFT feature, and matches those directed segments to acquire a rough point matching. Finally, the RANSAC method is used to eliminate wrong pairs in order to accomplish the image mosaic. The results of experiments based on four pairs of images show that our method has strong robustness to resolution, lighting, rotation, and scaling. PMID:24511326

  14. Base oils and methods for making the same

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohler, Nicholas; Fisher, Karl; Tirmizi, Shakeel

    Provided herein are isoparaffins derived from hydrocarbon terpenes such as myrcene, ocimene and farnesene, and methods for making the same. In certain variations, the isoparaffins have utility as lubricant base stocks.

  15. Gradient-based interpolation method for division-of-focal-plane polarimeters.

    PubMed

    Gao, Shengkui; Gruev, Viktor

    2013-01-14

    Recent advancements in nanotechnology and nanofabrication have allowed for the emergence of division-of-focal-plane (DoFP) polarization imaging sensors. These sensors capture polarization properties of the optical field at every imaging frame. However, DoFP polarization imaging sensors suffer from large registration error as well as reduced spatial-resolution output. These drawbacks can be mitigated by applying proper image interpolation methods in the reconstruction of the polarization results. In this paper, we present a new gradient-based interpolation method for DoFP polarimeters. The performance of the proposed interpolation method is evaluated against several previously published interpolation methods by using visual examples and root mean square error (RMSE) comparison. We found that the proposed gradient-based interpolation method can achieve better visual results while maintaining a lower RMSE than other interpolation methods under various dynamic ranges of a scene, from dim to bright conditions.
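
    For context, a DoFP sensor interleaves 0°, 45°, 90°, and 135° micro-polarizers in 2x2 superpixels, and any interpolation scheme ultimately feeds the standard Stokes-parameter computation. The sketch below is a minimal illustration, not the paper's gradient-based scheme: it assumes a particular superpixel layout and uses plain bilinear upsampling as a stand-in for the proposed interpolation.

        import numpy as np
        from scipy.ndimage import zoom

        def stokes_from_channels(I0, I45, I90, I135):
            """Compute Stokes parameters, DoLP and AoP from the four polarizer channels."""
            S0 = 0.5 * (I0 + I45 + I90 + I135)
            S1 = I0 - I90
            S2 = I45 - I135
            dolp = np.sqrt(S1**2 + S2**2) / np.maximum(S0, 1e-9)
            aop = 0.5 * np.arctan2(S2, S1)
            return S0, S1, S2, dolp, aop

        # raw: mosaicked DoFP frame; the 2x2 superpixel layout below is an assumption.
        raw = np.random.rand(480, 640)
        I0, I45 = raw[0::2, 0::2], raw[0::2, 1::2]
        I135, I90 = raw[1::2, 0::2], raw[1::2, 1::2]

        # Placeholder bilinear upsampling; the paper's gradient-based scheme would go here.
        channels = [zoom(c, 2, order=1) for c in (I0, I45, I90, I135)]
        S0, S1, S2, dolp, aop = stokes_from_channels(*channels)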

  16. Connectivity-based, all-hexahedral mesh generation method and apparatus

    DOEpatents

    Tautges, Timothy James; Mitchell, Scott A.; Blacker, Ted D.; Murdoch, Peter

    1998-01-01

    The present invention is a computer-based method and apparatus for constructing all-hexahedral finite element meshes for finite element analysis. The present invention begins with a three-dimensional geometry and an all-quadrilateral surface mesh, then constructs hexahedral element connectivity from the outer boundary inward, and then resolves invalid connectivity. The result of the present invention is a complete representation of hex mesh connectivity only; actual mesh node locations are determined later. The basic method of the present invention comprises the step of forming hexahedral elements by making crossings of entities referred to as "whisker chords." This step, combined with a seaming operation in space, is shown to be sufficient for meshing simple block problems. Entities that appear when meshing more complex geometries, namely blind chords, merged sheets, and self-intersecting chords, are described. A method for detecting invalid connectivity in space, based on repeated edges, is also described, along with its application to various cases of invalid connectivity introduced and resolved by the method.

  17. Global positioning method based on polarized light compass system

    NASA Astrophysics Data System (ADS)

    Liu, Jun; Yang, Jiangtao; Wang, Yubo; Tang, Jun; Shen, Chong

    2018-05-01

    This paper presents a global positioning method based on a polarized light compass system. A main limitation of polarization positioning is the environment, such as weak or locally destroyed polarization conditions; the solution given in this paper is polarization image de-noising and segmentation, for which a pulse coupled neural network is employed to enhance positioning performance. The prominent advantages of the present positioning technique are as follows: (i) compared to existing positioning methods based on polarized light, better sun tracking accuracy can be achieved, and (ii) the robustness and accuracy of positioning under weak and locally destroyed polarization environments, such as cloud cover or building shielding, are improved significantly. Finally, field experiments are presented to demonstrate the effectiveness and applicability of the proposed global positioning technique. The experiments show that the proposed method outperforms the conventional polarization positioning method, with real-time longitude and latitude accuracies of up to 0.0461° and 0.0911°, respectively.

  18. Development of a practical image-based scatter correction method for brain perfusion SPECT: comparison with the TEW method.

    PubMed

    Shidahara, Miho; Watabe, Hiroshi; Kim, Kyeong Min; Kato, Takashi; Kawatsu, Shoji; Kato, Rikio; Yoshimura, Kumiko; Iida, Hidehiro; Ito, Kengo

    2005-10-01

    An image-based scatter correction (IBSC) method was developed to convert scatter-uncorrected into scatter-corrected SPECT images. The purpose of this study was to validate this method by means of phantom simulations and human studies with 99mTc-labeled tracers, based on comparison with the conventional triple energy window (TEW) method. The IBSC method corrects scatter on the reconstructed image I_AC^μb with Chang's attenuation correction factor. The scatter component image is estimated by convolving I_AC^μb with a scatter function followed by multiplication with an image-based scatter fraction function. The IBSC method was evaluated with Monte Carlo simulations and 99mTc-ethyl cysteinate dimer SPECT human brain perfusion studies obtained from five volunteers. The image counts and contrast of the scatter-corrected images obtained by the IBSC and TEW methods were compared. Using data obtained from the simulations, the image counts and contrast of the scatter-corrected images obtained by the IBSC and TEW methods were found to be nearly identical for both gray and white matter. In human brain images, no significant differences in image contrast were observed between the IBSC and TEW methods. The IBSC method is a simple scatter correction technique feasible for use in clinical routine.
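
    A minimal sketch of the convolution-subtraction idea described above follows. It is not the authors' implementation: the Gaussian scatter kernel width and the spatially uniform scatter fraction are assumptions, whereas the paper uses a measured scatter function and an image-based scatter fraction function.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def ibsc_correct(img_ac, scatter_sigma=8.0, scatter_fraction=0.3):
            """Sketch of an image-based scatter correction.

            img_ac           : attenuation-corrected SPECT slice (2D array)
            scatter_sigma    : width of the assumed Gaussian scatter kernel (pixels)
            scatter_fraction : assumed uniform scatter fraction; the published method
                               uses an image-based scatter fraction function instead
            """
            scatter = scatter_fraction * gaussian_filter(img_ac, scatter_sigma)
            return np.clip(img_ac - scatter, 0.0, None)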

  19. Connotations of pixel-based scale effect in remote sensing and the modified fractal-based analysis method

    NASA Astrophysics Data System (ADS)

    Feng, Guixiang; Ming, Dongping; Wang, Min; Yang, Jianyu

    2017-06-01

    Scale problems are a major source of concern in the field of remote sensing. Since remote sensing is a complex technological system, there is a lack of sufficient understanding of the connotation of scale and scale effect in remote sensing. Thus, this paper first introduces the connotations of pixel-based scale and summarizes the general understanding of the pixel-based scale effect. Pixel-based scale effect analysis is essential for choosing appropriate remote sensing data and proper processing parameters. Fractal dimension is a useful measure for analyzing the pixel-based scale. However, in traditional fractal dimension calculation the impact of spatial resolution is not considered, so the change of the scale effect with spatial resolution cannot be clearly reflected. Therefore, this paper proposes to use spatial resolution as the modified scale parameter of two fractal methods to further analyze the pixel-based scale effect. To verify the results of the two modified methods (MFBM, the Modified Windowed Fractal Brownian Motion Based on the Surface Area, and MDBM, the Modified Windowed Double Blanket Method), the existing scale effect analysis method (the information entropy method) is used for evaluation. Six sub-regions of building areas and farmland areas were cut out from QuickBird images as the experimental data. The results of the experiment show that both the fractal dimension and the information entropy present the same trend with decreasing spatial resolution, and some inflection points appear at the same feature scales. Further analysis shows that these feature scales (corresponding to the inflection points) are related to the actual sizes of the geo-objects, which results in fewer mixed pixels in the image, and these inflection points are significantly indicative of the observed features. Therefore, the experimental results indicate that the modified fractal methods are effective in reflecting the pixel-based scale effect existing in remote sensing

  20. A rule-based automatic sleep staging method.

    PubMed

    Liang, Sheng-Fu; Kuo, Chin-En; Hu, Yu-Han; Cheng, Yu-Shian

    2012-03-30

    In this paper, a rule-based automatic sleep staging method was proposed. Twelve features including temporal and spectral analyses of the EEG, EOG, and EMG signals were utilized. Normalization was applied to each feature to eliminate individual differences. A hierarchical decision tree with fourteen rules was constructed for sleep stage classification. Finally, a smoothing process considering the temporal contextual information was applied to ensure continuity. The overall agreement and kappa coefficient of the proposed method, applied to all-night polysomnography (PSG) recordings of seventeen healthy subjects and compared with manual scoring by the R&K rules, reached 86.68% and 0.79, respectively. This method can be integrated with portable PSG systems for at-home sleep evaluation in the near future. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. [Galaxy/quasar classification based on nearest neighbor method].

    PubMed

    Li, Xiang-Ru; Lu, Yu; Zhou, Jian-Ming; Wang, Yong-Jun

    2011-09-01

    With the wide application of high-quality CCDs in celestial spectrum imaging and the implementation of many large sky survey programs (e.g., the Sloan Digital Sky Survey (SDSS), the Two-degree-Field Galaxy Redshift Survey (2dF), the Spectroscopic Survey Telescope (SST), the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) program, and the Large Synoptic Survey Telescope (LSST) program), celestial observational data are pouring in like torrential rain. Therefore, to utilize them effectively and fully, research on automated processing methods for celestial data is imperative. In the present work, we investigated how to recognize galaxies and quasars from spectra based on the nearest neighbor method. Galaxies and quasars are extragalactic objects; they are far from Earth, and their spectra are usually contaminated by various kinds of noise. Therefore, recognizing these two types of spectra is a typical problem in automatic spectra classification. Furthermore, the nearest neighbor method is one of the most typical, classic, and mature algorithms in pattern recognition and data mining, and is often used as a benchmark when developing novel algorithms. Regarding applicability in practice, it is shown that the recognition ratio of the nearest neighbor (NN) method is comparable to the best results reported in the literature based on more complicated methods, and the superiority of NN is that this method does not need to be trained, which is useful for incremental learning and parallel computation in mass spectral data processing. In conclusion, the results of this work are helpful for studying galaxy and quasar spectra classification.
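
    As a generic illustration of a nearest neighbor classifier applied to spectra (not the specific pipeline of the paper), the following sketch trains a 1-NN model with scikit-learn; the synthetic spectra, labels, and dimensions are placeholders.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import train_test_split

        # X: one row per spectrum (flux values on a common wavelength grid),
        # y: 0 = galaxy, 1 = quasar.  Synthetic stand-ins for real survey data.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 500))
        y = (X[:, :10].sum(axis=1) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        # Nearest neighbour needs no training beyond storing the reference spectra,
        # which is why it suits incremental and parallel processing of survey data.
        clf = KNeighborsClassifier(n_neighbors=1).fit(X_tr, y_tr)
        print("recognition ratio:", clf.score(X_te, y_te))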

  2. Index cost estimate based BIM method - Computational example for sports fields

    NASA Astrophysics Data System (ADS)

    Zima, Krzysztof

    2017-07-01

    The paper presents an example of cost estimation in the early phase of a project. A fragment of a relational database containing solutions, descriptions, geometry of construction objects, and unit costs of sports facilities is shown. Calculations with the Index Cost Estimate Based BIM method, using Case-Based Reasoning, are presented as well. The article presents local and global similarity measurement and an example of a BIM-based quantity takeoff process. The outcome of the cost calculations based on the CBR method is presented as the final result.
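
    A minimal sketch of the local/global similarity measurement mentioned above is shown below. The attributes, weights, value ranges, and unit costs are hypothetical, and a weighted sum of attribute-wise local similarities is only one common CBR formulation, not necessarily the exact one used in the paper.

        def local_similarity(a, b, value_range):
            """Local similarity of one numeric attribute, scaled to [0, 1]."""
            return max(0.0, 1.0 - abs(a - b) / value_range)

        def global_similarity(case, query, weights, ranges):
            """Weighted aggregation of local similarities (a common CBR formulation)."""
            total = sum(weights.values())
            return sum(
                weights[k] * local_similarity(case[k], query[k], ranges[k])
                for k in weights
            ) / total

        # Hypothetical sports-field cases: attributes and unit costs are illustrative only.
        cases = [
            {"area_m2": 7000, "seats": 500,  "cost_per_m2": 310.0},
            {"area_m2": 9500, "seats": 1500, "cost_per_m2": 365.0},
        ]
        query = {"area_m2": 8000, "seats": 800}
        weights = {"area_m2": 0.7, "seats": 0.3}
        ranges = {"area_m2": 5000, "seats": 2000}

        best = max(cases, key=lambda c: global_similarity(c, query, weights, ranges))
        print("estimated unit cost:", best["cost_per_m2"])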

  3. Comparing K-mer based methods for improved classification of 16S sequences.

    PubMed

    Vinje, Hilde; Liland, Kristian Hovde; Almøy, Trygve; Snipen, Lars

    2015-07-01

    The need for precise and stable taxonomic classification is highly relevant in modern microbiology. Parallel to the explosion in the amount of accessible sequence data, there has also been a shift in focus for classification methods. Previously, alignment-based methods were the most applicable tools. Now, methods based on counting K-mers by sliding windows are the most interesting classification approach with respect to both speed and accuracy. Here, we present a systematic comparison of five different K-mer based classification methods for the 16S rRNA gene. The methods differ from each other both in data usage and in modelling strategies. We have based our study on the commonly known and well-used naïve Bayes classifier from the RDP project, and four other methods were implemented and tested on two different data sets, on full-length sequences as well as fragments of typical read length. The differences in classification error between the methods were small, but they were stable across both data sets tested. The Preprocessed nearest-neighbour (PLSNN) method performed best for full-length 16S rRNA sequences, significantly better than the naïve Bayes RDP method. On fragmented sequences the naïve Bayes Multinomial method performed best, significantly better than all other methods. For both data sets explored, and on both full-length and fragmented sequences, all five methods reached an error plateau. We conclude that no K-mer based method is universally best for classifying both full-length sequences and fragments (reads). All methods approach an error plateau, indicating that improved training data are needed to improve classification from here. Classification errors occur most frequently for genera with few sequences present. For improving the taxonomy and testing new classification methods, a better, more universal, and more robust training data set is crucial.
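
    For illustration, counting K-mers with a sliding window — the operation underlying all five classifiers compared above — can be sketched as follows; the sequence and the choice K = 4 are placeholders.

        from collections import Counter

        def kmer_profile(seq, k=4):
            """Count overlapping K-mers in a sequence with a sliding window."""
            seq = seq.upper()
            return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

        # Toy fragment; real classifiers compare such profiles against a training set.
        profile = kmer_profile("ACGTACGGTTACGTA", k=4)
        print(profile.most_common(3))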

  4. Methods and approaches in the topology-based analysis of biological pathways

    PubMed Central

    Mitrea, Cristina; Taghavi, Zeinab; Bokanizad, Behzad; Hanoudi, Samer; Tagett, Rebecca; Donato, Michele; Voichiţa, Călin; Drăghici, Sorin

    2013-01-01

    The goal of pathway analysis is to identify the pathways significantly impacted in a given phenotype. Many current methods are based on algorithms that consider pathways as simple gene lists, dramatically under-utilizing the knowledge that such pathways are meant to capture. During the past few years, a plethora of methods claiming to incorporate various aspects of the pathway topology have been proposed. These topology-based methods, sometimes referred to as “third generation,” have the potential to better model the phenomena described by pathways. Although there is now a large variety of approaches used for this purpose, no review is currently available to offer guidance for potential users and developers. This review covers 22 such topology-based pathway analysis methods published in the last decade. We compare these methods based on: type of pathways analyzed (e.g., signaling or metabolic), input (subset of genes, all genes, fold changes, gene p-values, etc.), mathematical models, pathway scoring approaches, output (one or more pathway scores, p-values, etc.) and implementation (web-based, standalone, etc.). We identify and discuss challenges, arising both in methodology and in pathway representation, including inconsistent terminology, different data formats, lack of meaningful benchmarks, and the lack of tissue and condition specificity. PMID:24133454

  5. LEAKAGE CHARACTERISTICS OF BASE OF RIVERBANK BY SELF POTENTIAL METHOD AND EXAMINATION OF EFFECTIVENESS OF SELF POTENTIAL METHOD TO HEALTH MONITORING OF BASE OF RIVERBANK

    NASA Astrophysics Data System (ADS)

    Matsumoto, Kensaku; Okada, Takashi; Takeuchi, Atsuo; Yazawa, Masato; Uchibori, Sumio; Shimizu, Yoshihiko

    Field measurements with the Self Potential Method using a copper sulfate electrode were performed at the base of a riverbank of the WATARASE River, which has a leakage problem, in order to examine the leakage characteristics. The measurement results showed a typical S-shape, which indicates the existence of flowing groundwater. The results agreed well with measurements by the Ministry of Land, Infrastructure and Transport. Results of 1 m depth ground temperature detection and Chain-Array detection showed good agreement with the results of the Self Potential Method. The correlation between Self Potential value and groundwater velocity was examined in a model experiment, and an apparent correlation was found. These results indicate that the Self Potential Method is an effective method for examining the groundwater characteristics at the base of a riverbank with a leakage problem.

  6. Efficient method of image edge detection based on FSVM

    NASA Astrophysics Data System (ADS)

    Cai, Aiping; Xiong, Xiaomei

    2013-07-01

    For efficient object edge detection in digital images, this paper studies traditional methods and an SVM-based algorithm. The analysis shows that the Canny edge detection algorithm produces pseudo-edges and has poor noise immunity. To provide a reliable edge extraction method, a new detection algorithm based on FSVM (fuzzy SVM) is proposed. It contains several steps: first, the classifier training samples are prepared and different membership values are assigned to different samples. Then, a new training set is formed by increasing the penalty on misclassified sub-samples, and the new FSVM classification model is trained and tested on it. Finally, the edges of the object image are extracted using the model. Experimental results show that good edge detection images are obtained, and experiments with added noise show that the method has good noise robustness.
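
    As a loose analogue of the fuzzy-membership idea (not the paper's FSVM formulation), the sketch below weights training samples by an assumed membership value via scikit-learn's per-sample weights; the features, labels, and membership values are synthetic placeholders.

        import numpy as np
        from sklearn.svm import SVC

        # X: feature vectors for candidate edge pixels (e.g., local gradients),
        # y: 1 = true edge, 0 = non-edge; synthetic stand-ins here.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(400, 6))
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

        # Fuzzy memberships: samples believed to be noisy or mislabelled get lower
        # weight, which roughly mimics what an FSVM membership function does.
        membership = np.clip(1.0 - 0.2 * rng.random(400), 0.1, 1.0)

        clf = SVC(kernel="rbf", C=10.0)
        clf.fit(X, y, sample_weight=membership)
        print("training accuracy:", clf.score(X, y))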

  7. A Comparative Study of Two Azimuth Based Non Standard Location Methods

    DTIC Science & Technology

    2017-03-23

    The so-called "Yin Zhong Xian" ("引中线" in Chinese) algorithm, hereafter the YZX method, is an Oriental version of the IPB-based procedure.

  8. Molecular cancer classification using a meta-sample-based regularized robust coding method.

    PubMed

    Wang, Shu-Lin; Sun, Liuchao; Fang, Jianwen

    2014-01-01

    Previous studies have demonstrated that machine learning based molecular cancer classification using gene expression profiling (GEP) data is promising for the clinical diagnosis and treatment of cancer. Novel classification methods with high efficiency and prediction accuracy are still needed to deal with the high dimensionality and small sample size of typical GEP data. Recently the sparse representation (SR) method has been successfully applied to cancer classification. Nevertheless, its efficiency needs to be improved when analyzing large-scale GEP data. In this paper we present the meta-sample-based regularized robust coding classification (MRRCC), a novel effective cancer classification technique that combines the idea of the meta-sample-based cluster method with the regularized robust coding (RRC) method. It assumes that the coding residual and the coding coefficient are respectively independent and identically distributed. Similar to meta-sample-based SR classification (MSRC), MRRCC extracts a set of meta-samples from the training samples, and then encodes a testing sample as the sparse linear combination of these meta-samples. The representation fidelity is measured by the l2-norm or l1-norm of the coding residual. Extensive experiments on publicly available GEP datasets demonstrate that the proposed method is more efficient while its prediction accuracy is equivalent to existing MSRC-based methods and better than other state-of-the-art dimension reduction based methods.

  9. Comparative study on gene set and pathway topology-based enrichment methods.

    PubMed

    Bayerlová, Michaela; Jung, Klaus; Kramer, Frank; Klemm, Florian; Bleckmann, Annalen; Beißbarth, Tim

    2015-10-22

    Enrichment analysis is a popular approach to identify pathways or sets of genes which are significantly enriched in the context of differentially expressed genes. The traditional gene set enrichment approach considers a pathway as a simple gene list disregarding any knowledge of gene or protein interactions. In contrast, the new group of so called pathway topology-based methods integrates the topological structure of a pathway into the analysis. We comparatively investigated gene set and pathway topology-based enrichment approaches, considering three gene set and four topological methods. These methods were compared in two extensive simulation studies and on a benchmark of 36 real datasets, providing the same pathway input data for all methods. In the benchmark data analysis both types of methods showed a comparable ability to detect enriched pathways. The first simulation study was conducted with KEGG pathways, which showed considerable gene overlaps between each other. In this study with original KEGG pathways, none of the topology-based methods outperformed the gene set approach. Therefore, a second simulation study was performed on non-overlapping pathways created by unique gene IDs. Here, methods accounting for pathway topology reached higher accuracy than the gene set methods, however their sensitivity was lower. We conducted one of the first comprehensive comparative works on evaluating gene set against pathway topology-based enrichment methods. The topological methods showed better performance in the simulation scenarios with non-overlapping pathways, however, they were not conclusively better in the other scenarios. This suggests that simple gene set approach might be sufficient to detect an enriched pathway under realistic circumstances. Nevertheless, more extensive studies and further benchmark data are needed to systematically evaluate these methods and to assess what gain and cost pathway topology information introduces into enrichment analysis. Both

  10. Comparative Analysis of a Principal Component Analysis-Based and an Artificial Neural Network-Based Method for Baseline Removal.

    PubMed

    Carvajal, Roberto C; Arias, Luis E; Garces, Hugo O; Sbarbaro, Daniel G

    2016-04-01

    This work presents a non-parametric method based on a principal component analysis (PCA) and a parametric one based on artificial neural networks (ANN) to remove continuous baseline features from spectra. The non-parametric method estimates the baseline based on a set of sampled basis vectors obtained from PCA applied over a previously composed continuous spectra learning matrix. The parametric method, however, uses an ANN to filter out the baseline. Previous studies have demonstrated that this method is one of the most effective for baseline removal. The evaluation of both methods was carried out by using a synthetic database designed for benchmarking baseline removal algorithms, containing 100 synthetic composed spectra at different signal-to-baseline ratios (SBR), signal-to-noise ratios (SNR), and baseline slopes. In addition, to demonstrate the utility of the proposed methods and to compare them in a real application, a spectral data set measured from a flame radiation process was used. Several performance metrics such as correlation coefficient, chi-square value, and goodness-of-fit coefficient were calculated to quantify and compare both algorithms. Results demonstrate that the PCA-based method outperforms the one based on ANN both in terms of performance and simplicity. © The Author(s) 2016.
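
    A crude sketch of the PCA-based branch is given below, under stated assumptions: the baseline learning matrix is synthetic, three components are kept arbitrarily, and the baseline is estimated by projecting the whole measured spectrum onto the baseline basis, which is a simplification of the published procedure.

        import numpy as np
        from sklearn.decomposition import PCA

        # Learning matrix of baseline-only spectra (rows); synthetic smooth curves here.
        x = np.linspace(0, 1, 300)
        baselines = np.array([a * x + b * x**2 for a, b in np.random.rand(50, 2)])

        pca = PCA(n_components=3).fit(baselines)

        def remove_baseline(spectrum):
            """Project onto the PCA baseline basis and subtract the reconstruction."""
            baseline = pca.inverse_transform(pca.transform(spectrum[None, :]))[0]
            return spectrum - baseline, baseline

        # Measured spectrum = narrow peak + unknown smooth baseline.
        measured = np.exp(-((x - 0.4) / 0.01) ** 2) + 0.8 * x
        corrected, estimated_baseline = remove_baseline(measured)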

  11. Design Tool Using a New Optimization Method Based on a Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    Conventional optimization methods are based on a deterministic approach, since their purpose is to find an exact solution. However, such methods depend on initial conditions and risk falling into a local solution. In this paper, we propose a new optimization method based on the concept of path integrals used in quantum mechanics. The method obtains a solution as an expected value (stochastic average) using a stochastic process. The advantages of this method are that it is not affected by initial conditions and does not require experience-based tuning. We applied the new optimization method to a hang glider design. In this problem, both the hang glider design and its flight trajectory were optimized. The numerical calculation results show that the performance of the method is sufficient for practical use.
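
    As a loose numerical analogue of obtaining the solution as a stochastic average (not the authors' path-integral formulation), the sketch below estimates the minimizer of a multi-modal test function as a Boltzmann-weighted average of uniformly drawn samples; the objective, sample count, and inverse temperature are all assumptions.

        import numpy as np

        def f(x):
            """Multi-modal test objective with several local minima."""
            return 0.1 * x**2 + np.sin(3.0 * x)

        # Draw samples over the whole search interval, then take the Boltzmann-weighted
        # average: for a sufficiently low temperature the average concentrates near the
        # global minimum, with no dependence on a starting point.
        rng = np.random.default_rng(2)
        samples = rng.uniform(-10.0, 10.0, size=200_000)
        beta = 25.0                                  # inverse "temperature" (assumed)
        fs = f(samples)
        w = np.exp(-beta * (fs - fs.min()))
        x_star = np.sum(w * samples) / np.sum(w)
        print("estimated optimum:", x_star, "objective:", f(x_star))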

  12. Comparing the Principle-Based SBH Maieutic Method to Traditional Case Study Methods of Teaching Media Ethics

    ERIC Educational Resources Information Center

    Grant, Thomas A.

    2012-01-01

    This quasi-experimental study at a Northwest university compared two methods of teaching media ethics, a class taught with the principle-based SBH Maieutic Method (n = 25) and a class taught with a traditional case study method (n = 27), with a control group (n = 21) that received no ethics training. Following a 16-week intervention, a one-way…

  13. [Application of case-based method in genetics and eugenics teaching].

    PubMed

    Li, Ya-Xuan; Zhao, Xin; Zhang, Fei-Xiong; Hu, Ying-Kao; Yan, Yue-Ming; Cai, Min-Hua; Li, Xiao-Hui

    2012-05-01

    Genetics and Eugenics is a cross-discipline between genetics and eugenics. It is a common curriculum in many Chinese universities. In order to increase the learning interest, we introduced case teaching method and got a better teaching effect. Based on our teaching practices, we summarized some experiences about this subject. In this article, the main problem of case-based method applied in Genetics and Eugenics teaching was discussed.

  14. Comparison of Standard Culture-Based Method to Culture-Independent Method for Evaluation of Hygiene Effects on the Hand Microbiome.

    PubMed

    Zapka, C; Leff, J; Henley, J; Tittl, J; De Nardo, E; Butler, M; Griggs, R; Fierer, N; Edmonds-Wilson, S

    2017-03-28

    Hands play a critical role in the transmission of microbiota on one's own body, between individuals, and on environmental surfaces. Effectively measuring the composition of the hand microbiome is important to hand hygiene science, which has implications for human health. Hand hygiene products are evaluated using standard culture-based methods, but standard test methods for culture-independent microbiome characterization are lacking. We sampled the hands of 50 participants using swab-based and glove-based methods prior to and following four hand hygiene treatments (using a nonantimicrobial hand wash, alcohol-based hand sanitizer [ABHS], a 70% ethanol solution, or tap water). We compared results among culture plate counts, 16S rRNA gene sequencing of DNA extracted directly from hands, and sequencing of DNA extracted from culture plates. Glove-based sampling yielded higher numbers of unique operational taxonomic units (OTUs) but had less diversity in bacterial community composition than swab-based sampling. We detected treatment-induced changes in diversity only by using swab-based samples ( P < 0.001); we were unable to detect changes with glove-based samples. Bacterial cell counts significantly decreased with use of the ABHS ( P < 0.05) and ethanol control ( P < 0.05). Skin hydration at baseline correlated with bacterial abundances, bacterial community composition, pH, and redness across subjects. The importance of the method choice was substantial. These findings are important to ensure improvement of hand hygiene industry methods and for future hand microbiome studies. On the basis of our results and previously published studies, we propose recommendations for best practices in hand microbiome research. IMPORTANCE The hand microbiome is a critical area of research for diverse fields, such as public health and forensics. The suitability of culture-independent methods for assessing effects of hygiene products on microbiota has not been demonstrated. This is the

  15. An endoscopic diffuse optical tomographic method with high resolution based on the improved FOCUSS method

    NASA Astrophysics Data System (ADS)

    Qin, Zhuanping; Ma, Wenjuan; Ren, Shuyan; Geng, Liqing; Li, Jing; Yang, Ying; Qin, Yingmei

    2017-02-01

    Endoscopic DOT has the potential to be applied to cancer-related imaging in tubular organs. Although DOT has a relatively large tissue penetration depth, endoscopic DOT is limited by the narrow space inside the tubular tissue and hence by a relatively small penetration depth. Because some adenocarcinomas, including cervical adenocarcinoma, are located deep in the canal, it is necessary to improve the imaging resolution under this limited measurement condition. To improve the resolution, a new FOCUSS algorithm is developed along with the image reconstruction algorithm based on the effective detection range (EDR). This algorithm is based on the region of interest (ROI) to reduce the dimensions of the matrix, and the shrinking method cuts down the computational burden. To reduce the computational complexity, a double conjugate gradient method is used in the matrix inversion. For a typical inner size and optical properties of cervix-like tubular tissue, reconstructed images from the simulation data demonstrate that the proposed method achieves image quality equivalent to that obtained from the method based on EDR when the target is close to the inner boundary of the model, and higher spatial resolution and quantitative ratio when the targets are far from the inner boundary. The quantitative ratios of the reconstructed absorption and reduced scattering coefficients can reach up to 70% and 80%, respectively, within 5 mm depth. Furthermore, two close targets at different depths can be separated from each other. The proposed method will be useful for the development of endoscopic DOT technologies in tubular organs.

  16. Multi person detection and tracking based on hierarchical level-set method

    NASA Astrophysics Data System (ADS)

    Khraief, Chadia; Benzarti, Faouzi; Amiri, Hamid

    2018-04-01

    In this paper, we propose an efficient unsupervised method for multi-person tracking based on a hierarchical level-set approach. The proposed method uses both edge and region information in order to effectively detect objects. The persons are tracked in each frame of the sequence by minimizing an energy functional that combines color, texture, and shape information. These features are enrolled in a covariance matrix as the region descriptor. The present method is fully automated, without the need to manually specify the initial level-set contour. It is based on combined person detection and background subtraction methods. The edge-based term is employed to maintain a stable evolution, guide the segmentation towards apparent boundaries, and inhibit region fusion. The computational cost of the level set is reduced by using a narrow band technique. Experiments performed on challenging video sequences show the effectiveness of the proposed method.

  17. Comparison of a New Cobinamide-Based Method to a Standard Laboratory Method for Measuring Cyanide in Human Blood

    PubMed Central

    Swezey, Robert; Shinn, Walter; Green, Carol; Drover, David R.; Hammer, Gregory B.; Schulman, Scott R.; Zajicek, Anne; Jett, David A.; Boss, Gerry R.

    2013-01-01

    Most hospital laboratories do not measure blood cyanide concentrations, and samples must be sent to reference laboratories. A simple method is needed for measuring cyanide in hospitals. The authors previously developed a method to quantify cyanide based on the high binding affinity of the vitamin B12 analog, cobinamide, for cyanide and a major spectral change observed for cyanide-bound cobinamide. This method is now validated in human blood, and the findings include a mean inter-assay accuracy of 99.1%, precision of 8.75% and a lower limit of quantification of 3.27 µM cyanide. The method was applied to blood samples from children treated with sodium nitroprusside and it yielded measurable results in 88 of 172 samples (51%), whereas the reference laboratory yielded results in only 19 samples (11%). In all 19 samples, the cobinamide-based method also yielded measurable results. The two methods showed reasonable agreement when analyzed by linear regression, but not when analyzed by a standard error of the estimate or paired t-test. Differences in results between the two methods may be because samples were assayed at different times on different sample types. The cobinamide-based method is applicable to human blood, and can be used in hospital laboratories and emergency rooms. PMID:23653045

  18. Deghosting based on the transmission matrix method

    NASA Astrophysics Data System (ADS)

    Wang, Benfeng; Wu, Ru-Shan; Chen, Xiaohong

    2017-12-01

    As seismic exploration and subsequent exploitation develop, marine acquisition systems with towed streamers have become an important means of seismic data acquisition. However, the air-water reflective interface generates surface-related multiples, including ghosts, which can affect the accuracy and performance of subsequent seismic data processing algorithms. Thus, we derive a deghosting method from a new perspective, i.e. using the transmission matrix (T-matrix) method instead of inverse scattering series. The T-matrix-based deghosting algorithm includes all scattering effects and converges absolutely. Initially, the effectiveness of the proposed method is demonstrated using synthetic data obtained from a designed layered model, and its noise resistance is also illustrated using synthetic data contaminated by random noise. Numerical examples on complicated data from the open SMAART Pluto model and on field marine data further demonstrate the validity and flexibility of the proposed method. After deghosting, low-frequency components are recovered reasonably and spurious high-frequency components are attenuated; the recovered low-frequency components will be useful for subsequent full waveform inversion. The proposed deghosting method is currently suitable for two-dimensional towed-streamer cases with accurate constant depth information, and its extension to variable-depth streamers in three-dimensional cases will be studied in the future.

  19. A Study of Impact Point Detecting Method Based on Seismic Signal

    NASA Astrophysics Data System (ADS)

    Huo, Pengju; Zhang, Yu; Xu, Lina; Huang, Yong

    The projectile landing position has to be determined for projectile recovery and range measurement in targeting tests. In this paper, a global search method based on the velocity variance is proposed. In order to verify the applicability of this method, a simulation analysis over an area of four million square meters was conducted with the same array structure as the commonly used linear positioning method, and MATLAB was used to compare and analyze the two methods. The simulation results show that the global search method based on the velocity variance has high positioning accuracy and stability, which can meet the needs of impact point location.
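
    One plausible reading of the velocity-variance criterion is sketched below (an illustration only, not necessarily the paper's algorithm): for each candidate grid point, the apparent propagation velocities implied by pairwise arrival-time differences are computed, and the point minimizing their variance is taken as the impact point. The sensor layout, arrival picks, and assumed propagation speed are synthetic.

        import numpy as np
        from itertools import combinations

        # Sensor positions (m) and first-arrival picks (s): illustrative values only.
        sensors = np.array([[0, 0], [1000, 0], [0, 1000], [1000, 1000]], float)
        true_src, v_true = np.array([420.0, 650.0]), 340.0
        picks = np.linalg.norm(sensors - true_src, axis=1) / v_true

        def velocity_variance(point):
            """Variance of pairwise apparent velocities; zero at the true impact point."""
            d = np.linalg.norm(sensors - point, axis=1)
            v = [(d[i] - d[j]) / (picks[i] - picks[j])
                 for i, j in combinations(range(len(sensors)), 2)
                 if abs(picks[i] - picks[j]) > 1e-9]
            return np.var(v)

        # Exhaustive (global) grid search over the test range.
        xs = ys = np.arange(0.0, 1000.0, 10.0)
        grid = [(velocity_variance(np.array([x, y])), x, y) for x in xs for y in ys]
        print("estimated impact point:", min(grid)[1:])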

  20. Method of removing and detoxifying a phosphorus-based substance

    DOEpatents

    Vandegrift, G.F.; Steindler, M.J.

    1985-05-21

    A method of removing a phosphorus-based poisonous substance from contaminated water is presented. In addition, the toxicity of the phosphorus-based substance is subsequently destroyed. A water-immiscible organic solvent is first immobilized on a supported liquid membrane before the contaminated water is contacted with one side of the supported liquid membrane to absorb the phosphorus-based substance into the organic solvent. The other side of the supported liquid membrane is contacted with a hydroxy-affording strong base to react with the phosphorus-based solvated species to form a non-toxic product.

  1. Discovering, Indexing and Interlinking Information Resources

    PubMed Central

    Celli, Fabrizio; Keizer, Johannes; Jaques, Yves; Konstantopoulos, Stasinos; Vudragović, Dušan

    2015-01-01

    The social media revolution is having a dramatic effect on the world of scientific publication. Scientists now publish their research interests, theories and outcomes across numerous channels, including personal blogs and other thematic web spaces where ideas, activities and partial results are discussed. Accordingly, information systems that facilitate access to scientific literature must learn to cope with this valuable and varied data, evolving to make this research easily discoverable and available to end users. In this paper we describe the incremental process of discovering web resources in the domain of agricultural science and technology. Making use of Linked Open Data methodologies, we interlink a wide array of custom-crawled resources with the AGRIS bibliographic database in order to enrich the user experience of the AGRIS website. We also discuss the SemaGrow Stack, a query federation and data integration infrastructure used to estimate the semantic distance between crawled web resources and AGRIS. PMID:26834982

  2. A Hybrid Method for Pancreas Extraction from CT Image Based on Level Set Methods

    PubMed Central

    Tan, Hanqing; Fujita, Hiroshi

    2013-01-01

    This paper proposes a novel semiautomatic method to extract the pancreas from abdominal CT images. Traditional level set and region growing methods, which require the initial contour to be located near the final boundary of the object, suffer from leakage into tissues near the pancreas region. The proposed method combines a customized fast-marching level set method, which generates an optimal initial pancreas region to address the sensitivity of level set methods to the initial contour location, with a modified distance-regularized level set method that extracts the pancreas accurately. The novelty of our method lies in the proper selection and combination of level set methods; furthermore, an energy-decrement algorithm and an energy-tune algorithm are proposed to reduce the negative impact of the bonding force caused by connected tissue whose intensity is similar to that of the pancreas. As a result, our method overcomes the shortcomings of oversegmentation at weak boundaries and can accurately extract the pancreas from CT images. The proposed method is compared to five other state-of-the-art medical image segmentation methods on a CT image dataset containing abdominal images from 10 patients. The evaluation results demonstrate that our method outperforms the other methods by achieving higher accuracy and making fewer false segmentations in pancreas extraction. PMID:24066016

  3. System and method for integrating hazard-based decision making tools and processes

    DOEpatents

    Hodgin, C Reed [Westminster, CO

    2012-03-20

    A system and method for inputting, analyzing, and disseminating information necessary for identified decision-makers to respond to emergency situations. This system and method provides consistency and integration among multiple groups, and may be used for both initial consequence-based decisions and follow-on consequence-based decisions. The system and method in a preferred embodiment also provides tools for accessing and manipulating information that are appropriate for each decision-maker, in order to achieve more reasoned and timely consequence-based decisions. The invention includes processes for designing and implementing a system or method for responding to emergency situations.

  4. A MULTICORE BASED PARALLEL IMAGE REGISTRATION METHOD

    PubMed Central

    Yang, Lin; Gong, Leiguang; Zhang, Hong; Nosher, John L.; Foran, David J.

    2012-01-01

    Image registration is a crucial step for many image-assisted clinical applications such as surgery planning and treatment evaluation. In this paper we propose a landmark-based nonlinear image registration algorithm for matching 2D image pairs. The algorithm was shown to be effective and robust under conditions of large deformations. In landmark-based registration, the most important step is establishing the correspondence among the selected landmark points. This usually requires an extensive search which is often computationally expensive. We introduce a nonregular data partition algorithm using the K-means clustering algorithm to group the landmarks based on the number of available processing cores. This step optimizes memory usage and data transfer. We have tested our method using the IBM Cell Broadband Engine (Cell/B.E.) platform. PMID:19964921
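
    A minimal sketch of the K-means landmark partitioning step is shown below, using scikit-learn on synthetic landmark coordinates; grouping landmarks into one spatially compact cluster per core is the idea described above, while the actual correspondence search and Cell/B.E. specifics are omitted.

        import numpy as np
        from sklearn.cluster import KMeans
        from multiprocessing import cpu_count

        # Landmark coordinates detected in the fixed image (synthetic stand-ins).
        rng = np.random.default_rng(3)
        landmarks = rng.uniform(0, 512, size=(200, 2))

        # Partition landmarks into one group per available core so each core searches
        # for correspondences within a spatially compact subset.
        n_cores = cpu_count()
        labels = KMeans(n_clusters=n_cores, n_init=10, random_state=0).fit_predict(landmarks)
        groups = [landmarks[labels == k] for k in range(n_cores)]
        print([len(g) for g in groups])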

  5. Comparison of Standard Culture-Based Method to Culture-Independent Method for Evaluation of Hygiene Effects on the Hand Microbiome

    PubMed Central

    Leff, J.; Henley, J.; Tittl, J.; De Nardo, E.; Butler, M.; Griggs, R.; Fierer, N.

    2017-01-01

    ABSTRACT Hands play a critical role in the transmission of microbiota on one’s own body, between individuals, and on environmental surfaces. Effectively measuring the composition of the hand microbiome is important to hand hygiene science, which has implications for human health. Hand hygiene products are evaluated using standard culture-based methods, but standard test methods for culture-independent microbiome characterization are lacking. We sampled the hands of 50 participants using swab-based and glove-based methods prior to and following four hand hygiene treatments (using a nonantimicrobial hand wash, alcohol-based hand sanitizer [ABHS], a 70% ethanol solution, or tap water). We compared results among culture plate counts, 16S rRNA gene sequencing of DNA extracted directly from hands, and sequencing of DNA extracted from culture plates. Glove-based sampling yielded higher numbers of unique operational taxonomic units (OTUs) but had less diversity in bacterial community composition than swab-based sampling. We detected treatment-induced changes in diversity only by using swab-based samples (P < 0.001); we were unable to detect changes with glove-based samples. Bacterial cell counts significantly decreased with use of the ABHS (P < 0.05) and ethanol control (P < 0.05). Skin hydration at baseline correlated with bacterial abundances, bacterial community composition, pH, and redness across subjects. The importance of the method choice was substantial. These findings are important to ensure improvement of hand hygiene industry methods and for future hand microbiome studies. On the basis of our results and previously published studies, we propose recommendations for best practices in hand microbiome research. PMID:28351915

  6. Method for rapid base sequencing in DNA and RNA

    DOEpatents

    Jett, J.H.; Keller, R.A.; Martin, J.C.; Moyzis, R.K.; Ratliff, R.L.; Shera, E.B.; Stewart, C.C.

    1987-10-07

    A method is provided for the rapid base sequencing of DNA or RNA fragments wherein a single fragment of DNA or RNA is provided with identifiable bases and suspended in a moving flow stream. An exonuclease sequentially cleaves individual bases from the end of the suspended fragment. The moving flow stream maintains the cleaved bases in an orderly train for subsequent detection and identification. In a particular embodiment, individual bases forming the DNA or RNA fragments are individually tagged with a characteristic fluorescent dye. The train of bases is then excited to fluorescence with an output spectrum characteristic of the individual bases. Accordingly, the base sequence of the original DNA or RNA fragment can be reconstructed. 2 figs.

  7. Method for rapid base sequencing in DNA and RNA

    DOEpatents

    Jett, J.H.; Keller, R.A.; Martin, J.C.; Moyzis, R.K.; Ratliff, R.L.; Shera, E.B.; Stewart, C.C.

    1990-10-09

    A method is provided for the rapid base sequencing of DNA or RNA fragments wherein a single fragment of DNA or RNA is provided with identifiable bases and suspended in a moving flow stream. An exonuclease sequentially cleaves individual bases from the end of the suspended fragment. The moving flow stream maintains the cleaved bases in an orderly train for subsequent detection and identification. In a particular embodiment, individual bases forming the DNA or RNA fragments are individually tagged with a characteristic fluorescent dye. The train of bases is then excited to fluorescence with an output spectrum characteristic of the individual bases. Accordingly, the base sequence of the original DNA or RNA fragment can be reconstructed. 2 figs.

  8. Method for rapid base sequencing in DNA and RNA

    DOEpatents

    Jett, James H.; Keller, Richard A.; Martin, John C.; Moyzis, Robert K.; Ratliff, Robert L.; Shera, E. Brooks; Stewart, Carleton C.

    1990-01-01

    A method is provided for the rapid base sequencing of DNA or RNA fragments wherein a single fragment of DNA or RNA is provided with identifiable bases and suspended in a moving flow stream. An exonuclease sequentially cleaves individual bases from the end of the suspended fragment. The moving flow stream maintains the cleaved bases in an orderly train for subsequent detection and identification. In a particular embodiment, individual bases forming the DNA or RNA fragments are individually tagged with a characteristic fluorescent dye. The train of bases is then excited to fluorescence with an output spectrum characteristic of the individual bases. Accordingly, the base sequence of the original DNA or RNA fragment can be reconstructed.

  9. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    PubMed

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-06-01

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. MFAM: Multiple Frequency Adaptive Model-Based Indoor Localization Method.

    PubMed

    Tuta, Jure; Juric, Matjaz B

    2018-03-24

    This paper presents MFAM (Multiple Frequency Adaptive Model-based localization method), a novel model-based indoor localization method that is capable of using multiple wireless signal frequencies simultaneously. It utilizes indoor architectural model and physical properties of wireless signal propagation through objects and space. The motivation for developing multiple frequency localization method lies in the future Wi-Fi standards (e.g., 802.11ah) and the growing number of various wireless signals present in the buildings (e.g., Wi-Fi, Bluetooth, ZigBee, etc.). Current indoor localization methods mostly rely on a single wireless signal type and often require many devices to achieve the necessary accuracy. MFAM utilizes multiple wireless signal types and improves the localization accuracy over the usage of a single frequency. It continuously monitors signal propagation through space and adapts the model according to the changes indoors. Using multiple signal sources lowers the required number of access points for a specific signal type while utilizing signals, already present in the indoors. Due to the unavailability of the 802.11ah hardware, we have evaluated proposed method with similar signals; we have used 2.4 GHz Wi-Fi and 868 MHz HomeMatic home automation signals. We have performed the evaluation in a modern two-bedroom apartment and measured mean localization error 2.0 to 2.3 m and median error of 2.0 to 2.2 m. Based on our evaluation results, using two different signals improves the localization accuracy by 18% in comparison to 2.4 GHz Wi-Fi-only approach. Additional signals would improve the accuracy even further. We have shown that MFAM provides better accuracy than competing methods, while having several advantages for real-world usage.

  11. MFAM: Multiple Frequency Adaptive Model-Based Indoor Localization Method

    PubMed Central

    Juric, Matjaz B.

    2018-01-01

    This paper presents MFAM (Multiple Frequency Adaptive Model-based localization method), a novel model-based indoor localization method that is capable of using multiple wireless signal frequencies simultaneously. It utilizes indoor architectural model and physical properties of wireless signal propagation through objects and space. The motivation for developing multiple frequency localization method lies in the future Wi-Fi standards (e.g., 802.11ah) and the growing number of various wireless signals present in the buildings (e.g., Wi-Fi, Bluetooth, ZigBee, etc.). Current indoor localization methods mostly rely on a single wireless signal type and often require many devices to achieve the necessary accuracy. MFAM utilizes multiple wireless signal types and improves the localization accuracy over the usage of a single frequency. It continuously monitors signal propagation through space and adapts the model according to the changes indoors. Using multiple signal sources lowers the required number of access points for a specific signal type while utilizing signals, already present in the indoors. Due to the unavailability of the 802.11ah hardware, we have evaluated proposed method with similar signals; we have used 2.4 GHz Wi-Fi and 868 MHz HomeMatic home automation signals. We have performed the evaluation in a modern two-bedroom apartment and measured mean localization error 2.0 to 2.3 m and median error of 2.0 to 2.2 m. Based on our evaluation results, using two different signals improves the localization accuracy by 18% in comparison to 2.4 GHz Wi-Fi-only approach. Additional signals would improve the accuracy even further. We have shown that MFAM provides better accuracy than competing methods, while having several advantages for real-world usage. PMID:29587352

  12. Connectivity-based, all-hexahedral mesh generation method and apparatus

    DOEpatents

    Tautges, T.J.; Mitchell, S.A.; Blacker, T.D.; Murdoch, P.

    1998-06-16

    The present invention is a computer-based method and apparatus for constructing all-hexahedral finite element meshes for finite element analysis. The present invention begins with a three-dimensional geometry and an all-quadrilateral surface mesh, then constructs hexahedral element connectivity from the outer boundary inward, and then resolves invalid connectivity. The result of the present invention is a complete representation of hex mesh connectivity only; actual mesh node locations are determined later. The basic method of the present invention comprises the step of forming hexahedral elements by making crossings of entities referred to as ``whisker chords.`` This step, combined with a seaming operation in space, is shown to be sufficient for meshing simple block problems. Entities that appear when meshing more complex geometries, namely blind chords, merged sheets, and self-intersecting chords, are described. A method for detecting invalid connectivity in space, based on repeated edges, is also described, along with its application to various cases of invalid connectivity introduced and resolved by the method. 79 figs.

  13. Usability Evaluation Methods for Gesture-Based Games: A Systematic Review

    PubMed Central

    Simor, Fernando Winckler; Brum, Manoela Rogofski; Schmidt, Jaison Dairon Ebertz; De Marchi, Ana Carolina Bertoletti

    2016-01-01

    Background Gestural interaction systems are increasingly being used, mainly in games, expanding the idea of entertainment and providing experiences with the purpose of promoting better physical and/or mental health. Therefore, it is necessary to establish mechanisms for evaluating the usability of these interfaces, which make gestures the basis of interaction, to achieve a balance between functionality and ease of use. Objective This study aims to present the results of a systematic review focused on usability evaluation methods for gesture-based games, considering devices with motion-sensing capability. We considered the usability methods used, the common interface issues, and the strategies adopted to build good gesture-based games. Methods The research was centered on four electronic databases: IEEE, Association for Computing Machinery (ACM), Springer, and Science Direct from September 4 to 21, 2015. Within 1427 studies evaluated, 10 matched the eligibility criteria. As a requirement, we considered studies about gesture-based games, Kinect and/or Wii as devices, and the use of a usability method to evaluate the user interface. Results In the 10 studies found, there was no standardization in the methods because they considered diverse analysis variables. Heterogeneously, authors used different instruments to evaluate gesture-based interfaces and no default approach was proposed. Questionnaires were the most used instruments (70%, 7/10), followed by interviews (30%, 3/10), and observation and video recording (20%, 2/10). Moreover, 60% (6/10) of the studies used gesture-based serious games to evaluate the performance of elderly participants in rehabilitation tasks. This highlights the need for creating an evaluation protocol for older adults to provide a user-friendly interface according to the user’s age and limitations. Conclusions Through this study, we conclude this field is in need of a usability evaluation method for serious games, especially games for

  14. A Case-Based Reasoning Method with Rank Aggregation

    NASA Astrophysics Data System (ADS)

    Sun, Jinhua; Du, Jiao; Hu, Jian

    2018-03-01

    In order to improve the accuracy of case-based reasoning (CBR), this paper proposes a new CBR framework built on the basic principle of rank aggregation. First, ranking methods are defined in each attribute subspace of the cases, yielding an ordering relation between cases on each attribute and hence a ranking matrix. Second, similar case retrieval from the ranking matrix is transformed into a rank aggregation optimization problem, solved with the Kemeny optimal aggregation. On this basis, a rank aggregation case-based reasoning algorithm, named RA-CBR, is designed. Experimental results on UCI data sets show that the case retrieval accuracy of the RA-CBR algorithm is higher than that of Euclidean distance CBR and Mahalanobis distance CBR. We can therefore conclude that the RA-CBR method can increase the performance and efficiency of CBR.
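
    The retrieval idea can be sketched as follows. This is an illustration under assumptions, not the paper's algorithm: the case base and query are made up, and because exact Kemeny aggregation is NP-hard the sketch aggregates the per-attribute rankings with a Borda count instead.

        import numpy as np

        # Rows = stored cases, columns = attributes; the query uses the same layout.
        # Values here are illustrative only.
        cases = np.array([[3.0, 20.0, 0.7],
                          [2.5, 35.0, 0.9],
                          [4.1, 18.0, 0.4],
                          [3.2, 22.0, 0.8]])
        query = np.array([3.0, 21.0, 0.75])

        # Step 1: per-attribute ranking of cases by closeness to the query.
        dist = np.abs(cases - query)                 # (n_cases, n_attributes)
        rank_matrix = dist.argsort(axis=0).argsort(axis=0)

        # Step 2: aggregate the per-attribute rankings.  Exact Kemeny aggregation is
        # NP-hard, so this sketch uses the Borda count as a common approximation.
        borda = rank_matrix.sum(axis=1)
        print("retrieved case index:", int(borda.argmin()))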

  15. Damage evaluation by a guided wave-hidden Markov model based method

    NASA Astrophysics Data System (ADS)

    Mei, Hanfei; Yuan, Shenfang; Qiu, Lei; Zhang, Jinjin

    2016-02-01

    Guided wave based structural health monitoring has shown great potential in aerospace applications. However, one of the key challenges of practical engineering applications is the accurate interpretation of the guided wave signals under time-varying environmental and operational conditions. This paper presents a guided wave-hidden Markov model based method to improve the damage evaluation reliability of real aircraft structures under time-varying conditions. In the proposed approach, an HMM based unweighted moving average trend estimation method, which can capture the trend of damage propagation from the posterior probability obtained by HMM modeling is used to achieve a probabilistic evaluation of the structural damage. To validate the developed method, experiments are performed on a hole-edge crack specimen under fatigue loading condition and a real aircraft wing spar under changing structural boundary conditions. Experimental results show the advantage of the proposed method.

  16. Workshop on In Situ Biogeochemical Transformation of Chlorinated Solvents

    DTIC Science & Technology

    2008-02-01

    sites across the country, and also has its own internal research programs. In situ bioremediation has become a widely-used technology for...concern [e.g., dithionite, sulfate (at high concentrations), pesticides, and agri-chemicals that are residues in mulch used in biowalls, as well as... Bioremediation of Chlorinated Solvents in Groundwater Using a Permeable Mulch Biowall, Operable Unit 1, Altus Air Force Base, Oklahoma. Prepared

  17. Self-Alignment MEMS IMU Method Based on the Rotation Modulation Technique on a Swing Base

    PubMed Central

    Chen, Zhiyong; Yang, Haotian; Wang, Chengbin; Lin, Zhihui; Guo, Meifeng

    2018-01-01

    The micro-electro-mechanical-system (MEMS) inertial measurement unit (IMU) has been widely used in the field of inertial navigation due to its small size, low cost, and light weight, but aligning MEMS IMUs remains a challenge for researchers. MEMS IMUs have been conventionally aligned on a static base, requiring other sensors, such as magnetometers or satellites, to provide auxiliary information, which limits its application range to some extent. Therefore, improving the alignment accuracy of MEMS IMU as much as possible under swing conditions is of considerable value. This paper proposes an alignment method based on the rotation modulation technique (RMT), which is completely self-aligned, unlike the existing alignment techniques. The effect of the inertial sensor errors is mitigated by rotating the IMU. Then, inertial frame-based alignment using the rotation modulation technique (RMT-IFBA) achieved coarse alignment on the swing base. The strong tracking filter (STF) further improved the alignment accuracy. The performance of the proposed method was validated with a physical experiment, and the results of the alignment showed that the standard deviations of pitch, roll, and heading angle were 0.0140°, 0.0097°, and 0.91°, respectively, which verified the practicality and efficacy of the proposed method for the self-alignment of the MEMS IMU on a swing base. PMID:29649150

  18. Project-Based Learning in Undergraduate Environmental Chemistry Laboratory: Using EPA Methods to Guide Student Method Development for Pesticide Quantitation

    ERIC Educational Resources Information Center

    Davis, Eric J.; Pauls, Steve; Dick, Jonathan

    2017-01-01

    Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…

  19. Needs and Opportunities for Uncertainty-Based Multidisciplinary Design Methods for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Zang, Thomas A.; Hemsch, Michael J.; Hilburger, Mark W.; Kenny, Sean P; Luckring, James M.; Maghami, Peiman; Padula, Sharon L.; Stroud, W. Jefferson

    2002-01-01

    This report consists of a survey of the state of the art in uncertainty-based design together with recommendations for a Base research activity in this area for the NASA Langley Research Center. This report identifies the needs and opportunities for computational and experimental methods that provide accurate, efficient solutions to nondeterministic multidisciplinary aerospace vehicle design problems. Barriers to the adoption of uncertainty-based design methods are identified, and the benefits of the use of such methods are explained. Particular research needs are listed.

  20. Comparison of landmark-based and automatic methods for cortical surface registration

    PubMed Central

    Pantazis, Dimitrios; Joshi, Anand; Jiang, Jintao; Shattuck, David; Bernstein, Lynne E.; Damasio, Hanna; Leahy, Richard M.

    2009-01-01

    Group analysis of structure or function in cerebral cortex typically involves as a first step the alignment of the cortices. A surface based approach to this problem treats the cortex as a convoluted surface and coregisters across subjects so that cortical landmarks or features are aligned. This registration can be performed using curves representing sulcal fundi and gyral crowns to constrain the mapping. Alternatively, registration can be based on the alignment of curvature metrics computed over the entire cortical surface. The former approach typically involves some degree of user interaction in defining the sulcal and gyral landmarks while the latter methods can be completely automated. Here we introduce a cortical delineation protocol consisting of 26 consistent landmarks spanning the entire cortical surface. We then compare the performance of a landmark-based registration method that uses this protocol with that of two automatic methods implemented in the software packages FreeSurfer and BrainVoyager. We compare performance in terms of discrepancy maps between the different methods, the accuracy with which regions of interest are aligned, and the ability of the automated methods to correctly align standard cortical landmarks. Our results show similar performance for ROIs in the perisylvian region for the landmark based method and FreeSurfer. However, the discrepancy maps showed larger variability between methods in occipital and frontal cortex and also that automated methods often produce misalignment of standard cortical landmarks. Consequently, selection of the registration approach should consider the importance of accurate sulcal alignment for the specific task for which coregistration is being performed. When automatic methods are used, the users should ensure that sulci in regions of interest in their studies are adequately aligned before proceeding with subsequent analysis. PMID:19796696

  1. Web-based emergency response exercise management systems and methods thereof

    DOEpatents

    Goforth, John W.; Mercer, Michael B.; Heath, Zach; Yang, Lynn I.

    2014-09-09

    According to one embodiment, a method for simulating portions of an emergency response exercise includes generating situational awareness outputs associated with a simulated emergency and sending the situational awareness outputs to a plurality of output devices. Also, the method includes outputting to a user device a plurality of decisions associated with the situational awareness outputs at a decision point, receiving a selection of one of the decisions from the user device, generating new situational awareness outputs based on the selected decision, and repeating the sending, outputting and receiving steps based on the new situational awareness outputs. Other methods, systems, and computer program products are included according to other embodiments of the invention.

  2. Validation of a radiosonde-based cloud layer detection method against a ground-based remote sensing method at multiple ARM sites

    NASA Astrophysics Data System (ADS)

    Zhang, Jinqiang; Li, Zhanqing; Chen, Hongbin; Cribb, Maureen

    2013-01-01

    Cloud vertical structure is a key quantity in meteorological and climate studies, but it is also among the most difficult quantities to observe. In this study, we develop a long-term (10 years) radiosonde-based cloud profile product for the U.S. Department of Energy's Atmospheric Radiation Measurement (ARM) program Southern Great Plains (SGP), Tropical Western Pacific (TWP), and North Slope of Alaska (NSA) sites and a shorter-term product for the ARM Mobile Facility (AMF) deployed in Shouxian, Anhui Province, China (AMF-China). The AMF-China site was in operation from 14 May to 28 December 2008; the ARM sites have been collecting data for over 15 years. The Active Remote Sensing of Cloud (ARSCL) value-added product (VAP), which combines data from the 95-GHz W-band ARM Cloud Radar (WACR) and/or the 35-GHz Millimeter Microwave Cloud Radar (MMCR), is used in this study to validate the radiosonde-based cloud layer retrieval method. The performance of the radiosonde-based cloud layer retrieval method applied to data from different climate regimes is evaluated. Overall, cloud layers derived from the ARSCL VAP and radiosonde data agree very well at the SGP and AMF-China sites. At the TWP and NSA sites, the radiosonde tends to detect more cloud layers in the upper troposphere.
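    A minimal sketch of a relative-humidity-threshold cloud-layer search on a radiosonde profile is shown below; the RH threshold and the minimum layer thickness are illustrative assumptions, not the exact criteria used in the product described above.

        # Hedged sketch: flag layers where relative humidity stays above a threshold.
        import numpy as np

        def cloud_layers(height_m, rh_percent, rh_thresh=85.0, min_thickness_m=100.0):
            """Return (base, top) height pairs where RH stays above the threshold."""
            layers, base = [], None
            for z, rh in zip(height_m, rh_percent):
                if rh >= rh_thresh and base is None:
                    base = z
                elif rh < rh_thresh and base is not None:
                    if z - base >= min_thickness_m:
                        layers.append((base, z))
                    base = None
            if base is not None and height_m[-1] - base >= min_thickness_m:
                layers.append((base, height_m[-1]))
            return layers

        z = np.arange(0, 12000, 50.0)                       # hypothetical 50 m sounding levels
        rh = 60 + 30 * np.exp(-((z - 3000) / 800.0) ** 2)   # synthetic moist layer near 3 km
        print(cloud_layers(z, rh))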

  3. Content based Image Retrieval based on Different Global and Local Color Histogram Methods: A Survey

    NASA Astrophysics Data System (ADS)

    Suhasini, Pallikonda Sarah; Sri Rama Krishna, K.; Murali Krishna, I. V.

    2017-02-01

    Different global and local color histogram methods for content based image retrieval (CBIR) are investigated in this paper. The color histogram is a widely used descriptor for CBIR. The conventional method of extracting a color histogram is global, which misses the spatial content, is less invariant to deformation and viewpoint changes, and results in a very large three-dimensional histogram corresponding to the color space used. To address the above deficiencies, different global and local histogram methods have been proposed in recent research. Different ways of extracting local histograms to obtain spatial correspondence, invariant color histograms to add deformation and viewpoint invariance, and fuzzy linking methods to reduce the size of the histogram are found in recent papers. The color space and the distance metric used are vital in obtaining the color histogram. In this paper, the performance of CBIR based on different global and local color histograms in three different color spaces, namely RGB, HSV, and L*a*b*, and with three distance measures, Euclidean, quadratic, and histogram intersection, is surveyed, to choose an appropriate method for future research.
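    For reference, the three distance measures compared in the survey can be sketched as follows for normalized color histograms; the 3-bin histograms and the similarity matrix A of the quadratic-form distance are toy examples.

        # Euclidean, quadratic-form, and histogram-intersection measures on toy histograms.
        import numpy as np

        def euclidean(h1, h2):
            return float(np.sqrt(np.sum((h1 - h2) ** 2)))

        def quadratic_form(h1, h2, A):
            d = h1 - h2
            return float(np.sqrt(d @ A @ d))

        def histogram_intersection(h1, h2):
            # Similarity in [0, 1]; larger means more similar.
            return float(np.sum(np.minimum(h1, h2)))

        h1 = np.array([0.2, 0.5, 0.3])      # toy 3-bin normalized histograms
        h2 = np.array([0.1, 0.6, 0.3])
        A = np.eye(3)                       # with A = I the quadratic form reduces to Euclidean
        print(euclidean(h1, h2), quadratic_form(h1, h2, A), histogram_intersection(h1, h2))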

  4. Inverse solutions for electrical impedance tomography based on conjugate gradients methods

    NASA Astrophysics Data System (ADS)

    Wang, M.

    2002-01-01

    A multistep inverse solution for two-dimensional electric field distribution is developed to deal with the nonlinear inverse problem of electric field distribution in relation to its boundary condition and the problem of divergence due to errors introduced by the ill-conditioned sensitivity matrix and the noise produced by electrode modelling and instruments. This solution is based on a normalized linear approximation method where the change in mutual impedance is derived from the sensitivity theorem and a method of error vector decomposition. This paper presents an algebraic solution of the linear equations at each inverse step, using a generalized conjugate gradients method. Limiting the number of iterations in the generalized conjugate gradients method controls the artificial errors introduced by the assumption of linearity and the ill-conditioned sensitivity matrix. The solution of the nonlinear problem is approached using a multistep inversion. This paper also reviews the mathematical and physical definitions of the sensitivity back-projection algorithm based on the sensitivity theorem. Simulations and discussion based on the multistep algorithm, the sensitivity coefficient back-projection method and the Newton-Raphson method are given. Examples of imaging gas-liquid mixing and a human hand in brine are presented.
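    A minimal sketch of one linearized inversion step solved with conjugate gradients is given below; limiting the number of iterations plays the regularizing role discussed above. The sensitivity matrix and the measurement vector are random placeholders, not EIT data.

        # Conjugate gradients on the normal equations of a linearized step, S dSigma = dV.
        import numpy as np

        def cg_solve(A, b, n_iter=10):
            """Conjugate gradients for the symmetric positive (semi)definite system A x = b."""
            x = np.zeros_like(b)
            r = b - A @ x
            p = r.copy()
            for _ in range(n_iter):
                Ap = A @ p
                alpha = (r @ r) / (p @ Ap)
                x = x + alpha * p
                r_new = r - alpha * Ap
                beta = (r_new @ r_new) / (r @ r)
                p = r_new + beta * p
                r = r_new
            return x

        rng = np.random.default_rng(0)
        S = rng.normal(size=(104, 316))     # placeholder sensitivity matrix (measurements x pixels)
        dV = rng.normal(size=104)           # placeholder change in normalized mutual impedance
        dSigma = cg_solve(S.T @ S, S.T @ dV, n_iter=8)   # few iterations limit ill-conditioning effects
        print(dSigma.shape)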

  5. A quasiparticle-based multi-reference coupled-cluster method.

    PubMed

    Rolik, Zoltán; Kállay, Mihály

    2014-10-07

    The purpose of this paper is to introduce a quasiparticle-based multi-reference coupled-cluster (MRCC) approach. The quasiparticles are introduced via a unitary transformation which allows us to represent a complete active space reference function and other elements of an orthonormal multi-reference (MR) basis in a determinant-like form. The quasiparticle creation and annihilation operators satisfy the fermion anti-commutation relations. On the basis of these quasiparticles, a generalization of the normal-ordered operator products for the MR case can be introduced as an alternative to the approach of Mukherjee and Kutzelnigg [Recent Prog. Many-Body Theor. 4, 127 (1995); Mukherjee and Kutzelnigg, J. Chem. Phys. 107, 432 (1997)]. Based on the new normal ordering any quasiparticle-based theory can be formulated using the well-known diagram techniques. Beyond the general quasiparticle framework we also present a possible realization of the unitary transformation. The suggested transformation has an exponential form where the parameters, holding exclusively active indices, are defined in a form similar to the wave operator of the unitary coupled-cluster approach. The definition of our quasiparticle-based MRCC approach strictly follows the form of the single-reference coupled-cluster method and retains several of its beneficial properties. Test results for small systems are presented using a pilot implementation of the new approach and compared to those obtained by other MR methods.

  6. Hybrid Orientation Based Human Limbs Motion Tracking Method

    PubMed Central

    Glonek, Grzegorz; Wojciechowski, Adam

    2017-01-01

    One of the key technologies that lies behind human–machine interaction and human motion diagnosis is limb motion tracking. To make limb tracking efficient, it must be able to estimate a precise and unambiguous position of each tracked human joint and the resulting body part pose. In recent years, body pose estimation has become very popular and broadly available for home users because of easy access to cheap tracking devices. Their robustness can be improved by fusing data from different tracking modes. The paper defines a novel approach, orientation-based data fusion, instead of the position-based approach dominating in the literature, for two classes of tracking devices: depth sensors (i.e., Microsoft Kinect) and inertial measurement units (IMU). A detailed analysis of their working characteristics allowed a new method to be elaborated that fuses limb orientation data from both devices more precisely and compensates for their imprecision. The paper presents a series of experiments that verified the method's accuracy. This novel approach outperformed the precision of position-based joint tracking, the methods dominating in the literature, by up to 18%. PMID:29232832

  7. Validation of a Smartphone Image-Based Dietary Assessment Method for Pregnant Women

    PubMed Central

    Ashman, Amy M.; Collins, Clare E.; Brown, Leanne J.; Rae, Kym M.; Rollo, Megan E.

    2017-01-01

    Image-based dietary records could lower participant burden associated with traditional prospective methods of dietary assessment. They have been used in children, adolescents and adults, but have not been evaluated in pregnant women. The current study evaluated relative validity of the DietBytes image-based dietary assessment method for assessing energy and nutrient intakes. Pregnant women collected image-based dietary records (via a smartphone application) of all food, drinks and supplements consumed over three non-consecutive days. Intakes from the image-based method were compared to intakes collected from three 24-h recalls, taken on random days; once per week, in the weeks following the image-based record. Data were analyzed using nutrient analysis software. Agreement between methods was ascertained using Pearson correlations and Bland-Altman plots. Twenty-five women (27 recruited, one withdrew, one incomplete), median age 29 years, 15 primiparas, eight Aboriginal Australians, completed image-based records for analysis. Significant correlations between the two methods were observed for energy, macronutrients and fiber (r = 0.58–0.84, all p < 0.05), and for micronutrients both including (r = 0.47–0.94, all p < 0.05) and excluding (r = 0.40–0.85, all p < 0.05) supplements in the analysis. Bland-Altman plots confirmed acceptable agreement with no systematic bias. The DietBytes method demonstrated acceptable relative validity for assessment of nutrient intakes of pregnant women. PMID:28106758
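    The agreement statistics used above, Pearson correlation and Bland-Altman bias with 95% limits of agreement, can be computed as in the following sketch; the intake values are hypothetical.

        # Pearson correlation and Bland-Altman bias/limits of agreement on toy intake data.
        import numpy as np
        from scipy.stats import pearsonr

        image_based = np.array([8200., 9100., 7600., 8800., 9500., 7900.])   # e.g., energy (kJ)
        recall_24h = np.array([8000., 9400., 7300., 9100., 9200., 8100.])

        r, p = pearsonr(image_based, recall_24h)

        diff = image_based - recall_24h
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)      # 95% limits of agreement: bias +/- loa

        print(f"r = {r:.2f} (p = {p:.3f}); bias = {bias:.0f}, LoA = [{bias - loa:.0f}, {bias + loa:.0f}]")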

  8. A calibration method of infrared LVF based spectroradiometer

    NASA Astrophysics Data System (ADS)

    Liu, Jiaqing; Han, Shunli; Liu, Lei; Hu, Dexin

    2017-10-01

    In this paper, a calibration method for an LVF-based spectroradiometer is summarized, including spectral calibration and radiometric calibration. The spectral calibration process is as follows: first, the relationship between the stepping motor's step number and the transmission wavelength is derived by theoretical calculation, including a non-linearity correction of the LVF; second, a line-to-line method is used to correct the theoretical wavelength; finally, the 3.39 μm and 10.69 μm lasers are used for spectral calibration validation, showing that the sought accuracy of 0.1% or better is achieved. A new sub-region multi-point calibration method is used for radiometric calibration to improve accuracy; results show that the sought accuracy of 1% or better is achieved.
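    The spectral-calibration idea, a theoretical step-to-wavelength model corrected against reference laser lines, can be sketched as follows; the linear model coefficients and the reference step positions are illustrative assumptions, not instrument values.

        # Fit a low-order correction so the theoretical model passes through the reference lines.
        import numpy as np

        steps_ref = np.array([120, 840])         # motor steps at which the reference lasers peak (assumed)
        lambda_ref = np.array([3.39, 10.69])     # reference wavelengths in micrometres

        def theoretical_wavelength(step):
            return 2.5 + 0.0098 * step           # hypothetical theoretical step-to-wavelength model

        residual = lambda_ref - theoretical_wavelength(steps_ref)
        corr = np.polyfit(steps_ref, residual, 1)

        def calibrated_wavelength(step):
            return theoretical_wavelength(step) + np.polyval(corr, step)

        print(calibrated_wavelength(steps_ref))  # reproduces 3.39 and 10.69 um at the reference points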

  9. An Implicit Characteristic Based Method for Electromagnetics

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Briley, W. Roger

    2001-01-01

    An implicit characteristic-based approach for numerical solution of Maxwell's time-dependent curl equations in flux conservative form is introduced. This method combines a characteristic based finite difference spatial approximation with an implicit lower-upper approximate factorization (LU/AF) time integration scheme. This approach is advantageous for three-dimensional applications because the characteristic differencing enables a two-factor approximate factorization that retains its unconditional stability in three space dimensions, and it does not require solution of tridiagonal systems. Results are given both for a Fourier analysis of stability, damping and dispersion properties, and for one-dimensional model problems involving propagation and scattering for free space and dielectric materials using both uniform and nonuniform grids. The explicit Finite Difference Time Domain Method (FDTD) algorithm is used as a convenient reference algorithm for comparison. The one-dimensional results indicate that for low frequency problems on a highly resolved uniform or nonuniform grid, this LU/AF algorithm can produce accurate solutions at Courant numbers significantly greater than one, with a corresponding improvement in efficiency for simulating a given period of time. This approach appears promising for development of dispersion optimized LU/AF schemes for three dimensional applications.

  10. WordSeeker: concurrent bioinformatics software for discovering genome-wide patterns and word-based genomic signatures

    PubMed Central

    2010-01-01

    Background An important focus of genomic science is the discovery and characterization of all functional elements within genomes. In silico methods are used in genome studies to discover putative regulatory genomic elements (called words or motifs). Although a number of methods have been developed for motif discovery, most of them lack the scalability needed to analyze large genomic data sets. Methods This manuscript presents WordSeeker, an enumerative motif discovery toolkit that utilizes multi-core and distributed computational platforms to enable scalable analysis of genomic data. A controller task coordinates activities of worker nodes, each of which (1) enumerates a subset of the DNA word space and (2) scores words with a distributed Markov chain model. Results A comprehensive suite of performance tests was conducted to demonstrate the performance, speedup and efficiency of WordSeeker. The scalability of the toolkit enabled the analysis of the entire genome of Arabidopsis thaliana; the results of the analysis were integrated into The Arabidopsis Gene Regulatory Information Server (AGRIS). A public version of WordSeeker was deployed on the Glenn cluster at the Ohio Supercomputer Center. Conclusion WordSeeker effectively utilizes concurrent computing platforms to enable the identification of putative functional elements in genomic data sets. This capability facilitates the analysis of the large quantity of sequenced genomic data. PMID:21210985
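    The enumerate-and-score idea behind such motif discovery tools can be illustrated with a single-process toy sketch (not WordSeeker's distributed implementation): count all words of length k and compare each count with its expectation under a first-order Markov background model.

        # Toy k-mer enumeration with a first-order Markov background score.
        from collections import Counter
        from itertools import product

        def word_counts(seq, k):
            return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

        def markov1_expected(seq, word):
            """Approximate expected count of `word` under a first-order Markov background."""
            mono, di, n = Counter(seq), word_counts(seq, 2), len(seq)
            prob = mono[word[0]] / n
            for a, b in zip(word, word[1:]):
                prob *= di[a + b] / max(mono[a], 1)
            return prob * (n - len(word) + 1)

        seq = "ACGTACGTGACGTTACGTACG"     # hypothetical input sequence
        k = 3
        counts = word_counts(seq, k)
        scores = {w: counts[w] / max(markov1_expected(seq, w), 1e-9)
                  for w in map("".join, product("ACGT", repeat=k)) if counts[w] > 0}
        print(sorted(scores, key=scores.get, reverse=True)[:3])   # most over-represented words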

  11. Microbial detection method based on sensing molecular hydrogen

    NASA Technical Reports Server (NTRS)

    Wilkins, J. R.; Stoner, G. E.; Boykin, E. H.

    1974-01-01

    A simple method for detecting bacteria, based on the time of hydrogen evolution, was developed and tested against various members of the Enterobacteriaceae group. The test system consisted of (1) two electrodes, platinum and a reference electrode, (2) a buffer amplifier, and (3) a strip-chart recorder. Hydrogen evolution was measured by an increase in voltage in the negative (cathodic) direction. A linear relationship was established between inoculum size and the time hydrogen was detected (lag period). Lag times ranged from 1 h for 1 million cells/ml to 7 h for 1 cell/ml. For each 10-fold decrease in inoculum, length of the lag period increased 60 to 70 min. Based on the linear relationship between inoculum and lag period, these results indicate the potential application of the hydrogen-sensing method for rapidly detecting coliforms and other gas-producing microorganisms in a variety of clinical, food, and other samples.
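    The reported linear relationship, roughly 7 h of lag at 1 cell/ml falling by about an hour for each 10-fold increase in inoculum, can be expressed as a simple approximate formula:

        # Approximate lag-time relation implied by the reported results: lag(h) ~ 7 - log10(cells/ml).
        import math

        def estimated_lag_hours(cells_per_ml):
            return 7.0 - math.log10(cells_per_ml)

        for n in (1, 1e2, 1e4, 1e6):
            print(f"{n:>9.0f} cells/ml -> ~{estimated_lag_hours(n):.0f} h to hydrogen detection")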

  12. Wavelet-based adaptive thresholding method for image segmentation

    NASA Astrophysics Data System (ADS)

    Chen, Zikuan; Tao, Yang; Chen, Xin; Griffis, Carl

    2001-05-01

    A nonuniform background distribution may cause a global thresholding method to fail to segment objects. One solution is using a local thresholding method that adapts to local surroundings. In this paper, we propose a novel local thresholding method for image segmentation, using multiscale threshold functions obtained by wavelet synthesis with weighted detail coefficients. In particular, the coarse-to-fine synthesis with attenuated detail coefficients produces a threshold function corresponding to a high-frequency-reduced signal. This wavelet-based local thresholding method adapts to both local size and local surroundings, and its implementation can take advantage of the fast wavelet algorithm. We applied this technique to physical contaminant detection for poultry meat inspection using x-ray imaging. Experiments showed that inclusion objects in deboned poultry could be extracted at multiple resolutions despite their irregular sizes and uneven backgrounds.
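    A hedged sketch of the wavelet-synthesis thresholding idea follows: attenuate the detail coefficients, reconstruct a smooth background surface, and threshold the image against it. The wavelet, decomposition level, attenuation factor, and offset are illustrative choices, not the paper's settings.

        # Local thresholding against a wavelet-reconstructed background surface (PyWavelets).
        import numpy as np
        import pywt

        def wavelet_threshold_surface(image, wavelet="db2", level=3, attenuation=0.1):
            coeffs = pywt.wavedec2(image, wavelet, level=level)
            coeffs = [coeffs[0]] + [tuple(attenuation * d for d in details) for details in coeffs[1:]]
            surface = pywt.waverec2(coeffs, wavelet)
            return surface[:image.shape[0], :image.shape[1]]   # crop possible reconstruction padding

        rng = np.random.default_rng(1)
        img = rng.normal(100, 5, size=(128, 128))
        img[40:50, 60:70] += 40                                # bright inclusion on an uneven background
        segmented = img > wavelet_threshold_surface(img) + 15  # local threshold plus a fixed offset
        print(segmented.sum(), "pixels flagged")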

  13. Infrared dim small target segmentation method based on ALI-PCNN model

    NASA Astrophysics Data System (ADS)

    Zhao, Shangnan; Song, Yong; Zhao, Yufei; Li, Yun; Li, Xu; Jiang, Yurong; Li, Lin

    2017-10-01

    The Pulse Coupled Neural Network (PCNN) is improved by Adaptive Lateral Inhibition (ALI), and a method of infrared (IR) dim small target segmentation based on the ALI-PCNN model is proposed in this paper. Firstly, the feeding input signal is modulated by a lateral inhibition network to suppress the background. Then, the linking input is modulated by ALI, and the linking weight matrix is generated adaptively by calculating the ALI coefficient of each pixel. Finally, the binary image is generated through the nonlinear modulation and the pulse generator in the PCNN. The experimental results show that the segmentation effect, as well as the contrast-across-region and uniformity-across-region values, of the proposed method is better than those of the Otsu method, the maximum entropy method, and the methods based on conventional PCNN and visual attention, and the proposed method has excellent performance in extracting IR dim small targets from complex backgrounds.

  14. A sediment graph model based on SCS-CN method

    NASA Astrophysics Data System (ADS)

    Singh, P. K.; Bhunya, P. K.; Mishra, S. K.; Chaube, U. C.

    2008-01-01

    This paper proposes new conceptual sediment graph models based on coupling of popular and extensively used methods, viz., the Nash model based instantaneous unit sediment graph (IUSG), the soil conservation service curve number (SCS-CN) method, and the Power law. These models vary in their complexity and this paper tests their performance using data of the Nagwan watershed (area = 92.46 km²) (India). The sensitivity of total sediment yield and peak sediment flow rate computations to model parameterisation is analysed. The exponent of the Power law, β, is more sensitive than other model parameters. The models are found to have substantial potential for computing sediment graphs (temporal sediment flow rate distribution) as well as total sediment yield.
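    For reference, the SCS-CN rainfall-runoff relation that such sediment graph models build on can be written as a short function; the curve number and rainfall depth below are hypothetical.

        # SCS-CN direct runoff: S = 25400/CN - 254 (mm), Ia = 0.2*S, Q = (P-Ia)^2/(P-Ia+S) for P > Ia.
        def scs_cn_runoff(P_mm, CN, ia_ratio=0.2):
            S = 25400.0 / CN - 254.0     # potential maximum retention (mm)
            Ia = ia_ratio * S            # initial abstraction (mm)
            if P_mm <= Ia:
                return 0.0
            return (P_mm - Ia) ** 2 / (P_mm - Ia + S)

        print(f"{scs_cn_runoff(P_mm=75.0, CN=78):.1f} mm of direct runoff")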

  15. A novel dose-based positioning method for CT image-guided proton therapy

    PubMed Central

    Cheung, Joey P.; Park, Peter C.; Court, Laurence E.; Ronald Zhu, X.; Kudchadker, Rajat J.; Frank, Steven J.; Dong, Lei

    2013-01-01

    Purpose: Proton dose distributions can potentially be altered by anatomical changes in the beam path despite perfect target alignment using traditional image guidance methods. In this simulation study, the authors explored the use of dosimetric factors instead of only anatomy to set up patients for proton therapy using in-room volumetric computed tomographic (CT) images. Methods: To simulate patient anatomy in a free-breathing treatment condition, weekly time-averaged four-dimensional CT data near the end of treatment for 15 lung cancer patients were used in this study for a dose-based isocenter shift method to correct dosimetric deviations without replanning. The isocenter shift was obtained using the traditional anatomy-based image guidance method as the starting position. Subsequent isocenter shifts were established based on dosimetric criteria using a fast dose approximation method. For each isocenter shift, doses were calculated every 2 mm up to ±8 mm in each direction. The optimal dose alignment was obtained by imposing a target coverage constraint that at least 99% of the target would receive at least 95% of the prescribed dose and by minimizing the mean dose to the ipsilateral lung. Results: The authors found that 7 of 15 plans did not meet the target coverage constraint when using only the anatomy-based alignment. After the authors applied dose-based alignment, all met the target coverage constraint. For all but one case in which the target dose was met using both anatomy-based and dose-based alignment, the latter method was able to improve normal tissue sparing. Conclusions: The authors demonstrated that a dose-based adjustment to the isocenter can improve target coverage and/or reduce dose to nearby normal tissue. PMID:23635262
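    The dose-based alignment search can be sketched schematically as a grid search over isocenter shifts with a coverage constraint and a lung-dose objective; dose_for_shift() is a placeholder for the fast approximate dose recalculation, and all numbers are hypothetical.

        # Grid search over +/-8 mm shifts in 2 mm steps; keep shifts meeting the coverage
        # constraint (>=99% of target voxels receiving >=95% of prescription) and pick the
        # one with the lowest mean ipsilateral lung dose.
        import itertools
        import numpy as np

        def dose_for_shift(shift_mm):
            """Placeholder: return (target_dose_voxels, lung_dose_voxels) in Gy for a shift."""
            rng = np.random.default_rng(abs(hash(shift_mm)) % (2 ** 32))
            return rng.normal(62, 2, 2000), rng.normal(8, 3, 5000)

        def select_isocenter_shift(prescription_gy=60.0):
            best_shift, best_lung_mean = None, np.inf
            for shift in itertools.product(range(-8, 9, 2), repeat=3):
                target, lung = dose_for_shift(shift)
                coverage = np.mean(target >= 0.95 * prescription_gy)
                if coverage >= 0.99 and lung.mean() < best_lung_mean:
                    best_shift, best_lung_mean = shift, lung.mean()
            return best_shift, best_lung_mean

        print(select_isocenter_shift())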

  16. System and method for deriving a process-based specification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael Gerard (Inventor); Rouff, Christopher A. (Inventor); Rash, James Larry (Inventor)

    2009-01-01

    A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.

  17. Weaving a Formal Methods Education with Problem-Based Learning

    NASA Astrophysics Data System (ADS)

    Gibson, J. Paul

    The idea of weaving formal methods through computing (or software engineering) degrees is not a new one. However, there has been little success in developing and implementing such a curriculum. Formal methods continue to be taught as stand-alone modules and students, in general, fail to see how fundamental these methods are to the engineering of software. A major problem is one of motivation — how can the students be expected to enthusiastically embrace a challenging subject when the learning benefits, beyond passing an exam and achieving curriculum credits, are not clear? Problem-based learning has gradually moved from being an innovative pedagogical technique, commonly used to better motivate students, to being widely adopted in the teaching of many different disciplines, including computer science and software engineering. Our experience shows that a good problem can be re-used throughout a student's academic life. In fact, the best computing problems can be used with children (young and old), undergraduates and postgraduates. In this paper we present a process for weaving formal methods through a University curriculum that is founded on the application of problem-based learning and a library of good software engineering problems, where students learn about formal methods without sitting a traditional formal methods module. The process of constructing good problems and integrating them into the curriculum is shown to be analogous to the process of engineering software. This approach is not intended to replace more traditional formal methods modules: it will better prepare students for such specialised modules and ensure that all students have an understanding and appreciation for formal methods even if they do not go on to specialise in them.

  18. Active treatments for amblyopia: a review of the methods and evidence base.

    PubMed

    Suttle, Catherine M

    2010-09-01

    Treatment for amblyopia commonly involves passive methods such as occlusion of the non-amblyopic eye. An evidence base for these methods is provided by animal models of visual deprivation and plasticity in early life and randomised controlled studies in humans with amblyopia. Other treatments of amblyopia, intended to be used instead of or in conjunction with passive methods, are known as 'active' because they require some activity on the part of the patient. Active methods are intended to enhance treatment of amblyopia in a number of ways, including increased compliance and attention during the treatment periods (due to activities that are interesting for the patient) and the use of stimuli designed to activate and to encourage connectivity between certain cortical cell types. Active methods of amblyopia treatment are widely available and are discussed to some extent in the literature, but in many cases the evidence base is unclear, and effectiveness has not been thoroughly tested. This review looks at the techniques and evidence base for a range of these methods and discusses the need for an evidence-based approach to the acceptance and use of active amblyopia treatments.

  19. A fast button surface defects detection method based on convolutional neural network

    NASA Astrophysics Data System (ADS)

    Liu, Lizhe; Cao, Danhua; Wu, Songlin; Wu, Yubin; Wei, Taoran

    2018-01-01

    Considering the complexity of button surface textures and the variety of buttons and defects, we propose a fast visual method for button surface defect detection based on a convolutional neural network (CNN). A CNN has the ability to extract the essential features by training, avoiding the design of complex feature operators adapted to different kinds of buttons, textures, and defects. Firstly, we obtain the normalized button region and then use an HOG-SVM method to identify the front and back sides of the button. Finally, a convolutional neural network is developed to recognize the defects. Aiming at detecting subtle defects, we propose a network structure with multiple feature channels as input. To deal with defects of different scales, we adopt a strategy of multi-scale image block detection. The experimental results show that our method is valid for a variety of buttons and able to recognize all kinds of defects that occurred, including dents, cracks, stains, holes, wrong paint, and unevenness. The detection rate exceeds 96%, which is much better than traditional methods based on SVM and on template matching. Our method can reach a speed of 5 fps on a DSP-based smart camera running at 600 MHz.

  20. Support vector machine-based facial-expression recognition method combining shape and appearance

    NASA Astrophysics Data System (ADS)

    Han, Eun Jung; Kang, Byung Jun; Park, Kang Ryoung; Lee, Sangyoun

    2010-11-01

    Facial expression recognition can be widely used for various applications, such as emotion-based human-machine interaction, intelligent robot interfaces, face recognition robust to expression variation, etc. Previous studies have been classified as either shape- or appearance-based recognition. The shape-based method has the disadvantage that the individual variance of facial feature points exists irrespective of similar expressions, which can cause a reduction of the recognition accuracy. The appearance-based method has a limitation in that the textural information of the face is very sensitive to variations in illumination. To overcome these problems, a new facial-expression recognition method is proposed, which combines both shape and appearance information, based on the support vector machine (SVM). This research is novel in the following three ways as compared to previous works. First, the facial feature points are automatically detected by using an active appearance model. From these, the shape-based recognition is performed by using the ratios between the facial feature points based on the facial-action coding system. Second, the SVM, which is trained to recognize the same and different expression classes, is proposed to combine two matching scores obtained from the shape- and appearance-based recognitions. Finally, a single SVM is trained to discriminate four different expressions, such as neutral, a smile, anger, and a scream. By determining the expression of the input facial image whose SVM output is at a minimum, the accuracy of the expression recognition is much enhanced. The experimental results showed that the recognition accuracy of the proposed method was better than previous researches and other fusion methods.

  1. A multiple-point spatially weighted k-NN method for object-based classification

    NASA Astrophysics Data System (ADS)

    Tang, Yunwei; Jing, Linhai; Li, Hui; Atkinson, Peter M.

    2016-10-01

    Object-based classification, commonly referred to as object-based image analysis (OBIA), is now commonly regarded as able to produce more appealing classification maps, often of greater accuracy, than pixel-based classification and its application is now widespread. Therefore, improvement of OBIA using spatial techniques is of great interest. In this paper, multiple-point statistics (MPS) is proposed for object-based classification enhancement in the form of a new multiple-point k-nearest neighbour (k-NN) classification method (MPk-NN). The proposed method first utilises a training image derived from a pre-classified map to characterise the spatial correlation between multiple points of land cover classes. The MPS borrows spatial structures from other parts of the training image, and then incorporates this spatial information, in the form of multiple-point probabilities, into the k-NN classifier. Two satellite sensor images with a fine spatial resolution were selected to evaluate the new method. One is an IKONOS image of the Beijing urban area and the other is a WorldView-2 image of the Wolong mountainous area, in China. The images were object-based classified using the MPk-NN method and several alternatives, including the k-NN, the geostatistically weighted k-NN, the Bayesian method, the decision tree classifier (DTC), and the support vector machine classifier (SVM). It was demonstrated that the new spatial weighting based on MPS can achieve greater classification accuracy relative to the alternatives and it is, thus, recommended as appropriate for object-based classification.

  2. Couple Graph Based Label Propagation Method for Hyperspectral Remote Sensing Data Classification

    NASA Astrophysics Data System (ADS)

    Wang, X. P.; Hu, Y.; Chen, J.

    2018-04-01

    Graph based semi-supervised classification methods are widely used for hyperspectral image classification. We present a couple graph based label propagation method, which contains both an adjacency graph and a similar graph. We propose to construct the similar graph by using the similarity probability, which utilizes the label similarity among examples. The adjacency graph is utilized by a common manifold learning method, which has effectively improved the classification accuracy of hyperspectral data. The experiments indicate that the couple graph Laplacian, which unites both the adjacency graph and the similar graph, produces superior classification results compared with other manifold learning based and sparse representation based graph Laplacians in the label propagation framework.
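    A hedged sketch of label propagation on a combined graph is given below: an adjacency graph built from spectral distances and a placeholder "similar probability" graph are mixed with equal weight, and labels are then propagated iteratively. The mixing weight and graph construction are illustrative, not the paper's exact scheme.

        # Label propagation F <- alpha*S*F + (1-alpha)*Y on a coupled affinity matrix.
        import numpy as np

        def propagate_labels(W, Y, alpha=0.9, n_iter=50):
            """W: combined affinity matrix, Y: one-hot labels (zero rows for unlabeled samples)."""
            d = W.sum(axis=1)
            S = W / np.sqrt(np.outer(d, d))              # symmetric normalization D^-1/2 W D^-1/2
            F = Y.astype(float).copy()
            for _ in range(n_iter):
                F = alpha * S @ F + (1 - alpha) * Y
            return F.argmax(axis=1)

        rng = np.random.default_rng(0)
        X = rng.normal(size=(6, 4))                      # 6 pixels, 4 spectral bands (toy data)
        dist = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
        W_adj = np.exp(-dist ** 2)                       # adjacency graph from spectral distance
        W_sim = (rng.random((6, 6)) > 0.5).astype(float) # placeholder "similar probability" graph
        W_sim = (W_sim + W_sim.T) / 2
        W = 0.5 * W_adj + 0.5 * W_sim                    # coupled graph

        Y = np.zeros((6, 2)); Y[0, 0] = 1; Y[5, 1] = 1   # two labeled pixels, two classes
        print(propagate_labels(W, Y))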

  3. [A retrieval method of drug molecules based on graph collapsing].

    PubMed

    Qu, J W; Lv, X Q; Liu, Z M; Liao, Y; Sun, P H; Wang, B; Tang, Z

    2018-04-18

    To establish a compact and efficient hypergraph representation and a graph-similarity-based retrieval method of molecules to achieve effective and efficient medicine information retrieval. Chemical structural formula (CSF) was a primary search target as a unique and precise identifier for each compound at the molecular level in the research field of medicine information retrieval. To retrieve medicine information effectively and efficiently, a complete workflow of the graph-based CSF retrieval system was introduced. This system accepted the photos taken from smartphones and the sketches drawn on tablet personal computers as CSF inputs, and formalized the CSFs with the corresponding graphs. Then this paper proposed a compact and efficient hypergraph representation for molecules on the basis of analyzing factors that directly affected the efficiency of graph matching. According to the characteristics of CSFs, a hierarchical collapsing method combining graph isomorphism and frequent subgraph mining was adopted. There was yet a fundamental challenge, subgraph overlapping during the collapsing procedure, which hindered the method from establishing the correct compact hypergraph of an original CSF graph. Therefore, a graph-isomorphism-based algorithm was proposed to select dominant acyclic subgraphs on the basis of overlapping analysis. Finally, the spatial similarity among graphical CSFs was evaluated by multi-dimensional measures of similarity. To evaluate the performance of the proposed method, the proposed system was firstly compared with Wikipedia Chemical Structure Explorer (WCSE), the state-of-the-art system that allowed CSF similarity searching within Wikipedia molecules dataset, on retrieval accuracy. The system achieved higher values on mean average precision, discounted cumulative gain, rank-biased precision, and expected reciprocal rank than WCSE from the top-2 to the top-10 retrieved results. Specifically, the system achieved 10%, 1.41, 6.42%, and 1

  4. A combined emitter threat assessment method based on ICW-RCM

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Wang, Hongwei; Guo, Xiaotao; Wang, Yubing

    2017-08-01

    Considering that traditional emitter threat assessment methods have difficulty in intuitively reflecting the degree of target threat and suffer from deficiencies in real-time performance and complexity, an algorithm for combined emitter threat assessment based on ICW-RCM (improved combination weighting method, ICW; radar chart method, RCM) is proposed. Coarse sorting is integrated with fine sorting in the combined threat assessment: the emitter threat level is first ranked roughly according to the radar operation mode, reducing the task priority of low-threat emitters; then, on the basis of ICW-RCM, emitters with the same radar operation mode are ranked finely, and the emitter threat assessment results are obtained through the coarse and fine sorting. Simulation analyses show the correctness and effectiveness of this algorithm. Compared with the classical CW-RCM based emitter threat assessment method, the algorithm is more intuitive and works quickly with lower complexity.

  5. Practical quantum mechanics-based fragment methods for predicting molecular crystal properties.

    PubMed

    Wen, Shuhao; Nanda, Kaushik; Huang, Yuanhang; Beran, Gregory J O

    2012-06-07

    Significant advances in fragment-based electronic structure methods have created a real alternative to force-field and density functional techniques in condensed-phase problems such as molecular crystals. This perspective article highlights some of the important challenges in modeling molecular crystals and discusses techniques for addressing them. First, we survey recent developments in fragment-based methods for molecular crystals. Second, we use examples from our own recent research on a fragment-based QM/MM method, the hybrid many-body interaction (HMBI) model, to analyze the physical requirements for a practical and effective molecular crystal model chemistry. We demonstrate that it is possible to predict molecular crystal lattice energies to within a couple of kJ mol⁻¹ and lattice parameters to within a few percent in small-molecule crystals. Fragment methods provide a systematically improvable approach to making predictions in the condensed phase, which is critical to making robust predictions regarding the subtle energy differences found in molecular crystals.

  6. Transistor-based particle detection systems and methods

    DOEpatents

    Jain, Ankit; Nair, Pradeep R.; Alam, Muhammad Ashraful

    2015-06-09

    Transistor-based particle detection systems and methods may be configured to detect charged and non-charged particles. Such systems may include a supporting structure contacting a gate of a transistor and separating the gate from a dielectric of the transistor, and the transistor may have a near pull-in bias and a sub-threshold region bias to facilitate particle detection. The transistor may be configured to change current flow through the transistor in response to a change in stiffness of the gate caused by securing of a particle to the gate, and the transistor-based particle detection system may be configured to detect the non-charged particle at least from the change in current flow.

  7. Comparing team-based and mixed active-learning methods in an ambulatory care elective course.

    PubMed

    Zingone, Michelle M; Franks, Andrea S; Guirguis, Alexander B; George, Christa M; Howard-Thompson, Amanda; Heidel, Robert E

    2010-11-10

    To assess students' performance and perceptions of team-based and mixed active-learning methods in 2 ambulatory care elective courses, and to describe faculty members' perceptions of team-based learning. Using the 2 teaching methods, students' grades were compared. Students' perceptions were assessed through 2 anonymous course evaluation instruments. Faculty members who taught courses using the team-based learning method were surveyed regarding their impressions of team-based learning. The ambulatory care course was offered to 64 students using team-based learning (n = 37) and mixed active learning (n = 27) formats. The mean quality points earned were 3.7 (team-based learning) and 3.3 (mixed active learning), p < 0.001. Course evaluations for both courses were favorable. All faculty members who used the team-based learning method reported that they would consider using team-based learning in another course. Students were satisfied with both teaching methods; however, student grades were significantly higher in the team-based learning course. Faculty members recognized team-based learning as an effective teaching strategy for small-group active learning.

  8. Two-Way Gene Interaction From Microarray Data Based on Correlation Methods

    PubMed Central

    Alavi Majd, Hamid; Talebi, Atefeh; Gilany, Kambiz; Khayyer, Nasibeh

    2016-01-01

    Background Gene networks have generated a massive explosion in the development of high-throughput techniques for monitoring various aspects of gene activity. Networks offer a natural way to model interactions between genes, and extracting gene network information from high-throughput genomic data is an important and difficult task. Objectives The purpose of this study is to construct a two-way gene network based on parametric and nonparametric correlation coefficients. The first step in constructing a Gene Co-expression Network is to score all pairs of gene vectors. The second step is to select a score threshold and connect all gene pairs whose scores exceed this value. Materials and Methods In the foundation-application study, we constructed two-way gene networks using nonparametric methods, such as Spearman’s rank correlation coefficient and Blomqvist’s measure, and compared them with Pearson’s correlation coefficient. We surveyed six genes of venous thrombosis disease, made a matrix entry representing the score for the corresponding gene pair, and obtained two-way interactions using Pearson’s correlation, Spearman’s rank correlation, and Blomqvist’s coefficient. Finally, these methods were compared with Cytoscape, based on BIND, and Gene Ontology, based on molecular function visual methods; R software version 3.2 and Bioconductor were used to perform these methods. Results Based on the Pearson and Spearman correlations, the results were the same and were confirmed by the Cytoscape and GO visual methods; however, Blomqvist’s coefficient was not confirmed by the visual methods. Conclusions Some results of the correlation coefficients are not consistent with the visualization results. The reason may be the small amount of data. PMID:27621916
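    The pair-scoring and thresholding steps described above can be sketched as follows; the expression matrix and the correlation threshold are hypothetical.

        # Score all gene pairs with Pearson or Spearman correlation and keep edges above a threshold.
        import numpy as np
        from scipy.stats import pearsonr, spearmanr

        rng = np.random.default_rng(2)
        expr = rng.normal(size=(6, 20))          # 6 genes x 20 samples (toy data)
        genes = [f"G{i}" for i in range(6)]

        def correlation_edges(expr, method="pearson", threshold=0.7):
            edges = []
            for i in range(len(expr)):
                for j in range(i + 1, len(expr)):
                    corr = pearsonr if method == "pearson" else spearmanr
                    r, _ = corr(expr[i], expr[j])
                    if abs(r) >= threshold:
                        edges.append((genes[i], genes[j], round(float(r), 2)))
            return edges

        print(correlation_edges(expr, method="spearman", threshold=0.5))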

  9. Object-based change detection method using refined Markov random field

    NASA Astrophysics Data System (ADS)

    Peng, Daifeng; Zhang, Yongjun

    2017-01-01

    In order to fully consider the local spatial constraints between neighboring objects in object-based change detection (OBCD), an OBCD approach is presented by introducing a refined Markov random field (MRF). First, two periods of images are stacked and segmented to produce image objects. Second, object spectral and textural histogram features are extracted and the G-statistic is implemented to measure the distance among different histogram distributions. Meanwhile, object heterogeneity is calculated by combining spectral and textural histogram distances using an adaptive weight. Third, an expectation-maximization algorithm is applied for determining the change category of each object and the initial change map is then generated. Finally, a refined change map is produced by employing the proposed refined object-based MRF method. Three experiments were conducted and compared with some state-of-the-art unsupervised OBCD methods to evaluate the effectiveness of the proposed method. Experimental results demonstrate that the proposed method obtains the highest accuracy among the methods used in this paper, which confirms its validity and effectiveness in OBCD.

  10. Metaphoric Investigation of the Phonic-Based Sentence Method

    ERIC Educational Resources Information Center

    Dogan, Birsen

    2012-01-01

    This study aimed to understand the views of prospective teachers with "phonic-based sentence method" through metaphoric images. In this descriptive study, the participants involve the prospective teachers who take reading-writing instruction courses in Primary School Classroom Teaching Program of the Education Faculty of Pamukkale…

  11. Towards robust and repeatable sampling methods in eDNA based studies.

    PubMed

    Dickie, Ian A; Boyer, Stephane; Buckley, Hannah; Duncan, Richard P; Gardner, Paul; Hogg, Ian D; Holdaway, Robert J; Lear, Gavin; Makiola, Andreas; Morales, Sergio E; Powell, Jeff R; Weaver, Louise

    2018-05-26

    DNA based techniques are increasingly used for measuring the biodiversity (species presence, identity, abundance and community composition) of terrestrial and aquatic ecosystems. While there are numerous reviews of molecular methods and bioinformatic steps, there has been little consideration of the methods used to collect samples upon which these later steps are based. This represents a critical knowledge gap, as methodologically sound field sampling is the foundation for subsequent analyses. We reviewed field sampling methods used for metabarcoding studies of both terrestrial and freshwater ecosystem biodiversity over a nearly three-year period (n = 75). We found that 95% (n = 71) of these studies used subjective sampling methods, inappropriate field methods, and/or failed to provide critical methodological information. It would be possible for researchers to replicate only 5% of the metabarcoding studies in our sample, a poorer level of reproducibility than for ecological studies in general. Our findings suggest greater attention to field sampling methods and reporting is necessary in eDNA-based studies of biodiversity to ensure robust outcomes and future reproducibility. Methods must be fully and accurately reported, and protocols developed that minimise subjectivity. Standardisation of sampling protocols would be one way to help to improve reproducibility, and have additional benefits in allowing compilation and comparison of data from across studies. This article is protected by copyright. All rights reserved.

  12. Attribute-Based Methods

    Treesearch

    Thomas P. Holmes; Wiktor L. Adamowicz

    2003-01-01

    Stated preference methods of environmental valuation have been used by economists for decades where behavioral data have limitations. The contingent valuation method (Chapter 5) is the oldest stated preference approach, and hundreds of contingent valuation studies have been conducted. More recently, and especially over the last decade, a class of stated preference...

  13. Usability Evaluation Methods for Gesture-Based Games: A Systematic Review.

    PubMed

    Simor, Fernando Winckler; Brum, Manoela Rogofski; Schmidt, Jaison Dairon Ebertz; Rieder, Rafael; De Marchi, Ana Carolina Bertoletti

    2016-10-04

    Gestural interaction systems are increasingly being used, mainly in games, expanding the idea of entertainment and providing experiences with the purpose of promoting better physical and/or mental health. Therefore, it is necessary to establish mechanisms for evaluating the usability of these interfaces, which make gestures the basis of interaction, to achieve a balance between functionality and ease of use. This study aims to present the results of a systematic review focused on usability evaluation methods for gesture-based games, considering devices with motion-sensing capability. We considered the usability methods used, the common interface issues, and the strategies adopted to build good gesture-based games. The research was centered on four electronic databases: IEEE, Association for Computing Machinery (ACM), Springer, and Science Direct, from September 4 to 21, 2015. Of the 1427 studies evaluated, 10 matched the eligibility criteria. As a requirement, we considered studies about gesture-based games, Kinect and/or Wii as devices, and the use of a usability method to evaluate the user interface. In the 10 studies found, there was no standardization in the methods because they considered diverse analysis variables. Heterogeneously, authors used different instruments to evaluate gesture-based interfaces and no default approach was proposed. Questionnaires were the most used instruments (70%, 7/10), followed by interviews (30%, 3/10), and observation and video recording (20%, 2/10). Moreover, 60% (6/10) of the studies used gesture-based serious games to evaluate the performance of elderly participants in rehabilitation tasks. This highlights the need for creating an evaluation protocol for older adults to provide a user-friendly interface according to the user's age and limitations. Through this study, we conclude this field is in need of a usability evaluation method for serious games, especially games for older adults, and that the definition of a methodology

  14. SB certification handout material requirements, test methods, responsibilities, and minimum classification levels for mixture-based specification for flexible base.

    DOT National Transportation Integrated Search

    2012-10-01

    A handout with tables representing the material requirements, test methods, responsibilities, and minimum classification levels for the mixture-based specification for flexible base, and details on the aggregate and test methods employed, along with agency and co...

  15. Scene-based method for spatial misregistration detection in hyperspectral imagery.

    PubMed

    Dell'Endice, Francesco; Nieke, Jens; Schläpfer, Daniel; Itten, Klaus I

    2007-05-20

    Hyperspectral imaging (HSI) sensors suffer from spatial misregistration, an artifact that prevents the accurate acquisition of the spectra. Physical considerations let us assume that the influence of the spatial misregistration on the acquired data depends both on the wavelength and on the across-track position. A scene-based method, based on edge detection, is therefore proposed. Such a procedure measures the variation on the spatial location of an edge between its various monochromatic projections, giving an estimation for spatial misregistration, and also allowing identification of misalignments. The method has been applied to several hyperspectral sensors, either prism, or grating-based designs. The results confirm the dependence assumptions on lambda and theta, spectral wavelength and across-track pixel, respectively. Suggestions are also given to correct for spatial misregistration.

  16. Summary of water body extraction methods based on ZY-3 satellite

    NASA Astrophysics Data System (ADS)

    Zhu, Yu; Sun, Li Jian; Zhang, Chuan Yin

    2017-12-01

    Extracting water bodies from remote sensing images is one of the main means of water information extraction. Affected by spectral characteristics, many methods cannot be applied to ZY-3 satellite images. To solve this problem, we summarize the extraction methods applicable to ZY-3 and analyze the extraction results of existing methods. According to the characteristics of the extraction results, the method combining a water index (WI) with a single-band threshold and the method of texture filtering based on probability statistics are explored. In addition, the advantages and disadvantages of all methods are compared, which provides some reference for research on water extraction from images. The obtained conclusions are as follows. 1) NIR has higher water sensitivity; consequently, when the surface reflectance in the study area is less similar to water, using a single-band threshold method or multi-band operations can obtain the ideal effect. 2) Compared with the water index and HIS optimal index method, the object extraction method based on rules, which takes into account not only the spectral information of the water but also space and texture feature constraints, can obtain a better extraction effect, yet the image segmentation process is time consuming and the definition of the rules requires certain knowledge. 3) The combination of spectral relationships and a water index can eliminate the interference of shadows to a certain extent. When there is little small water, or small water bodies are not considered in further study, texture filtering based on probability statistics can effectively reduce the noise in the result and avoid mixing shadows or paddy fields with water to a certain extent.
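    The water-index-plus-threshold idea can be sketched with the NDWI form (green - NIR)/(green + NIR); the threshold and the toy reflectance values below are illustrative and would need tuning for real ZY-3 imagery.

        # NDWI-style water mask with a fixed threshold (illustrative values only).
        import numpy as np

        def ndwi_water_mask(green, nir, threshold=0.1):
            ndwi = (green - nir) / np.maximum(green + nir, 1e-6)
            return ndwi > threshold

        green = np.array([[0.10, 0.12], [0.30, 0.08]])   # toy green-band surface reflectance
        nir = np.array([[0.30, 0.28], [0.05, 0.25]])     # toy near-infrared reflectance
        print(ndwi_water_mask(green, nir))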

  17. Secure Method for Biometric-Based Recognition with Integrated Cryptographic Functions

    PubMed Central

    Chiou, Shin-Yan

    2013-01-01

    Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, the ratio for certification in biometric systems needs not to achieve 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied. PMID:23762851

  18. Secure method for biometric-based recognition with integrated cryptographic functions.

    PubMed

    Chiou, Shin-Yan

    2013-01-01

    Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, the certification ratio in biometric systems need not reach 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied.

  19. Environmental services coupled to food products and brands: food companies interests and on-farm accounting.

    PubMed

    Kempa, Daniela

    2013-09-01

    Much research has been carried out on governmental support of agri-environmental measures (AEM). However, little is known about demands on and incentives from the commercial market for environmental contributions of farmers. The factors farm structures, level of remuneration and legal framework have been thoroughly investigated. However, demands of the food industry for environmentally friendly goods and their effects on farmers' decisions have not yet been analyzed. Leading companies in the food industry have observed an increasing consumer awareness and, due to higher competition, see an additional need to communicate environmental benefits which result from either organic production methods or agri-environmental measures. To address this research deficit, two case studies were carried out. The first case study is a survey aimed at the industrial food producers' demands with regard to the environmental performance of supplying farms. Concurrently, within a second survey farmers were questioned to find out what conditions are required to implement agri-environmental measures beyond cross compliance and to document their environmental performance. This article presents the outcomes of the first case study. The results show that food companies have an interest in the documentation of environmental benefits of supplying farms for their marketing strategies. Provision of support through finance or contract design is also seen as an appropriate tool to promote environmentally friendly production. In turn, the food producers' demand and support for documented environmental services can have a positive influence on farmers' decisions to implement and document these services. Thus, the surveys provide essential findings for further development of documentation strategies for environmental benefits within the supply chain. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Research on Knowledge-Based Optimization Method of Indoor Location Based on Low Energy Bluetooth

    NASA Astrophysics Data System (ADS)

    Li, C.; Li, G.; Deng, Y.; Wang, T.; Kang, Z.

    2017-09-01

    With the rapid development of LBS (Location-based Service), the demand for commercialization of indoor location has been increasing, but the technology is not yet mature. Currently, the accuracy of indoor location, the complexity of the algorithm, and the cost of positioning are hard to balance simultaneously, which still restricts the selection and application of a mainstream positioning technology. Therefore, this paper proposes a knowledge-based optimization method for indoor location based on low energy Bluetooth. The main steps include: 1) establishment and application of a priori and a posteriori knowledge bases; 2) primary selection of signal sources; 3) elimination of positioning gross errors; 4) accumulation of positioning knowledge. The experimental results show that the proposed algorithm can eliminate outlier signal sources and improve the accuracy of single point positioning on the simulation data. The proposed scheme is a dynamic knowledge accumulation process rather than a single positioning step. The scheme uses inexpensive equipment and provides a new idea for the theory and method of indoor positioning. Moreover, the high-accuracy positioning results on the simulation data show that the scheme has a certain value for commercial application.
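
    The gross-error elimination step can be illustrated with a small sketch: beacon RSSI readings are converted to rough distances with a log-distance path-loss model, and readings that are robust outliers are discarded before positioning. The function names, path-loss parameters, and MAD-based screening rule are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path-loss model; tx_power and n are illustrative parameters."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def filter_gross_errors(rssi_readings, z_thresh=3.0):
    """Discard beacon readings whose estimated distance is a robust outlier
    (median-absolute-deviation screening), a stand-in for the paper's
    knowledge-based gross-error elimination step."""
    d = np.array([rssi_to_distance(r) for r in rssi_readings])
    med = np.median(d)
    mad = np.median(np.abs(d - med)) or 1e-9
    robust_z = 0.6745 * (d - med) / mad
    keep = np.abs(robust_z) < z_thresh
    return d[keep], keep

if __name__ == "__main__":
    readings = [-62, -65, -60, -95, -63]   # one implausibly weak reading
    distances, mask = filter_gross_errors(readings)
    print("kept distances (m):", np.round(distances, 2), "mask:", mask)
```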

  1. Two-Way Gene Interaction From Microarray Data Based on Correlation Methods.

    PubMed

    Alavi Majd, Hamid; Talebi, Atefeh; Gilany, Kambiz; Khayyer, Nasibeh

    2016-06-01

    Gene networks have generated a massive explosion in the development of high-throughput techniques for monitoring various aspects of gene activity. Networks offer a natural way to model interactions between genes, and extracting gene network information from high-throughput genomic data is an important and difficult task. The purpose of this study is to construct a two-way gene network based on parametric and nonparametric correlation coefficients. The first step in constructing a gene co-expression network is to score all pairs of gene vectors. The second step is to select a score threshold and connect all gene pairs whose scores exceed this value. In this study, we constructed two-way gene networks using nonparametric measures, such as Spearman's rank correlation coefficient and Blomqvist's measure, and compared them with Pearson's correlation coefficient. We surveyed six genes associated with venous thrombosis, built a matrix whose entries represent the score for each gene pair, and obtained two-way interactions using Pearson's correlation, Spearman's rank correlation, and Blomqvist's coefficient. Finally, these methods were compared with visual methods: Cytoscape, based on BIND, and Gene Ontology, based on molecular function; R software version 3.2 and Bioconductor were used to perform these analyses. The results based on the Pearson and Spearman correlations were the same and were confirmed by the Cytoscape and GO visual methods; however, Blomqvist's coefficient was not confirmed by the visual methods. Some of the correlation results do not agree with the visualization, possibly because of the small amount of data.
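
    As a minimal sketch of the pair-scoring and thresholding steps described above, the snippet below scores every gene pair with a Pearson or Spearman coefficient and connects pairs whose absolute score exceeds a chosen cutoff. The expression matrix, the default threshold of 0.8, and the function names are illustrative assumptions.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

def correlation_network(expr, method="pearson", threshold=0.8):
    """Build an adjacency matrix by scoring every gene pair with a correlation
    coefficient and connecting pairs whose |score| exceeds the threshold.
    `expr` is a genes x samples matrix."""
    n_genes = expr.shape[0]
    adj = np.zeros((n_genes, n_genes), dtype=bool)
    for i in range(n_genes):
        for j in range(i + 1, n_genes):
            if method == "pearson":
                r, _ = pearsonr(expr[i], expr[j])
            else:
                r, _ = spearmanr(expr[i], expr[j])
            adj[i, j] = adj[j, i] = abs(r) >= threshold
    return adj

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    expr = rng.normal(size=(6, 20))        # e.g. six genes, twenty arrays
    print(correlation_network(expr, "spearman", 0.5).astype(int))
```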

  2. Salient object detection method based on multiple semantic features

    NASA Astrophysics Data System (ADS)

    Wang, Chunyang; Yu, Chunyan; Song, Meiping; Wang, Yulei

    2018-04-01

    Existing salient object detection models can only detect the approximate location of the salient object, or they highlight the background. To resolve this problem, a salient object detection method based on image semantic features is proposed. First, three novel saliency features are presented in this paper: the object edge density feature (EF), the object semantic feature based on the convex hull (CF), and the object lightness contrast feature (LF). Second, the multiple saliency features are trained with random detection windows. Third, a naive Bayes model is used to combine these features for saliency detection. Results on public datasets show that our method performs well: the location of the salient object can be determined, and the salient object can be accurately detected and marked by the specific window.
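
    The feature-combination step can be sketched as follows: each candidate window is described by three scalar scores and a Gaussian naive Bayes classifier combines them into a saliency probability. The synthetic scores, labels, and classifier choice are illustrative assumptions rather than the paper's actual training setup.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Each detection window is described by three scalar scores (standing in for the
# paper's EF, CF and LF features); the feature values and labels are synthetic.
rng = np.random.default_rng(1)
X_salient = rng.normal(loc=0.7, scale=0.1, size=(100, 3))
X_background = rng.normal(loc=0.3, scale=0.1, size=(100, 3))
X = np.vstack([X_salient, X_background])
y = np.array([1] * 100 + [0] * 100)

clf = GaussianNB().fit(X, y)                    # combine the features probabilistically
window = np.array([[0.65, 0.72, 0.68]])         # scores for a candidate window
print("P(salient) =", clf.predict_proba(window)[0, 1])
```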

  3. A supervoxel-based segmentation method for prostate MR images.

    PubMed

    Tian, Zhiqiang; Liu, Lizhi; Zhang, Zhenfeng; Xue, Jianru; Fei, Baowei

    2017-02-01

    Segmentation of the prostate on MR images has many applications in prostate cancer management. In this work, we propose a supervoxel-based segmentation method for prostate MR images. A supervoxel is a set of pixels that have similar intensities, locations, and textures in a 3D image volume. The prostate segmentation problem is considered as assigning a binary label to each supervoxel, which is either the prostate or background. A supervoxel-based energy function with data and smoothness terms is used to model the labels. The data term estimates the likelihood of a supervoxel belonging to the prostate by using a supervoxel-based shape feature. The geometric relationship between two neighboring supervoxels is used to build the smoothness term. A 3D graph cut is used to minimize the energy function to obtain the supervoxel labels, which yield the prostate segmentation. A 3D active contour model is then used to obtain a smooth surface, using the output of the graph cut as an initialization. The performance of the proposed algorithm was evaluated on 30 in-house MR image volumes and the PROMISE12 dataset. The mean Dice similarity coefficients are 87.2 ± 2.3% and 88.2 ± 2.8% for our 30 in-house MR volumes and the PROMISE12 dataset, respectively. The proposed segmentation method yields satisfactory results for prostate MR images. The proposed supervoxel-based method can accurately segment prostate MR images and can have a variety of applications in prostate cancer diagnosis and therapy. © 2016 American Association of Physicists in Medicine.

  4. Picking vs Waveform based detection and location methods for induced seismicity monitoring

    NASA Astrophysics Data System (ADS)

    Grigoli, Francesco; Boese, Maren; Scarabello, Luca; Diehl, Tobias; Weber, Bernd; Wiemer, Stefan; Clinton, John F.

    2017-04-01

    Microseismic monitoring is a common operation in various industrial activities related to geo-resources, such as oil and gas and mining operations or geothermal energy exploitation. In microseismic monitoring we generally deal with large datasets from dense monitoring networks that require robust automated analysis procedures. The seismic sequences being monitored are often characterized by many events with short inter-event times that can even produce overlapping seismic signatures. In these situations, traditional approaches that identify seismic events using dense seismic networks based on detections, phase identification and event association can fail, leading to missed detections and/or reduced location resolution. In recent years, to improve the quality of automated catalogues, various waveform-based methods for the detection and location of microseismicity have been proposed. These methods exploit the coherence of the waveforms recorded at different stations and do not require any automated picking procedure. Although this family of methods has been applied to different induced seismicity datasets, an extensive comparison with sophisticated pick-based detection and location methods is still lacking. We aim here to perform a systematic comparison, in terms of performance, between the waveform-based method LOKI and the pick-based detection and location methods (SCAUTOLOC and SCANLOC) implemented within the SeisComP3 software package. SCANLOC is a new detection and location method specifically designed for seismic monitoring at local scale. Although it has recently been applied, an extensive test with induced seismicity datasets has not yet been performed. This method is based on a cluster search algorithm to associate detections to one or many potential earthquake sources. On the other hand, SCAUTOLOC is a more "conventional" method and is the basic tool for seismic event detection and location in SeisComp3. This approach was specifically designed for

  5. Wavelet-based unsupervised learning method for electrocardiogram suppression in surface electromyograms.

    PubMed

    Niegowski, Maciej; Zivanovic, Miroslav

    2016-03-01

    We present a novel approach aimed at removing electrocardiogram (ECG) perturbation from single-channel surface electromyogram (EMG) recordings by means of unsupervised learning of wavelet-based intensity images. The general idea is to combine the suitability of certain wavelet decomposition bases which provide sparse electrocardiogram time-frequency representations, with the capacity of non-negative matrix factorization (NMF) for extracting patterns from images. In order to overcome convergence problems which often arise in NMF-related applications, we design a novel robust initialization strategy which ensures proper signal decomposition in a wide range of ECG contamination levels. Moreover, the method can be readily used because no a priori knowledge or parameter adjustment is needed. The proposed method was evaluated on real surface EMG signals against two state-of-the-art unsupervised learning algorithms and a singular spectrum analysis based method. The results, expressed in terms of high-to-low energy ratio, normalized median frequency, spectral power difference and normalized average rectified value, suggest that the proposed method enables better ECG-EMG separation quality than the reference methods. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
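
    A rough illustration of the matrix-factorization idea is given below: a nonnegative intensity matrix (standing in for the wavelet-based intensity image) is decomposed by NMF into two additive components, one ECG-like and one EMG-like. The synthetic data, the generic nndsvda initialization, and the component count are assumptions for illustration; the paper's robust initialization strategy is not reproduced.

```python
import numpy as np
from sklearn.decomposition import NMF

# Illustrative only: a nonnegative "intensity image" (e.g. |wavelet coefficients|)
# is factorized into two additive patterns, standing in for the ECG and EMG parts.
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 512)
ecg_like = np.exp(-((t[:, None] - np.array([0.2, 0.5, 0.8])) ** 2) / 1e-3).sum(1)
emg_like = np.abs(rng.normal(size=512)) * 0.3
V = np.outer(ecg_like, np.linspace(1, 0.2, 32)) + np.outer(emg_like, np.linspace(0.2, 1, 32))

model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(V)       # temporal activations of the two components
H = model.components_            # their column-wise (spectral) profiles
print("reconstruction error:", round(model.reconstruction_err_, 3))
```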

  6. Explorations in Using Arts-Based Self-Study Methods

    ERIC Educational Resources Information Center

    Samaras, Anastasia P.

    2010-01-01

    Research methods courses typically require students to conceptualize, describe, and present their research ideas in writing. In this article, the author describes her exploration in using arts-based techniques for teaching research to support the development of students' self-study research projects. The pedagogical approach emerged from the…

  7. A probability-based multi-cycle sorting method for 4D-MRI: A simulation study

    PubMed Central

    Liang, Xiao; Yin, Fang-Fang; Liu, Yilin; Cai, Jing

    2016-01-01

    Purpose: To develop a novel probability-based sorting method capable of generating multiple breathing cycles of 4D-MRI images and to evaluate performance of this new method by comparing with conventional phase-based methods in terms of image quality and tumor motion measurement. Methods: Based on previous findings that breathing motion probability density function (PDF) of a single breathing cycle is dramatically different from true stabilized PDF that resulted from many breathing cycles, it is expected that a probability-based sorting method capable of generating multiple breathing cycles of 4D images may capture breathing variation information missing from conventional single-cycle sorting methods. The overall idea is to identify a few main breathing cycles (and their corresponding weightings) that can best represent the main breathing patterns of the patient and then reconstruct a set of 4D images for each of the identified main breathing cycles. This method is implemented in three steps: (1) The breathing signal is decomposed into individual breathing cycles, characterized by amplitude, and period; (2) individual breathing cycles are grouped based on amplitude and period to determine the main breathing cycles. If a group contains more than 10% of all breathing cycles in a breathing signal, it is determined as a main breathing pattern group and is represented by the average of individual breathing cycles in the group; (3) for each main breathing cycle, a set of 4D images is reconstructed using a result-driven sorting method adapted from our previous study. The probability-based sorting method was first tested on 26 patients’ breathing signals to evaluate its feasibility of improving target motion PDF. The new method was subsequently tested for a sequential image acquisition scheme on the 4D digital extended cardiac torso (XCAT) phantom. Performance of the probability-based and conventional sorting methods was evaluated in terms of target volume precision and
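
    A minimal sketch of the cycle-grouping step (steps 1-2 above) is given below: individual breathing cycles, summarized by amplitude and period, are binned, and any bin holding at least 10% of the cycles is reported as a main breathing pattern with its weighting. The binning scheme and all names are illustrative assumptions; the image-reconstruction step (3) is not shown.

```python
import numpy as np

def main_breathing_cycles(amplitudes, periods, n_bins=4, min_fraction=0.10):
    """Group individual breathing cycles by (amplitude, period) into coarse bins and
    return the bins holding at least `min_fraction` of all cycles, each represented
    by its mean amplitude/period and its weighting.  The bin count is an assumption."""
    amp_bin = np.digitize(amplitudes, np.histogram_bin_edges(amplitudes, bins=n_bins))
    per_bin = np.digitize(periods, np.histogram_bin_edges(periods, bins=n_bins))
    groups = {}
    for i, key in enumerate(zip(amp_bin, per_bin)):
        groups.setdefault(key, []).append(i)
    n = len(amplitudes)
    mains = []
    for key, idx in groups.items():
        if len(idx) / n >= min_fraction:
            mains.append({
                "weight": len(idx) / n,
                "mean_amplitude": float(np.mean(np.asarray(amplitudes)[idx])),
                "mean_period": float(np.mean(np.asarray(periods)[idx])),
            })
    return sorted(mains, key=lambda g: -g["weight"])

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    amps = np.concatenate([rng.normal(10, 0.5, 60), rng.normal(14, 0.5, 40)])   # mm
    pers = np.concatenate([rng.normal(4.0, 0.2, 60), rng.normal(5.0, 0.2, 40)])  # s
    for g in main_breathing_cycles(amps, pers):
        print(g)
```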

  8. Betweenness-Based Method to Identify Critical Transmission Sectors for Supply Chain Environmental Pressure Mitigation.

    PubMed

    Liang, Sai; Qu, Shen; Xu, Ming

    2016-02-02

    To develop industry-specific policies for mitigating environmental pressures, previous studies primarily focus on identifying sectors that directly generate large amounts of environmental pressures (a.k.a. the production-based method) or indirectly drive large amounts of environmental pressures through supply chains (e.g., the consumption-based method). In addition to those sectors that are important environmental pressure producers or drivers, there exist sectors that are also important to environmental pressure mitigation as transmission centers. Economy-wide environmental pressure mitigation might be achieved by improving the production efficiency of these key transmission sectors, that is, using less upstream input to produce unitary output. We develop a betweenness-based method to measure the importance of transmission sectors, borrowing the betweenness concept from network analysis. We quantify the betweenness of sectors by examining supply chain paths, extracted from structural path analysis, that pass through a particular sector. We take China as an example and find that the critical transmission sectors identified by the betweenness-based method are not always identifiable by existing methods. This indicates that the betweenness-based method can provide additional insights, not obtainable with existing methods, into the roles individual sectors play in generating economy-wide environmental pressures. The betweenness-based method proposed here can therefore complement existing methods in guiding sector-level environmental pressure mitigation strategies.
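
    The network-analysis notion of betweenness that the paper borrows can be illustrated on a toy inter-sector graph. Here, standard shortest-path betweenness centrality from networkx is used only as a stand-in for the paper's supply-chain-path-based measure; the sector names and edge weights are illustrative assumptions.

```python
import networkx as nx

# Illustrative only: a toy inter-sector flow graph; betweenness centrality serves
# here as a proxy for the paper's structural-path-analysis-based transmission measure.
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("mining", "metals", 5.0), ("metals", "machinery", 4.0),
    ("machinery", "construction", 3.0), ("mining", "power", 2.0),
    ("power", "metals", 2.5), ("power", "construction", 1.0),
])

scores = nx.betweenness_centrality(G, weight="weight")
for sector, b in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{sector:12s} betweenness = {b:.3f}")
```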

  9. A wavelet-based Gaussian method for energy dispersive X-ray fluorescence spectrum.

    PubMed

    Liu, Pan; Deng, Xiaoyan; Tang, Xin; Shen, Shijian

    2017-05-01

    This paper presents a wavelet-based Gaussian method (WGM) for the peak intensity estimation of energy dispersive X-ray fluorescence (EDXRF). The relationship between the parameters of a Gaussian curve and the wavelet coefficients at the Gaussian peak point is first established based on the Mexican hat wavelet. It is found that the Gaussian parameters can be accurately calculated from any two wavelet coefficients at the peak point, provided the peak position is known. This leads to a local Gaussian estimation method for spectral peaks, which estimates the Gaussian parameters based on the detail wavelet coefficients at the Gaussian peak point. The proposed method is tested on simulated and measured spectra from an energy X-ray spectrometer and compared with some existing methods. The results show that the proposed method can directly estimate the peak intensity of EDXRF independently of the background, and can also effectively distinguish overlapping peaks in an EDXRF spectrum.

  10. Method to produce nanocrystalline powders of oxide-based phosphors for lighting applications

    DOEpatents

    Loureiro, Sergio Paulo Martins; Setlur, Anant Achyut; Williams, Darryl Stephen; Manoharan, Mohan; Srivastava, Alok Mani

    2007-12-25

    Some embodiments of the present invention are directed toward nanocrystalline oxide-based phosphor materials, and methods for making same. Typically, such methods comprise a steric entrapment route for converting precursors into such phosphor material. In some embodiments, the nanocrystalline oxide-based phosphor materials are quantum splitting phosphors. In some or other embodiments, such nanocrystalline oxide based phosphor materials provide reduced scattering, leading to greater efficiency, when used in lighting applications.

  11. Distance-Based Phylogenetic Methods Around a Polytomy.

    PubMed

    Davidson, Ruth; Sullivant, Seth

    2014-01-01

    Distance-based phylogenetic algorithms attempt to solve the NP-hard least-squares phylogeny problem by mapping an arbitrary dissimilarity map representing biological data to a tree metric. The set of all dissimilarity maps is a Euclidean space properly containing the space of all tree metrics as a polyhedral fan. Outputs of distance-based tree reconstruction algorithms such as UPGMA and neighbor-joining are points in the maximal cones in the fan. Tree metrics with polytomies lie at the intersections of maximal cones. A phylogenetic algorithm divides the space of all dissimilarity maps into regions based upon which combinatorial tree is reconstructed by the algorithm. Comparison of phylogenetic methods can be done by comparing the geometry of these regions. We use polyhedral geometry to compare the local nature of the subdivisions induced by least-squares phylogeny, UPGMA, and neighbor-joining when the true tree has a single polytomy with exactly four neighbors. Our results suggest that in some circumstances, UPGMA and neighbor-joining poorly match least-squares phylogeny.

  12. Bridge Displacement Monitoring Method Based on Laser Projection-Sensing Technology

    PubMed Central

    Zhao, Xuefeng; Liu, Hao; Yu, Yan; Xu, Xiaodong; Hu, Weitong; Li, Mingchu; Ou, Jingping

    2015-01-01

    Bridge displacement is the most basic evaluation index of the health status of a bridge structure. The existing measurement methods for bridge displacement basically fail to realize long-term and real-time dynamic monitoring of bridge structures, because of the low degree of automation and the insufficient precision, causing bottlenecks and restriction. To solve this problem, we proposed a bridge displacement monitoring system based on laser projection-sensing technology. First, the laser spot recognition method was studied. Second, the software for the displacement monitoring system was developed. Finally, a series of experiments using this system were conducted, and the results show that such a system has high measurement accuracy and speed. We aim to develop a low-cost, high-accuracy and long-term monitoring method for bridge displacement based on these preliminary efforts. PMID:25871716

  13. A degradation-based sorting method for lithium-ion battery reuse.

    PubMed

    Chen, Hao; Shen, Julia

    2017-01-01

    In a world where millions of people are dependent on batteries to provide them with convenient and portable energy, battery recycling is of the utmost importance. In this paper, we developed a new method to sort 18650 Lithium-ion batteries in large quantities and in real time for harvesting used cells with enough capacity for battery reuse. Internal resistance and capacity tests were conducted as a basis for comparison with a novel degradation-based method based on X-ray radiographic scanning and digital image contrast computation. The test results indicate that the sorting accuracy of the test cells is about 79% and the execution time of our algorithm is at a level of 200 milliseconds, making our method a potential real-time solution for reusing the remaining capacity in good used cells.
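
    The contrast-based sorting idea can be sketched as follows: each radiographic image is reduced to a simple RMS contrast score, and cells are labeled for reuse or recycling by thresholding that score. The contrast definition, the threshold value, and the synthetic images are illustrative assumptions, not the paper's calibrated procedure.

```python
import numpy as np

def rms_contrast(radiograph):
    """Root-mean-square contrast of an X-ray radiographic image (2-D array of
    grey levels); a simple stand-in for the paper's image contrast metric."""
    img = np.asarray(radiograph, dtype=float)
    return img.std() / (img.mean() + 1e-12)

def sort_cells(images, contrast_threshold=0.25):
    """Label each cell image as reusable or not by thresholding its contrast.
    The threshold value is an assumption for illustration."""
    return ["reuse" if rms_contrast(im) < contrast_threshold else "recycle"
            for im in images]

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    fresh = rng.normal(0.6, 0.05, (64, 64))      # low internal variation
    degraded = rng.normal(0.5, 0.20, (64, 64))   # stronger internal variation
    print(sort_cells([fresh, degraded]))
```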

  14. A degradation-based sorting method for lithium-ion battery reuse

    PubMed Central

    Chen, Hao

    2017-01-01

    In a world where millions of people are dependent on batteries to provide them with convenient and portable energy, battery recycling is of the utmost importance. In this paper, we developed a new method to sort 18650 Lithium-ion batteries in large quantities and in real time for harvesting used cells with enough capacity for battery reuse. Internal resistance and capacity tests were conducted as a basis for comparison with a novel degradation-based method based on X-ray radiographic scanning and digital image contrast computation. The test results indicate that the sorting accuracy of the test cells is about 79% and the execution time of our algorithm is at a level of 200 milliseconds, making our method a potential real-time solution for reusing the remaining capacity in good used cells. PMID:29023485

  15. Differential equation based method for accurate approximations in optimization

    NASA Technical Reports Server (NTRS)

    Pritchard, Jocelyn I.; Adelman, Howard M.

    1990-01-01

    This paper describes a method to efficiently and accurately approximate the effect of design changes on structural response. The key to this new method is to interpret sensitivity equations as differential equations that may be solved explicitly for closed form approximations, hence, the method is denoted the Differential Equation Based (DEB) method. Approximations were developed for vibration frequencies, mode shapes and static displacements. The DEB approximation method was applied to a cantilever beam and results compared with the commonly-used linear Taylor series approximations and exact solutions. The test calculations involved perturbing the height, width, cross-sectional area, tip mass, and bending inertia of the beam. The DEB method proved to be very accurate, and in most cases, was more accurate than the linear Taylor series approximation. The method is applicable to simultaneous perturbation of several design variables. Also, the approximations may be used to calculate other system response quantities. For example, the approximations for displacement are used to approximate bending stresses.

  16. Differential equation based method for accurate approximations in optimization

    NASA Technical Reports Server (NTRS)

    Pritchard, Jocelyn I.; Adelman, Howard M.

    1990-01-01

    A method to efficiently and accurately approximate the effect of design changes on structural response is described. The key to this method is to interpret sensitivity equations as differential equations that may be solved explicitly for closed form approximations, hence, the method is denoted the Differential Equation Based (DEB) method. Approximations were developed for vibration frequencies, mode shapes and static displacements. The DEB approximation method was applied to a cantilever beam and results compared with the commonly-used linear Taylor series approximations and exact solutions. The test calculations involved perturbing the height, width, cross-sectional area, tip mass, and bending inertia of the beam. The DEB method proved to be very accurate, and in most cases, was more accurate than the linear Taylor series approximation. The method is applicable to simultaneous perturbation of several design variables. Also, the approximations may be used to calculate other system response quantities. For example, the approximations for displacements are used to approximate bending stresses.

  17. Fault Diagnosis for Rotating Machinery: A Method based on Image Processing

    PubMed Central

    Lu, Chen; Wang, Yang; Ragulskis, Minvydas; Cheng, Yujie

    2016-01-01

    Rotating machinery is one of the most typical types of mechanical equipment and plays a significant role in industrial applications. Condition monitoring and fault diagnosis of rotating machinery have gained wide attention for their significance in preventing catastrophic accidents and guaranteeing sufficient maintenance. With the development of science and technology, fault diagnosis methods based on multiple disciplines are becoming the focus in the field of fault diagnosis of rotating machinery. This paper presents a multi-discipline method based on image processing for fault diagnosis of rotating machinery. Different from traditional analysis methods in one-dimensional space, this study employs computing methods from the field of image processing to realize automatic feature extraction and fault diagnosis in a two-dimensional space. The proposed method mainly includes the following steps. First, the vibration signal is transformed into a bi-spectrum contour map using bi-spectrum technology, which provides a basis for the subsequent image-based feature extraction. Then, an emerging image-processing approach for feature extraction, speeded-up robust features, is employed to automatically extract fault features from the transformed bi-spectrum contour map, forming a high-dimensional feature vector. To highlight the main fault features and reduce subsequent computing resources, t-distributed stochastic neighbor embedding is adopted to reduce the dimensionality of the feature vector. Finally, a probabilistic neural network is introduced for fault identification. Two typical types of rotating machinery, an axial piston hydraulic pump and a self-priming centrifugal pump, are selected to demonstrate the effectiveness of the proposed method. Results show that the proposed image-processing-based method achieves high accuracy, thus providing a highly effective means of fault diagnosis for rotating machinery.

  18. Fault Diagnosis for Rotating Machinery: A Method based on Image Processing.

    PubMed

    Lu, Chen; Wang, Yang; Ragulskis, Minvydas; Cheng, Yujie

    2016-01-01

    Rotating machinery is one of the most typical types of mechanical equipment and plays a significant role in industrial applications. Condition monitoring and fault diagnosis of rotating machinery have gained wide attention for their significance in preventing catastrophic accidents and guaranteeing sufficient maintenance. With the development of science and technology, fault diagnosis methods based on multiple disciplines are becoming the focus in the field of fault diagnosis of rotating machinery. This paper presents a multi-discipline method based on image processing for fault diagnosis of rotating machinery. Different from traditional analysis methods in one-dimensional space, this study employs computing methods from the field of image processing to realize automatic feature extraction and fault diagnosis in a two-dimensional space. The proposed method mainly includes the following steps. First, the vibration signal is transformed into a bi-spectrum contour map using bi-spectrum technology, which provides a basis for the subsequent image-based feature extraction. Then, an emerging image-processing approach for feature extraction, speeded-up robust features, is employed to automatically extract fault features from the transformed bi-spectrum contour map, forming a high-dimensional feature vector. To highlight the main fault features and reduce subsequent computing resources, t-distributed stochastic neighbor embedding is adopted to reduce the dimensionality of the feature vector. Finally, a probabilistic neural network is introduced for fault identification. Two typical types of rotating machinery, an axial piston hydraulic pump and a self-priming centrifugal pump, are selected to demonstrate the effectiveness of the proposed method. Results show that the proposed image-processing-based method achieves high accuracy, thus providing a highly effective means of fault diagnosis for rotating machinery.

  19. Method for fabricating beryllium-based multilayer structures

    DOEpatents

    Skulina, Kenneth M.; Bionta, Richard M.; Makowiecki, Daniel M.; Alford, Craig S.

    2003-02-18

    Beryllium-based multilayer structures and a process for fabricating beryllium-based multilayer mirrors, useful in the wavelength region greater than the beryllium K-edge (111 Å or 11.1 nm). The process includes alternating sputter deposition of beryllium and a metal, typically from the fifth row of the periodic table, such as niobium (Nb), molybdenum (Mo), ruthenium (Ru), and rhodium (Rh). The process includes not only the method of sputtering the materials, but also the industrial hygiene controls for safe handling of beryllium. The mirrors made in accordance with the process may be utilized in soft x-ray and extreme-ultraviolet projection lithography, which requires mirrors of high reflectivity (>60%) for x-rays in the range of 60-140 Å (6.0-14.0 nm).

  20. Continental-scale Validation of MODIS-based and LEDAPS Landsat ETM+ Atmospheric Correction Methods

    NASA Technical Reports Server (NTRS)

    Ju, Junchang; Roy, David P.; Vermote, Eric; Masek, Jeffrey; Kovalskyy, Valeriy

    2012-01-01

    The potential of Landsat data processing to provide systematic continental scale products has been demonstrated by several projects including the NASA Web-enabled Landsat Data (WELD) project. The recent free availability of Landsat data increases the need for robust and efficient atmospheric correction algorithms applicable to large volume Landsat data sets. This paper compares the accuracy of two Landsat atmospheric correction methods: a MODIS-based method and the Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) method. Both methods are based on the 6SV radiative transfer code but have different atmospheric characterization approaches. The MODIS-based method uses the MODIS Terra derived dynamic aerosol type, aerosol optical thickness, and water vapor to atmospherically correct ETM+ acquisitions in each coincident orbit. The LEDAPS method uses aerosol characterizations derived independently from each Landsat acquisition and assumes a fixed continental aerosol type and uses ancillary water vapor. Validation results are presented comparing ETM+ atmospherically corrected data generated using these two methods with AERONET corrected ETM+ data for 95 subsets of 10 km × 10 km at 30 m resolution, a total of nearly 8 million 30 m pixels, located across the conterminous United States. The results indicate that the MODIS-based method has better accuracy than the LEDAPS method for the ETM+ red and longer wavelength bands.

  1. Development of DNA-based Identification methods to track the ...

    EPA Pesticide Factsheets

    The ability to track the identity and abundance of larval fish, which are ubiquitous during spawning season, may lead to a greater understanding of fish species distributions in Great Lakes nearshore areas including early-detection of invasive fish species before they become established. However, larval fish are notoriously hard to identify using traditional morphological techniques. While DNA-based identification methods could increase the ability of aquatic resource managers to determine larval fish composition, use of these methods in aquatic surveys is still uncommon and presents many challenges. In response to this need, we have been working with the U. S. Fish and Wildlife Service to develop field and laboratory methods to facilitate the identification of larval fish using DNA-meta-barcoding. In 2012, we initiated a pilot-project to develop a workflow for conducting DNA-based identification, and compared the species composition at sites within the St. Louis River Estuary of Lake Superior using traditional identification versus DNA meta-barcoding. In 2013, we extended this research to conduct DNA-identification of fish larvae collected from multiple nearshore areas of the Great Lakes by the USFWS. The species composition of larval fish generally mirrored that of fish species known from the same areas, but was influenced by the timing and intensity of sampling. Results indicate that DNA-based identification needs only very low levels of biomass to detect pre

  2. A low delay transmission method of multi-channel video based on FPGA

    NASA Astrophysics Data System (ADS)

    Fu, Weijian; Wei, Baozhi; Li, Xiaobin; Wang, Quan; Hu, Xiaofei

    2018-03-01

    To guarantee the fluency of multi-channel video transmission in video monitoring scenarios, we designed an FPGA-based video format conversion method together with DMA scheduling for the video data, which reduces the overall video transmission delay. To save time in the conversion process, the parallel capability of the FPGA is exploited for the video format conversion. To improve the direct memory access (DMA) write transmission rate of the PCIe bus, a DMA scheduling method based on an asynchronous command buffer is proposed. The experimental results show that the proposed FPGA-based low-delay transmission method increases the DMA write transmission rate by 34% compared with the existing method, reducing the overall video delay to 23.6 ms.

  3. Changes in Teaching Efficacy during a Professional Development School-Based Science Methods Course

    ERIC Educational Resources Information Center

    Swars, Susan L.; Dooley, Caitlin McMunn

    2010-01-01

    This mixed methods study offers a theoretically grounded description of a field-based science methods course within a Professional Development School (PDS) model (i.e., PDS-based course). The preservice teachers' (n = 21) experiences within the PDS-based course prompted significant changes in their personal teaching efficacy, with the…

  4. A probability-based multi-cycle sorting method for 4D-MRI: A simulation study.

    PubMed

    Liang, Xiao; Yin, Fang-Fang; Liu, Yilin; Cai, Jing

    2016-12-01

    To develop a novel probability-based sorting method capable of generating multiple breathing cycles of 4D-MRI images and to evaluate performance of this new method by comparing with conventional phase-based methods in terms of image quality and tumor motion measurement. Based on previous findings that breathing motion probability density function (PDF) of a single breathing cycle is dramatically different from true stabilized PDF that resulted from many breathing cycles, it is expected that a probability-based sorting method capable of generating multiple breathing cycles of 4D images may capture breathing variation information missing from conventional single-cycle sorting methods. The overall idea is to identify a few main breathing cycles (and their corresponding weightings) that can best represent the main breathing patterns of the patient and then reconstruct a set of 4D images for each of the identified main breathing cycles. This method is implemented in three steps: (1) The breathing signal is decomposed into individual breathing cycles, characterized by amplitude, and period; (2) individual breathing cycles are grouped based on amplitude and period to determine the main breathing cycles. If a group contains more than 10% of all breathing cycles in a breathing signal, it is determined as a main breathing pattern group and is represented by the average of individual breathing cycles in the group; (3) for each main breathing cycle, a set of 4D images is reconstructed using a result-driven sorting method adapted from our previous study. The probability-based sorting method was first tested on 26 patients' breathing signals to evaluate its feasibility of improving target motion PDF. The new method was subsequently tested for a sequential image acquisition scheme on the 4D digital extended cardiac torso (XCAT) phantom. Performance of the probability-based and conventional sorting methods was evaluated in terms of target volume precision and accuracy as measured

  5. Constructing financial network based on PMFG and threshold method

    NASA Astrophysics Data System (ADS)

    Nie, Chun-Xiao; Song, Fu-Tie

    2018-04-01

    Based on planar maximally filtered graph (PMFG) and threshold method, we introduced a correlation-based network named PMFG-based threshold network (PTN). We studied the community structure of PTN and applied ISOMAP algorithm to represent PTN in low-dimensional Euclidean space. The results show that the community corresponds well to the cluster in the Euclidean space. Further, we studied the dynamics of the community structure and constructed the normalized mutual information (NMI) matrix. Based on the real data in the market, we found that the volatility of the market can lead to dramatic changes in the community structure, and the structure is more stable during the financial crisis.
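
    As a rough sketch of the threshold-network and community-tracking ideas (the PMFG construction itself is omitted), the snippet below builds a correlation threshold graph over synthetic asset returns, extracts communities with a standard modularity heuristic, and compares the partitions of two periods with normalized mutual information. The data, the 0.5 threshold, and the community algorithm are illustrative assumptions.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities
from sklearn.metrics import normalized_mutual_info_score

def threshold_network(returns, threshold=0.5):
    """Correlation-based threshold graph over assets; `returns` is T x N."""
    corr = np.corrcoef(returns, rowvar=False)
    G = nx.Graph()
    G.add_nodes_from(range(corr.shape[0]))
    for i in range(corr.shape[0]):
        for j in range(i + 1, corr.shape[0]):
            if corr[i, j] >= threshold:
                G.add_edge(i, j, weight=corr[i, j])
    return G

def partition_labels(G):
    """Community label for every node, found by greedy modularity maximization."""
    labels = np.zeros(G.number_of_nodes(), dtype=int)
    for c, comm in enumerate(greedy_modularity_communities(G)):
        for node in comm:
            labels[node] = c
    return labels

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    def synth():  # two correlated blocks of ten assets each
        f1, f2 = rng.normal(size=(250, 1)), rng.normal(size=(250, 1))
        return np.hstack([f1 + 0.5 * rng.normal(size=(250, 10)),
                          f2 + 0.5 * rng.normal(size=(250, 10))])
    l1 = partition_labels(threshold_network(synth()))
    l2 = partition_labels(threshold_network(synth()))
    print("NMI between the two periods:", round(normalized_mutual_info_score(l1, l2), 3))
```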

  6. Fast and secure encryption-decryption method based on chaotic dynamics

    DOEpatents

    Protopopescu, Vladimir A.; Santoro, Robert T.; Tolliver, Johnny S.

    1995-01-01

    A method and system for the secure encryption of information. The method comprises the steps of dividing a message of length L into its character components; generating m chaotic iterates from m independent chaotic maps; producing an "initial" value based upon the m chaotic iterates; transforming the "initial" value to create a pseudo-random integer; repeating the steps of generating, producing and transforming until a pseudo-random integer sequence of length L is created; and encrypting the message as ciphertext based upon the pseudo random integer sequence. A system for accomplishing the invention is also provided.
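
    Chaos-based keystream generation of the kind described can be illustrated with a single logistic map (the patent uses m independent chaotic maps and a separate transformation step, which are not reproduced here): the map is iterated, each iterate is quantized to a byte, and the message is XOR-ed with the resulting keystream. All names and parameters are illustrative assumptions, and this sketch is not a secure cipher.

```python
def logistic_keystream(length, x0=0.7, r=3.99):
    """Generate `length` pseudo-random bytes by iterating the logistic map and
    quantizing the iterates; x0 and r play the role of the key."""
    x, out = x0, []
    for _ in range(length):
        x = r * x * (1.0 - x)           # chaotic iterate
        out.append(int(x * 256) % 256)  # quantize to a byte
    return bytes(out)

def xor_cipher(message: bytes, key_x0=0.7, key_r=3.99) -> bytes:
    """Encrypt (or decrypt) by XOR-ing the message with the chaotic keystream."""
    ks = logistic_keystream(len(message), key_x0, key_r)
    return bytes(m ^ k for m, k in zip(message, ks))

if __name__ == "__main__":
    plain = b"secure message"
    cipher = xor_cipher(plain)
    print(cipher.hex(), xor_cipher(cipher) == plain)
```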

  7. A deep learning-based multi-model ensemble method for cancer prediction.

    PubMed

    Xiao, Yawen; Wu, Jun; Lin, Zongli; Zhao, Xiaodong

    2018-01-01

    Cancer is a complex worldwide health problem associated with high mortality. With the rapid development of the high-throughput sequencing technology and the application of various machine learning methods that have emerged in recent years, progress in cancer prediction has been increasingly made based on gene expression, providing insight into effective and accurate treatment decision making. Thus, developing machine learning methods, which can successfully distinguish cancer patients from healthy persons, is of great current interest. However, among the classification methods applied to cancer prediction so far, no one method outperforms all the others. In this paper, we demonstrate a new strategy, which applies deep learning to an ensemble approach that incorporates multiple different machine learning models. We supply informative gene data selected by differential gene expression analysis to five different classification models. Then, a deep learning method is employed to ensemble the outputs of the five classifiers. The proposed deep learning-based multi-model ensemble method was tested on three public RNA-seq data sets of three kinds of cancers, Lung Adenocarcinoma, Stomach Adenocarcinoma and Breast Invasive Carcinoma. The test results indicate that it increases the prediction accuracy of cancer for all the tested RNA-seq data sets as compared to using a single classifier or the majority voting algorithm. By taking full advantage of different classifiers, the proposed deep learning-based multi-model ensemble method is shown to be accurate and effective for cancer prediction. Copyright © 2017 Elsevier B.V. All rights reserved.
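
    A minimal sketch of the multi-model ensemble idea is shown below using scikit-learn: five heterogeneous base classifiers are stacked, and a small neural network stands in for the deep-learning meta-model that combines their outputs. The synthetic data, the choice of base learners, and the network size are illustrative assumptions rather than the paper's configuration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Synthetic stand-in for differentially expressed gene features.
X, y = make_classification(n_samples=400, n_features=50, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

base_learners = [
    ("knn", KNeighborsClassifier()),
    ("svm", SVC(probability=True)),
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("gb", GradientBoostingClassifier(random_state=0)),
    ("lr", LogisticRegression(max_iter=1000)),
]
# A small neural network stands in for the paper's deep-learning meta-model.
ensemble = StackingClassifier(
    estimators=base_learners,
    final_estimator=MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
print("ensemble accuracy:", round(ensemble.fit(X_tr, y_tr).score(X_te, y_te), 3))
```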

  8. Development of performance-based evaluation methods and specifications for roadside maintenance.

    DOT National Transportation Integrated Search

    2011-01-01

    This report documents the work performed during Project 0-6387, Performance Based Roadside Maintenance Specifications. Quality assurance methods and specifications for roadside performance-based maintenance contracts (PBMCs) were developed ...

  9. Appearance-based representative samples refining method for palmprint recognition

    NASA Astrophysics Data System (ADS)

    Wen, Jiajun; Chen, Yan

    2012-07-01

    Sparse representation can deal with the lack-of-samples problem because it utilizes all the training samples. However, the discrimination ability degrades when more training samples are used for representation. We propose a novel appearance-based palmprint recognition method. We aim to find a compromise between the discrimination ability and the lack-of-samples problem so as to obtain a proper representation scheme. Under the assumption that the test sample can be well represented by a linear combination of a certain number of training samples, we first select the representative training samples according to the contributions of the samples. Then we further refine the training samples by an iterative procedure, each time excluding the training sample with the least contribution to the test sample. Experiments on the PolyU multispectral palmprint database and the two-dimensional plus three-dimensional palmprint database show that the proposed method outperforms conventional appearance-based palmprint recognition methods. Moreover, we also explore the principles for setting the key parameters of the proposed algorithm, which facilitates obtaining high recognition accuracy.
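
    The iterative refinement step can be sketched as follows: the test sample is repeatedly represented as a least-squares combination of the remaining training samples, and the sample whose term contributes least to the reconstruction is dropped until a fixed number remain. The contribution measure, the stopping rule, and all names are illustrative assumptions; the paper's exact selection criterion may differ.

```python
import numpy as np

def refine_training_samples(train, test, n_keep=5):
    """Iteratively represent the test vector as a linear combination of the
    remaining training columns (least squares) and drop the column whose term
    contributes least to the reconstruction, until `n_keep` columns remain."""
    idx = list(range(train.shape[1]))
    while len(idx) > n_keep:
        A = train[:, idx]
        coef, *_ = np.linalg.lstsq(A, test, rcond=None)
        contrib = [np.linalg.norm(c * A[:, k]) for k, c in enumerate(coef)]
        idx.pop(int(np.argmin(contrib)))
    return idx

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    train = rng.normal(size=(100, 12))          # 12 training palmprint vectors
    test = train[:, 2] * 0.8 + train[:, 7] * 0.5 + 0.05 * rng.normal(size=100)
    print("kept training indices:", refine_training_samples(train, test))
```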

  10. A supervoxel-based segmentation method for prostate MR images

    NASA Astrophysics Data System (ADS)

    Tian, Zhiqiang; Liu, LiZhi; Fei, Baowei

    2015-03-01

    Accurate segmentation of the prostate has many applications in prostate cancer diagnosis and therapy. In this paper, we propose a supervoxel-based method for prostate segmentation. The prostate segmentation problem is considered as assigning a label to each supervoxel. An energy function with data and smoothness terms is used to model the labeling process. The data term estimates the likelihood of a supervoxel belonging to the prostate according to a shape feature. The geometric relationship between two neighboring supervoxels is used to construct the smoothness term. A three-dimensional (3D) graph cut method is used to minimize the energy function in order to segment the prostate. A 3D level set is then used to obtain a smooth surface based on the output of the graph cut. The performance of the proposed segmentation algorithm was evaluated with respect to the manual segmentation ground truth. The experimental results on 12 prostate volumes showed that the proposed algorithm yields a mean Dice similarity coefficient of 86.9% ± 3.2%. The segmentation method can be used not only for the prostate but also for other organs.

  11. Level set method for image segmentation based on moment competition

    NASA Astrophysics Data System (ADS)

    Min, Hai; Wang, Xiao-Feng; Huang, De-Shuang; Jin, Jing; Wang, Hong-Zhi; Li, Hai

    2015-05-01

    We propose a level set method for image segmentation which introduces the moment competition and weakly supervised information into the energy functional construction. Different from the region-based level set methods which use force competition, the moment competition is adopted to drive the contour evolution. Here, a so-called three-point labeling scheme is proposed to manually label three independent points (weakly supervised information) on the image. Then the intensity differences between the three points and the unlabeled pixels are used to construct the force arms for each image pixel. The corresponding force is generated from the global statistical information of a region-based method and weighted by the force arm. As a result, the moment can be constructed and incorporated into the energy functional to drive the evolving contour to approach the object boundary. In our method, the force arm can take full advantage of the three-point labeling scheme to constrain the moment competition. Additionally, the global statistical information and weakly supervised information are successfully integrated, which makes the proposed method more robust than traditional methods for initial contour placement and parameter setting. Experimental results with performance analysis also show the superiority of the proposed method on segmenting different types of complicated images, such as noisy images, three-phase images, images with intensity inhomogeneity, and texture images.

  12. Systems, methods and apparatus for verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Rash, James L. (Inventor); Gracinin, Denis (Inventor); Erickson, John D. (Inventor); Rouff, Christopher A. (Inventor); Hinchey, Michael G. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.

  13. Method of removing and detoxifying a phosphorus-based substance

    DOEpatents

    Vandegrift, George F.; Steindler, Martin J.

    1989-01-01

    A method of removing organic phosphorus-based poisonous substances from water contaminated therewith and of subsequently destroying the toxicity of the substance is disclosed. Initially, a water-immiscible organic is immobilized on a supported liquid membrane. Thereafter, the contaminated water is contacted with one side of the supported liquid membrane to selectively dissolve the phosphorus-based substance in the organic extractant. At the same time, the other side of the supported liquid membrane is contacted with a hydroxy-affording strong base to react the phosphorus-based substance dissolved by the organic extractant with a hydroxy ion. This forms a non-toxic reaction product in the base. The organic extractant can be a water-insoluble trialkyl amine, such as trilauryl amine. The phosphorus-based substance can be phosphoryl or a thiophosphoryl.

  14. Ground-Cover Measurements: Assessing Correlation Among Aerial and Ground-Based Methods

    NASA Astrophysics Data System (ADS)

    Booth, D. Terrance; Cox, Samuel E.; Meikle, Tim; Zuuring, Hans R.

    2008-12-01

    Wyoming’s Green Mountain Common Allotment is public land providing livestock forage, wildlife habitat, and unfenced solitude, among other ecological services. It is also the center of ongoing debate over the USDI Bureau of Land Management’s (BLM) adjudication of land uses. Monitoring resource use is a BLM responsibility, but conventional monitoring is inadequate for the vast areas encompassed in this and other public-land units. New monitoring methods are needed that will reduce monitoring costs. An understanding of data-set relationships among old and new methods is also needed. This study compared two conventional methods with two remote sensing methods using images captured from two meters and 100 meters above ground level from a camera stand (a ground, image-based method) and a light airplane (an aerial, image-based method). Image analysis used SamplePoint or VegMeasure software. Aerial methods allowed for increased sampling intensity at low cost relative to the time and travel required by ground methods. Costs to acquire the aerial imagery and measure ground cover on 162 aerial samples representing 9000 ha were less than $3000. The four highest correlations among data sets for bare ground (the ground-cover characteristic yielding the highest correlations, r) ranged from 0.76 to 0.85 and included ground with ground, ground with aerial, and aerial with aerial data-set associations. We conclude that our aerial surveys are a cost-effective monitoring method, that ground with aerial data-set correlations can be equal to, or greater than, those among ground-based data sets, and that bare ground should continue to be investigated and tested for use as a key indicator of rangeland health.

  15. Missing value imputation in DNA microarrays based on conjugate gradient method.

    PubMed

    Dorri, Fatemeh; Azmi, Paeiz; Dorri, Faezeh

    2012-02-01

    Analysis of gene expression profiles needs a complete matrix of gene array values; consequently, imputation methods have been suggested. In this paper, an algorithm that is based on conjugate gradient (CG) method is proposed to estimate missing values. k-nearest neighbors of the missed entry are first selected based on absolute values of their Pearson correlation coefficient. Then a subset of genes among the k-nearest neighbors is labeled as the best similar ones. CG algorithm with this subset as its input is then used to estimate the missing values. Our proposed CG based algorithm (CGimpute) is evaluated on different data sets. The results are compared with sequential local least squares (SLLSimpute), Bayesian principle component analysis (BPCAimpute), local least squares imputation (LLSimpute), iterated local least squares imputation (ILLSimpute) and adaptive k-nearest neighbors imputation (KNNKimpute) methods. The average of normalized root mean squares error (NRMSE) and relative NRMSE in different data sets with various missing rates shows CGimpute outperforms other methods. Copyright © 2011 Elsevier Ltd. All rights reserved.
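
    A simplified sketch of the neighbor-selection and CG-based estimation steps is given below: the k genes most correlated with the target gene are chosen, the target is regressed on them over the observed samples, and the normal equations are solved with SciPy's conjugate gradient routine. The neighbor count, the use of plain least squares, and all names are illustrative assumptions; the paper's selection of the "best similar" subset is not reproduced.

```python
import numpy as np
from scipy.sparse.linalg import cg

def impute_entry(expr, gene, sample, k=5):
    """Estimate the missing value expr[gene, sample] (marked as NaN): pick the k
    complete genes most correlated with the target gene over the observed samples,
    regress the target on them, and solve the normal equations with CG."""
    target = expr[gene]
    observed = ~np.isnan(target)
    candidates = [g for g in range(expr.shape[0])
                  if g != gene and not np.isnan(expr[g]).any()]
    corrs = {g: abs(np.corrcoef(expr[g, observed], target[observed])[0, 1])
             for g in candidates}
    neighbors = sorted(corrs, key=corrs.get, reverse=True)[:k]
    A = expr[neighbors][:, observed].T            # design matrix (samples x k)
    b = target[observed]
    x, _ = cg(A.T @ A, A.T @ b, atol=1e-8)        # CG on the normal equations
    return float(expr[neighbors][:, sample] @ x)

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    expr = rng.normal(size=(50, 30))
    expr[0] = expr[1] * 0.9 + 0.1 * rng.normal(size=30)   # gene 0 tracks gene 1
    true_value, expr[0, 10] = expr[0, 10], np.nan
    print("true:", round(true_value, 3), "imputed:", round(impute_entry(expr, 0, 10), 3))
```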

  16. Effectiveness of Spray-Based Decontamination Methods for ...

    EPA Pesticide Factsheets

    The objective of this project was to assess the effectiveness of spray-based common decontamination methods for inactivating Bacillus (B.) atrophaeus (surrogate for B. anthracis) spores and bacteriophage MS2 (surrogate for foot and mouth disease virus [FMDV]) on selected test surfaces (with or without a model agricultural soil load). Relocation of viable viruses or spores from the contaminated coupon surfaces into aerosol or liquid fractions during the decontamination methods was investigated. This project was conducted to support jointly held missions of the U.S. Department of Homeland Security (DHS) and the U.S. Environmental Protection Agency (EPA). Within the EPA, the project supports the mission of EPA’s Homeland Security Research Program (HSRP) by providing relevant information pertinent to the decontamination of contaminated areas resulting from a biological incident.

  17. A Novel Weighted Kernel PCA-Based Method for Optimization and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Thimmisetty, C.; Talbot, C.; Chen, X.; Tong, C. H.

    2016-12-01

    It has been demonstrated that machine learning methods can be successfully applied to uncertainty quantification for geophysical systems through the use of the adjoint method coupled with kernel PCA-based optimization. In addition, it has been shown through weighted linear PCA how optimization with respect to both observation weights and feature space control variables can accelerate convergence of such methods. Linear machine learning methods, however, are inherently limited in their ability to represent features of non-Gaussian stochastic random fields, as they are based on only the first two statistical moments of the original data. Nonlinear spatial relationships and multipoint statistics leading to the tortuosity characteristic of channelized media, for example, are captured only to a limited extent by linear PCA. With the aim of coupling the kernel-based and weighted methods discussed, we present a novel mathematical formulation of kernel PCA, Weighted Kernel Principal Component Analysis (WKPCA), that both captures nonlinear relationships and incorporates the attribution of significance levels to different realizations of the stochastic random field of interest. We also demonstrate how new instantiations retaining defining characteristics of the random field can be generated using Bayesian methods. In particular, we present a novel WKPCA-based optimization method that minimizes a given objective function with respect to both feature space random variables and observation weights through which optimal snapshot significance levels and optimal features are learned. We showcase how WKPCA can be applied to nonlinear optimal control problems involving channelized media, and in particular demonstrate an application of the method to learning the spatial distribution of material parameter values in the context of linear elasticity, and discuss further extensions of the method to stochastic inversion.

  18. Hybrid PSO-ASVR-based method for data fitting in the calibration of infrared radiometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Sen; Li, Chengwei, E-mail: heikuanghit@163.com

    2016-06-15

    The present paper describes a hybrid particle swarm optimization-adaptive support vector regression (PSO-ASVR)-based method for data fitting in the calibration of infrared radiometer. The proposed hybrid PSO-ASVR-based method is based on PSO in combination with Adaptive Processing and Support Vector Regression (SVR). The optimization technique involves setting parameters in the ASVR fitting procedure, which significantly improves the fitting accuracy. However, its use in the calibration of infrared radiometer has not yet been widely explored. Bearing this in mind, the PSO-ASVR-based method, which is based on the statistical learning theory, is successfully used here to get the relationship between the radiation of a standard source and the response of an infrared radiometer. Main advantages of this method are the flexible adjustment mechanism in data processing and the optimization mechanism in a kernel parameter setting of SVR. Numerical examples and applications to the calibration of infrared radiometer are performed to verify the performance of PSO-ASVR-based method compared to conventional data fitting methods.

  19. State of the Art on Functional Virgin Olive Oils Enriched with Bioactive Compounds and Their Properties.

    PubMed

    Reboredo-Rodríguez, Patricia; Figueiredo-González, María; González-Barreiro, Carmen; Simal-Gándara, Jesús; Salvador, María Desamparados; Cancho-Grande, Beatriz; Fregapane, Giuseppe

    2017-03-20

    Virgin olive oil, the main fat of the Mediterranean diet, is per se considered as a functional food-as stated by the European Food Safety Authority (EFSA)-due to its content in healthy compounds. The daily intake of endogenous bioactive phenolics from virgin olive oil is variable due to the influence of multiple agronomic and technological factors. Thus, a good strategy to ensure an optimal intake of polyphenols through habitual diet would be to produce enriched virgin olive oil with well-known bioactive polyphenols. Different sources of natural biological active substances can be potentially used to enrich virgin olive oil (e.g., raw materials derived from the same olive tree, mainly olive leaves and pomaces, and/or other compounds from plants and vegetables, mainly herbs and spices). The development of these functional olive oils may help in prevention of chronic diseases (such as cardiovascular diseases, immune frailty, ageing disorders and degenerative diseases) and improving the quality of life for many consumers reducing health care costs. In the present review, the most relevant scientific information related to the development of enriched virgin olive oil and their positive human health effects has been collected and discussed.

  20. A novel isoflavone profiling method based on UPLC-PDA-ESI-MS.

    PubMed

    Zhang, Shuang; Zheng, Zong-Ping; Zeng, Mao-Mao; He, Zhi-Yong; Tao, Guan-Jun; Qin, Fang; Chen, Jie

    2017-03-15

    A novel non-targeted isoflavone profiling method was developed using the diagnostic fragment-ion-based extension strategy, based on ultra-high performance liquid chromatography coupled with a photo-diode array detector and electrospray ionization-mass spectrometry (UPLC-PDA-ESI-MS). Sixteen types of isoflavones were obtained in positive mode, but only 12 were obtained in negative mode due to the absence of precursor ions. Malonyldaidzin and malonylgenistin glycosylated at the 4'-O position or malonylated at the 4″-O position of glucose were indicated by their retention behavior and fragmentation pattern. Three possible quantification methods in one run based on UPLC-PDA and UPLC-ESI-MS were validated and compared, suggesting that methods based on UPLC-ESI-MS possess remarkable selectivity and sensitivity. Impermissible quantitative deviations induced by linearity calibration over a 400-fold dynamic range were observed for the first time and were corrected by recalibration over a 20-fold dynamic range. These results suggest that isoflavones and their stereoisomers can be simultaneously determined in soymilk by positive-ion UPLC-ESI-MS. Copyright © 2016. Published by Elsevier Ltd.

  1. Method for PE Pipes Fusion Jointing Based on TRIZ Contradictions Theory

    NASA Astrophysics Data System (ADS)

    Sun, Jianguang; Tan, Runhua; Gao, Jinyong; Wei, Zihui

    The core of the TRIZ theories is contradiction detection and solution. TRIZ provides various methods for solving contradictions, but they are not systematized. Drawing on the concept of a technical system, this paper summarizes an integrated solution method for contradictions based on TRIZ contradiction theory. A flowchart of the integrated solution method is given. As a case study, a method for fusion jointing of PE pipes is analyzed.

  2. Research and Implementation of Tibetan Word Segmentation Based on Syllable Methods

    NASA Astrophysics Data System (ADS)

    Jiang, Jing; Li, Yachao; Jiang, Tao; Yu, Hongzhi

    2018-03-01

    Tibetan word segmentation (TWS) is an important problem in Tibetan information processing, and abbreviated word recognition is one of its key and most difficult sub-problems. Most existing methods of Tibetan abbreviated word recognition are rule-based approaches, which need vocabulary support. In this paper, we propose a method based on a sequence tagging model for abbreviated word recognition and implement it in TWS systems with sequence labeling models. The experimental results show that our abbreviated word recognition method is fast and effective and can be combined easily with the segmentation model, which significantly improves the performance of Tibetan word segmentation.

  3. A method of mobile video transmission based on J2ee

    NASA Astrophysics Data System (ADS)

    Guo, Jian-xin; Zhao, Ji-chun; Gong, Jing; Chun, Yang

    2013-03-01

    As 3G (3rd-generation) networks evolve worldwide, the rising demand for mobile video services and the enormous growth of video on the internet are creating major new revenue opportunities for mobile network operators and application developers. This paper introduces a method of mobile video transmission based on J2EE, describing the video compression method, the video compression standard, and the software design. The proposed mobile video method based on J2EE is a typical mobile multimedia application with high availability and a wide range of applications. Users can access the video through terminal devices such as mobile phones.

  4. Infrared image segmentation method based on spatial coherence histogram and maximum entropy

    NASA Astrophysics Data System (ADS)

    Liu, Songtao; Shen, Tongsheng; Dai, Yao

    2014-11-01

    In order to segment the target well and suppress background noise effectively, an infrared image segmentation method based on a spatial coherence histogram and maximum entropy is proposed. First, the spatial coherence histogram is constructed by weighting the importance of pixels of the same gray level according to their position, obtained by computing their local density. Then, after enhancing the image with the spatial coherence histogram, the 1D maximum entropy method is used to segment the image. The novel method not only produces better segmentation results, but also has a faster computation time than traditional 2D histogram-based segmentation methods.
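
    The final thresholding step the abstract describes is the classical 1D maximum-entropy (Kapur) criterion; the sketch below implements that step on an ordinary grey-level histogram, leaving out the paper's spatial-coherence weighting.

      import numpy as np

      def max_entropy_threshold(image, bins=256):
          # Grey level maximising the sum of background and foreground entropies (Kapur).
          hist, _ = np.histogram(image.ravel(), bins=bins, range=(0, bins))
          p = hist.astype(float) / hist.sum()
          P = np.cumsum(p)
          best_t, best_h = 0, -np.inf
          for t in range(1, bins - 1):
              pb, pf = P[t], 1.0 - P[t]
              if pb < 1e-12 or pf < 1e-12:
                  continue
              b = p[:t + 1] / pb
              f = p[t + 1:] / pf
              hb = -np.sum(b[b > 0] * np.log(b[b > 0]))
              hf = -np.sum(f[f > 0] * np.log(f[f > 0]))
              if hb + hf > best_h:
                  best_h, best_t = hb + hf, t
          return best_t

      # Toy infrared-like image: dim noisy background with a brighter target patch.
      rng = np.random.default_rng(2)
      img = rng.normal(60, 10, size=(64, 64)).clip(0, 255)
      img[20:30, 20:30] += 120
      t = max_entropy_threshold(img.astype(np.uint8))
      print(t, (img > t).sum())    # threshold and number of pixels flagged as target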

  5. Evaluation of Deep Learning Based Stereo Matching Methods: from Ground to Aerial Images

    NASA Astrophysics Data System (ADS)

    Liu, J.; Ji, S.; Zhang, C.; Qin, Z.

    2018-05-01

    Dense stereo matching has been extensively studied in photogrammetry and computer vision. In this paper we evaluate the application of deep learning based stereo methods, which emerged in 2016 and spread rapidly, to aerial stereo images rather than the ground images commonly used in the computer vision community. Two popular methods are evaluated. One learns matching cost with a convolutional neural network (known as MC-CNN); the other produces a disparity map in an end-to-end manner by utilizing both geometry and context (known as GC-Net). First, we evaluate the performance of the deep learning based methods on aerial stereo images by direct model reuse. Models pre-trained on the KITTI 2012, KITTI 2015 and Driving datasets are applied directly to three aerial datasets. We also give the results of direct training on the target aerial datasets. Second, the deep learning based methods are compared to the classic stereo matching method, Semi-Global Matching (SGM), and a photogrammetric software package, SURE, on the same aerial datasets. Third, a transfer learning strategy is introduced for aerial image matching, assuming a few target samples are available for model fine-tuning. The experiments show that the conventional methods and the deep learning based methods performed similarly, and that the latter have greater potential to be explored.

  6. Investigation of self-adaptive LED surgical lighting based on entropy contrast enhancing method

    NASA Astrophysics Data System (ADS)

    Liu, Peng; Wang, Huihui; Zhang, Yaqin; Shen, Junfei; Wu, Rengmao; Zheng, Zhenrong; Li, Haifeng; Liu, Xu

    2014-05-01

    An investigation was performed to explore the possibility of enhancing contrast by varying the spectral power distribution (SPD) of surgical lighting. Illumination scenes with different SPDs were generated by combining a self-adaptive white light optimization method with an LED ceiling system; images of a biological sample were captured by a CCD camera and then processed with an entropy-based contrast evaluation model proposed specifically for surgical settings. Compared with image enhancement based on neutral white LEDs and on traditional algorithms, the illumination-based enhancement method performs better, improving the average contrast value by about 9% and 6%, respectively. This low-cost method is simple and practicable, and may therefore provide an alternative to expensive visual medical instruments.
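
    The paper's exact "Entropy" contrast model is not spelled out in the abstract, so the sketch below uses plain Shannon entropy of the grey-level histogram as a stand-in score for comparing captures made under different SPDs; the images are synthetic.

      import numpy as np

      def shannon_entropy(image, bins=256):
          # Shannon entropy (bits) of the grey-level histogram of an 8-bit image.
          hist, _ = np.histogram(image.ravel(), bins=bins, range=(0, 256))
          p = hist.astype(float) / hist.sum()
          p = p[p > 0]
          return float(-np.sum(p * np.log2(p)))

      # Two synthetic captures of the "same scene": one flat, one with rich grey levels.
      rng = np.random.default_rng(3)
      flat = np.full((128, 128), 120.0) + rng.normal(0, 2, (128, 128))
      rich = rng.integers(0, 256, size=(128, 128)).astype(float)
      print(shannon_entropy(flat), shannon_entropy(rich))   # the richer image scores higher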

  7. A Natural Teaching Method Based on Learning Theory.

    ERIC Educational Resources Information Center

    Smilkstein, Rita

    1991-01-01

    The natural teaching method is active and student-centered, based on schema and constructivist theories, and informed by research in neuroplasticity. A schema is a mental picture or understanding of something we have learned. Humans can have knowledge only to the degree to which they have constructed schemas from learning experiences and practice.…

  8. Effective Teaching Methods--Project-based Learning in Physics

    ERIC Educational Resources Information Center

    Holubova, Renata

    2008-01-01

    The paper presents the results of research into new effective teaching methods in physics and science. The findings indicate that pre-service teachers need to be educated in approaches that stress the importance of students' own activity and in the competences required to create an interdisciplinary project. Project-based physics teaching and learning…

  9. Experimental cocrystal screening and solution based scale-up cocrystallization methods.

    PubMed

    Malamatari, Maria; Ross, Steven A; Douroumis, Dennis; Velaga, Sitaram P

    2017-08-01

    Cocrystals are crystalline single phase materials composed of two or more different molecular and/or ionic compounds, generally in a stoichiometric ratio, which are neither solvates nor simple salts. If one of the components is an active pharmaceutical ingredient (API), the term pharmaceutical cocrystal is often used. There is growing interest among drug development scientists in exploring cocrystals as a means to address physicochemical, biopharmaceutical and mechanical properties and expand the solid form diversity of the API. Conventionally, coformers are selected based on crystal engineering principles, and equimolar mixtures of API and coformers are subjected to the solution-based crystallization methods commonly employed in polymorph and salt screening. However, the availability of new knowledge on cocrystal phase behaviour in the solid state and in solution has spurred the development and implementation of more rational experimental cocrystal screening as well as scale-up methods. This review aims to provide an overview of commonly employed solid form screening techniques in drug development, with an emphasis on cocrystal screening methodologies. The latest developments in understanding and using cocrystal phase diagrams in both screening and solution-based scale-up methods are also presented. The final section is devoted to reviewing state-of-the-art research on solution-based scale-up cocrystallization processes for different cocrystals, as well as more recent continuous crystallization methods. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. An Improved Interferometric Calibration Method Based on Independent Parameter Decomposition

    NASA Astrophysics Data System (ADS)

    Fan, J.; Zuo, X.; Li, T.; Chen, Q.; Geng, X.

    2018-04-01

    Interferometric SAR is sensitive to earth surface undulation. The accuracy of interferometric parameters plays a significant role in producing a precise digital elevation model (DEM). Interferometric calibration aims to obtain a high-precision global DEM by estimating the interferometric parameters using ground control points (GCPs). However, interferometric parameters are always calculated jointly, making them difficult to decompose precisely. In this paper, we propose an interferometric calibration method based on independent parameter decomposition (IPD). Firstly, the parameters related to the interferometric SAR measurement are determined based on the three-dimensional reconstruction model. Secondly, the sensitivity of the interferometric parameters is quantitatively analyzed after the geometric parameters are completely decomposed. Finally, each interferometric parameter is calculated based on IPD and an interferometric calibration model is established. We take Weinan in Shaanxi province as an example and choose 4 TerraDEM-X image pairs to carry out an interferometric calibration experiment. The results show that the elevation accuracy of all SAR images is better than 2.54 m after interferometric calibration. Furthermore, the proposed method can produce DEM products with accuracies better than 2.43 m in flat areas and 6.97 m in mountainous areas, which demonstrates the correctness and effectiveness of the proposed IPD-based interferometric calibration method. The results provide a technical basis for topographic mapping at 1:50 000 and even larger scales in flat and mountainous areas.

  11. Environment-based pin-power reconstruction method for homogeneous core calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leroyer, H.; Brosselard, C.; Girardi, E.

    2012-07-01

    Core calculation schemes are usually based on a classical two-step approach associated with assembly and core calculations. During the first step, infinite lattice assemblies calculations relying on a fundamental mode approach are used to generate cross-sections libraries for PWRs core calculations. This fundamental mode hypothesis may be questioned when dealing with loading patterns involving several types of assemblies (UOX, MOX), burnable poisons, control rods and burn-up gradients. This paper proposes a calculation method able to take into account the heterogeneous environment of the assemblies when using homogeneous core calculations and an appropriate pin-power reconstruction. This methodology is applied to MOX assemblies, computed within an environment of UOX assemblies. The new environment-based pin-power reconstruction is then used on various clusters of 3x3 assemblies showing burn-up gradients and UOX/MOX interfaces, and compared to reference calculations performed with APOLLO-2. The results show that UOX/MOX interfaces are much better calculated with the environment-based calculation scheme when compared to the usual pin-power reconstruction method. The power peak is always better located and calculated with the environment-based pin-power reconstruction method on every cluster configuration studied. This study shows that taking into account the environment in transport calculations can significantly improve the pin-power reconstruction so far as it is consistent with the core loading pattern. (authors)

  12. Reentry trajectory optimization based on a multistage pseudospectral method.

    PubMed

    Zhao, Jiang; Zhou, Rui; Jin, Xuelian

    2014-01-01

    Of the many direct numerical methods, the pseudospectral method serves as an effective tool to solve the reentry trajectory optimization for hypersonic vehicles. However, the traditional pseudospectral method is time-consuming due to large number of discretization points. For the purpose of autonomous and adaptive reentry guidance, the research herein presents a multistage trajectory control strategy based on the pseudospectral method, capable of dealing with the unexpected situations in reentry flight. The strategy typically includes two subproblems: the trajectory estimation and trajectory refining. In each processing stage, the proposed method generates a specified range of trajectory with the transition of the flight state. The full glide trajectory consists of several optimal trajectory sequences. The newly focused geographic constraints in actual flight are discussed thereafter. Numerical examples of free-space flight, target transition flight, and threat avoidance flight are used to show the feasible application of multistage pseudospectral method in reentry trajectory optimization.

  13. Reentry Trajectory Optimization Based on a Multistage Pseudospectral Method

    PubMed Central

    Zhao, Jiang; Zhou, Rui; Jin, Xuelian

    2014-01-01

    Of the many direct numerical methods, the pseudospectral method serves as an effective tool to solve the reentry trajectory optimization for hypersonic vehicles. However, the traditional pseudospectral method is time-consuming due to large number of discretization points. For the purpose of autonomous and adaptive reentry guidance, the research herein presents a multistage trajectory control strategy based on the pseudospectral method, capable of dealing with the unexpected situations in reentry flight. The strategy typically includes two subproblems: the trajectory estimation and trajectory refining. In each processing stage, the proposed method generates a specified range of trajectory with the transition of the flight state. The full glide trajectory consists of several optimal trajectory sequences. The newly focused geographic constraints in actual flight are discussed thereafter. Numerical examples of free-space flight, target transition flight, and threat avoidance flight are used to show the feasible application of multistage pseudospectral method in reentry trajectory optimization. PMID:24574929

  14. A vision-based method for planar position measurement

    NASA Astrophysics Data System (ADS)

    Chen, Zong-Hao; Huang, Peisen S.

    2016-12-01

    In this paper, a vision-based method is proposed for three-degree-of-freedom (3-DOF) planar position (X, Y, θZ) measurement. This method uses a single camera to capture the image of a 2D periodic pattern and then uses the 2D discrete Fourier transform (2D DFT) method to estimate the phase of its fundamental frequency component for position measurement. To improve position measurement accuracy, the phase estimation error of the 2D DFT is analyzed and a phase estimation method is proposed. Different simulations are done to verify the feasibility of this method and study the factors that influence the accuracy and precision of phase estimation. To demonstrate the performance of the proposed method for position measurement, a prototype encoder consisting of a black-and-white industrial camera with VGA resolution (480 × 640 pixels) and an iPhone 4s has been developed. Experimental results show the peak-to-peak resolutions to be 3.5 nm in the X axis, 8 nm in the Y axis and 4 μrad in the θZ axis. The corresponding RMS resolutions are 0.52 nm, 1.06 nm, and 0.60 μrad, respectively.
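
    The core idea, that an in-plane shift of a periodic pattern appears as a phase change of its fundamental DFT component, can be illustrated in a few lines; the pattern period, image size and test shifts below are assumptions, not the prototype's parameters.

      import numpy as np

      PERIOD = 16    # pattern period in pixels (assumed)
      N = 256        # image size in pixels (assumed)

      def make_pattern(shift_x, shift_y):
          # Synthetic 2D periodic pattern shifted by (shift_x, shift_y) pixels.
          y, x = np.mgrid[0:N, 0:N]
          return (np.cos(2 * np.pi * (x - shift_x) / PERIOD) +
                  np.cos(2 * np.pi * (y - shift_y) / PERIOD))

      def fundamental_phase(img):
          # Phase of the fundamental frequency component along x and along y.
          F = np.fft.fft2(img)
          k = N // PERIOD
          return np.angle(F[0, k]), np.angle(F[k, 0])

      ref = make_pattern(0.0, 0.0)
      mov = make_pattern(1.37, -0.62)          # sub-pixel shift to be recovered
      px_r, py_r = fundamental_phase(ref)
      px_m, py_m = fundamental_phase(mov)
      dx = -(px_m - px_r) * PERIOD / (2 * np.pi)
      dy = -(py_m - py_r) * PERIOD / (2 * np.pi)
      print(dx, dy)                            # approximately 1.37 and -0.62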

  15. A method for data base management and analysis for wind tunnel data

    NASA Technical Reports Server (NTRS)

    Biser, Aileen O.

    1987-01-01

    To respond to the need for improved data base management and analysis capabilities for wind-tunnel data at the Langley 16-Foot Transonic Tunnel, research was conducted into current methods of managing wind-tunnel data and a method was developed as a solution to this need. This paper describes the development of the data base management and analysis method for wind-tunnel data. The design and implementation of the software system are discussed and examples of its use are shown.

  16. Stable isotope labelling methods in mass spectrometry-based quantitative proteomics.

    PubMed

    Chahrour, Osama; Cobice, Diego; Malone, John

    2015-09-10

    Mass-spectrometry based proteomics has evolved as a promising technology over the last decade and is undergoing dramatic development in a number of different areas, such as mass spectrometric instrumentation, peptide identification algorithms and bioinformatic computational data analysis. The improved methodology allows quantitative measurement of relative or absolute protein amounts, which is essential for gaining insights into their functions and dynamics in biological systems. Several different strategies are possible, involving stable isotope labels (ICAT, ICPL, IDBEST, iTRAQ, TMT, IPTL, SILAC), label-free statistical assessment approaches (MRM, SWATH) and absolute quantification methods (AQUA), each having specific strengths and weaknesses. Inductively coupled plasma mass spectrometry (ICP-MS), which is still widely recognised as an elemental detector, has recently emerged as a complementary technique to the previous methods. The new application area for ICP-MS targets the fast growing field of proteomics related research, allowing absolute protein quantification using suitable element-based tags. This document describes the different stable isotope labelling methods, which include metabolic labelling in live cells, ICP-MS based detection and post-harvest chemical label tagging for protein quantification, and summarises their pros and cons. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Evaluation method based on the image correlation for laser jamming image

    NASA Astrophysics Data System (ADS)

    Che, Jinxi; Li, Zhongmin; Gao, Bo

    2013-09-01

    The evaluation of jamming effectiveness against infrared imaging systems is an important part of electro-optical countermeasures. Infrared imaging devices are widely used in military searching, tracking, guidance and many other fields. At the same time, with the continuous development of laser technology, research on laser interference and damage effects has grown, and lasers have been used to disturb infrared imaging devices. Therefore, evaluating the effect of laser jamming on infrared imaging systems has become a meaningful problem to be solved. The information that an infrared imaging system ultimately presents to the user is an image, so the jamming effect can be evaluated from the standpoint of image quality. An image contains two kinds of information, light amplitude and light phase, so image correlation can accurately capture the difference between the original image and the disturbed image. In this paper, the digital image correlation method, the Fourier-transform-based image quality assessment method, the error-statistics-based image quality assessment method and the peak signal-to-noise ratio method are analysed, along with their respective advantages and disadvantages. Moreover, infrared jamming images from experiments in which a thermal infrared imager was interfered with by a laser were analysed using these methods. The results show that these methods reflect the laser jamming effect on the infrared imaging system well, that their evaluation results are in good agreement with subjective visual evaluation, and that they offer good repeatability and convenient quantitative analysis. The feasibility of the methods for evaluating the jamming effect was thus demonstrated. This has some reference value for studying and developing electro-optical countermeasure equipment and

  18. Progress in Working Towards a More Sustainable Agri-Food Industry

    EPA Science Inventory

    The human health and environmental issues related to food, feed, and bio-based systems range widely from greenhouse gas emissions and energy use to land use, water availability, soil quality, water quality and quantity, biodiversity losses, and chemical exposure. Threats that s...

  19. A Fast Method for Embattling Optimization of Ground-Based Radar Surveillance Network

    NASA Astrophysics Data System (ADS)

    Jiang, H.; Cheng, H.; Zhang, Y.; Liu, J.

    A growing number of space activities have created an orbital debris environment that poses increasing impact risks to existing space systems and human space flight. For the safety of in-orbit spacecraft, many observation facilities are needed to catalog space objects, especially in low Earth orbit. Surveillance of low Earth orbit objects relies mainly on ground-based radar; owing to the limited capability of existing radar facilities, a large number of ground-based radars will need to be built in the next few years to meet current space surveillance demands. How to optimize the embattling (station layout) of a ground-based radar surveillance network is therefore a problem that needs to be solved. The traditional method for embattling optimization of a ground-based radar surveillance network relies on detection simulation of all possible stations with cataloged data, a comprehensive comparative analysis of the simulation results using a combinational method, and selection of an optimal result as the station layout scheme. This method is time-consuming for a single simulation and has high computational complexity in the combinational analysis; as the number of stations increases, the complexity of the optimization problem grows exponentially and cannot be handled by the traditional method. Until now there has been no better way to solve this problem. In this paper, the target detection procedure is simplified. Firstly, the space coverage of ground-based radar is simplified and a space coverage projection model of radar facilities at different orbit altitudes is built; then a simplified model of objects crossing the radar coverage is established according to the characteristics of space object orbital motion. After these two simplifications, the computational complexity of target detection is greatly reduced, and simulation results show the correctness of the simplified models. In addition, the detection areas of a ground-based radar network can be easily computed with the

  20. Comparative analysis of ROS-based monocular SLAM methods for indoor navigation

    NASA Astrophysics Data System (ADS)

    Buyval, Alexander; Afanasyev, Ilya; Magid, Evgeni

    2017-03-01

    This paper presents a comparison of four recent ROS-based monocular SLAM-related methods: ORB-SLAM, REMODE, LSD-SLAM, and DPPTAM, and analyzes their feasibility for a mobile robot application in an indoor environment. We tested these methods using video data recorded from a conventional wide-angle full HD webcam with a rolling shutter. The camera was mounted on a human-operated prototype of an unmanned ground vehicle, which followed a closed-loop trajectory. Both the feature-based methods (ORB-SLAM, REMODE) and the direct SLAM-related algorithms (LSD-SLAM, DPPTAM) demonstrated reasonably good results in detecting volumetric objects, corners, obstacles and other local features. However, we encountered difficulties in recovering the homogeneously colored walls typical of offices, since all of these methods left empty spaces in the reconstructed sparse 3D scene. This may cause collisions of an autonomously guided robot with unfeatured walls and thus limits the applicability of maps obtained by the considered monocular SLAM-related methods for indoor robot navigation.

  1. A new rational-based optimal design strategy of ship structure based on multi-level analysis and super-element modeling method

    NASA Astrophysics Data System (ADS)

    Sun, Li; Wang, Deyu

    2011-09-01

    A new multi-level analysis method introducing super-element modeling, derived from the multi-level analysis method first proposed by O. F. Hughes, is proposed in this paper to address the high computational cost of adopting a rational-based optimal design method for ship structural design. The method was verified by its effective application to the optimization of the mid-ship section of a container ship. A full 3-D FEM model of a ship subjected to static and quasi-static loads was used to evaluate the structural performance of the mid-ship module, including static strength and buckling performance. The results reveal that this new method can substantially reduce the computational cost of the rational-based optimization problem without decreasing its accuracy, which increases the feasibility and economic efficiency of using a rational-based optimal design method in ship structural design.

  2. The Application of Continuous Wavelet Transform Based Foreground Subtraction Method in 21 cm Sky Surveys

    NASA Astrophysics Data System (ADS)

    Gu, Junhua; Xu, Haiguang; Wang, Jingying; An, Tao; Chen, Wen

    2013-08-01

    We propose a continuous wavelet transform based non-parametric foreground subtraction method for the detection of the redshifted 21 cm signal from the epoch of reionization. This method is based on the assumption that the foreground spectra are smooth in the frequency domain, while the 21 cm signal spectrum is full of saw-tooth-like structures, so their characteristic scales are significantly different. We can therefore distinguish them easily in the wavelet coefficient space and perform the foreground subtraction. Compared with the traditional spectral fitting based method, our method is more tolerant of complex foregrounds. Furthermore, we find that when the instrument has uncorrected response errors, our method performs significantly better than the spectral fitting based method. Our method obtains results similar to those of the Wp smoothing method, which is also non-parametric, but consumes much less computing time.
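
    As a rough stand-in for the continuous wavelet transform formulation (and using the PyWavelets discrete transform instead), the sketch below removes the coarse, smooth scales of a synthetic spectrum and keeps the fine-scale residual; the amplitudes, band and wavelet choice are illustrative only.

      import numpy as np
      import pywt   # PyWavelets

      rng = np.random.default_rng(7)
      freq = np.linspace(100, 200, 512)                 # MHz, illustrative band
      foreground = 100.0 * (freq / 150.0) ** -2.5       # smooth power-law foreground
      signal = 0.5 * rng.normal(size=freq.size)         # rapidly fluctuating "signal"
      observed = foreground + signal

      # Decompose, zero the coarse (smooth) scales, and reconstruct the residual.
      coeffs = pywt.wavedec(observed, "db4", level=6)
      for j in range(4):                                # zero cA6, cD6, cD5, cD4
          coeffs[j] = np.zeros_like(coeffs[j])
      recovered = pywt.waverec(coeffs, "db4")[:freq.size]

      print(np.std(recovered - signal), np.std(signal))  # residual error vs. signal level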

  3. Cepstrum based feature extraction method for fungus detection

    NASA Astrophysics Data System (ADS)

    Yorulmaz, Onur; Pearson, Tom C.; Çetin, A. Enis

    2011-06-01

    In this paper, a method for detection of popcorn kernels infected by a fungus is developed using image processing. The method is based on two dimensional (2D) mel and Mellin-cepstrum computation from popcorn kernel images. Cepstral features that were extracted from popcorn images are classified using Support Vector Machines (SVM). Experimental results show that high recognition rates of up to 93.93% can be achieved for both damaged and healthy popcorn kernels using 2D mel-cepstrum. The success rate for healthy popcorn kernels was found to be 97.41% and the recognition rate for damaged kernels was found to be 89.43%.
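
    The sketch below follows the spirit of the approach with a plain 2D real cepstrum (the mel/Mellin warping of the original method is omitted): low-quefrency coefficients of log|FFT| are used as features for an SVM; the "kernel images" are synthetic stand-ins.

      import numpy as np
      from sklearn.svm import SVC

      def cepstrum_features(img, block=8):
          # Low-quefrency block of the 2D real cepstrum as a flat feature vector.
          spec = np.abs(np.fft.fft2(img)) + 1e-8
          ceps = np.real(np.fft.ifft2(np.log(spec)))
          return ceps[:block, :block].ravel()

      # Synthetic stand-ins for "healthy" vs. "damaged" kernel images.
      rng = np.random.default_rng(4)
      def toy_image(damaged):
          img = rng.normal(0.5, 0.05, size=(32, 32))
          if damaged:                         # damaged kernels get extra mottled texture
              img = img + 0.2 * rng.normal(size=(32, 32))
          return img

      labels = np.array([0] * 50 + [1] * 50)
      X = np.array([cepstrum_features(toy_image(d)) for d in labels])
      clf = SVC(kernel="rbf", gamma="scale").fit(X[::2], labels[::2])   # train on half
      print(clf.score(X[1::2], labels[1::2]))                           # test on the rest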

  4. A Self-Adaptive Model-Based Wi-Fi Indoor Localization Method.

    PubMed

    Tuta, Jure; Juric, Matjaz B

    2016-12-06

    This paper presents a novel method for indoor localization, developed with the main aim of making it useful for real-world deployments. Many indoor localization methods exist, yet they have several disadvantages in real-world deployments-some are static, which is not suitable for long-term usage; some require costly human recalibration procedures; and others require special hardware such as Wi-Fi anchors and transponders. Our method is self-calibrating and self-adaptive thus maintenance free and based on Wi-Fi only. We have employed two well-known propagation models-free space path loss and ITU models-which we have extended with additional parameters for better propagation simulation. Our self-calibrating procedure utilizes one propagation model to infer parameters of the space and the other to simulate the propagation of the signal without requiring any additional hardware beside Wi-Fi access points, which is suitable for real-world usage. Our method is also one of the few model-based Wi-Fi only self-adaptive approaches that do not require the mobile terminal to be in the access-point mode. The only input requirements of the method are Wi-Fi access point positions, and positions and properties of the walls. Our method has been evaluated in single- and multi-room environments, with measured mean error of 2-3 and 3-4 m, respectively, which is similar to existing methods. The evaluation has proven that usable localization accuracy can be achieved in real-world environments solely by the proposed Wi-Fi method that relies on simple hardware and software requirements.
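
    Only the generic model-based ingredient is sketched here: a log-distance path-loss model converts RSSI to distance and a least-squares fit over known access point positions yields a position estimate. The path-loss exponent, reference power and AP layout are assumptions, not the paper's calibrated values.

      import numpy as np
      from scipy.optimize import least_squares

      P0, N_EXP = -40.0, 2.2     # RSSI at 1 m (dBm) and path-loss exponent (assumed)

      def rssi_to_distance(rssi):
          # Invert the log-distance path-loss model: rssi = P0 - 10 * n * log10(d).
          return 10 ** ((P0 - rssi) / (10 * N_EXP))

      def localize(ap_positions, rssi_values):
          # Least-squares position estimate from model-predicted distances to each AP.
          d = rssi_to_distance(np.asarray(rssi_values, dtype=float))
          residuals = lambda p: np.linalg.norm(ap_positions - p, axis=1) - d
          return least_squares(residuals, x0=ap_positions.mean(axis=0)).x

      aps = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0], [10.0, 8.0]])
      true_pos = np.array([3.0, 5.0])
      rssi = P0 - 10 * N_EXP * np.log10(np.linalg.norm(aps - true_pos, axis=1))
      print(localize(aps, rssi))   # close to (3, 5) in this noise-free toy case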

  5. A Self-Adaptive Model-Based Wi-Fi Indoor Localization Method

    PubMed Central

    Tuta, Jure; Juric, Matjaz B.

    2016-01-01

    This paper presents a novel method for indoor localization, developed with the main aim of making it useful for real-world deployments. Many indoor localization methods exist, yet they have several disadvantages in real-world deployments—some are static, which is not suitable for long-term usage; some require costly human recalibration procedures; and others require special hardware such as Wi-Fi anchors and transponders. Our method is self-calibrating and self-adaptive thus maintenance free and based on Wi-Fi only. We have employed two well-known propagation models—free space path loss and ITU models—which we have extended with additional parameters for better propagation simulation. Our self-calibrating procedure utilizes one propagation model to infer parameters of the space and the other to simulate the propagation of the signal without requiring any additional hardware beside Wi-Fi access points, which is suitable for real-world usage. Our method is also one of the few model-based Wi-Fi only self-adaptive approaches that do not require the mobile terminal to be in the access-point mode. The only input requirements of the method are Wi-Fi access point positions, and positions and properties of the walls. Our method has been evaluated in single- and multi-room environments, with measured mean error of 2–3 and 3–4 m, respectively, which is similar to existing methods. The evaluation has proven that usable localization accuracy can be achieved in real-world environments solely by the proposed Wi-Fi method that relies on simple hardware and software requirements. PMID:27929453

  6. Breast histopathology image segmentation using spatio-colour-texture based graph partition method.

    PubMed

    Belsare, A D; Mushrif, M M; Pangarkar, M A; Meshram, N

    2016-06-01

    This paper proposes a novel integrated spatio-colour-texture based graph partitioning method for segmenting nuclear arrangements in tubules with a lumen, or in solid islands without a lumen, from digitized Hematoxylin-Eosin stained breast histology images, in order to automate histology breast image analysis and assist pathologists. We propose a new similarity based superpixel generation method and integrate it with texton representation to form a spatio-colour-texture map of the breast histology image. A new weighted distance based similarity measure is then used to generate a graph, and the final segmentation is obtained using the normalized cuts method. Extensive experiments show that the proposed algorithm can segment nuclear arrangements in normal as well as malignant ducts in breast histology tissue images. For evaluation of the proposed method, a ground-truth image database of 100 malignant and nonmalignant breast histology images was created with the help of two expert pathologists, and a quantitative evaluation of the proposed breast histology image segmentation was performed. It shows that the proposed method outperforms other methods. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.

  7. Validation of a new screening, determinative, and confirmatory multi-residue method for nitroimidazoles and their hydroxy metabolites in turkey muscle tissue by liquid chromatography-tandem mass spectrometry.

    PubMed

    Boison, Joe O; Asea, Philip A; Matus, Johanna L

    2012-08-01

    A new and sensitive multi-residue method (MRM) with detection by LC-MS/MS was developed and validated for the screening, determination, and confirmation of residues of 7 nitroimidazoles and 3 of their metabolites in turkey muscle tissues at concentrations ≥ 0.05 ng/g. The compounds were extracted into a solvent with an alkali salt. Sample clean-up and concentration were then performed by solid-phase extraction (SPE), and the compounds were quantified by liquid chromatography-tandem mass spectrometry (LC-MS/MS). The characteristic parameters of the new method, including repeatability, selectivity, ruggedness, stability, level of quantification, and level of confirmation, were determined. Method validation was achieved by independent verification of the parameters measured during method characterization. The seven nitroimidazoles included are metronidazole (MTZ), ronidazole (RNZ), dimetridazole (DMZ), tinidazole (TNZ), ornidazole (ONZ), ipronidazole (IPR), and carnidazole (CNZ). It was found during the single laboratory validation of the method that five of the seven nitroimidazoles (i.e. metronidazole, dimetridazole, tinidazole, ornidazole and ipronidazole) and the 3 metabolites (1-(2-hydroxyethyl)-2-hydroxymethyl-5-nitroimidazole (MTZ-OH), 2-hydroxymethyl-1-methyl-5-nitroimidazole (HMMNI, the common metabolite of ronidazole and dimetridazole), and 1-methyl-2-(2'-hydroxyisopropyl)-5-nitroimidazole (IPR-OH)) included in this study could be detected, confirmed, and quantified accurately, whereas RNZ and CNZ could only be detected and confirmed but not accurately quantified. © Her Majesty the Queen in Right of Canada as Represented by the Minister of Agriculture and Agri-food Canada 2012.

  8. Membership determination of open clusters based on a spectral clustering method

    NASA Astrophysics Data System (ADS)

    Gao, Xin-Hua

    2018-06-01

    We present a spectral clustering (SC) method aimed at segregating reliable members of open clusters in multi-dimensional space. The SC method is a non-parametric clustering technique that performs cluster division using eigenvectors of the similarity matrix; no prior knowledge of the clusters is required. This method is more flexible in dealing with multi-dimensional data compared to other methods of membership determination. We use this method to segregate the cluster members of five open clusters (Hyades, Coma Ber, Pleiades, Praesepe, and NGC 188) in five-dimensional space; fairly clean cluster members are obtained. We find that the SC method can capture a small number of cluster members (weak signal) from a large number of field stars (heavy noise). Based on these cluster members, we compute the mean proper motions and distances for the Hyades, Coma Ber, Pleiades, and Praesepe clusters, and our results are in general quite consistent with the results derived by other authors. The test results indicate that the SC method is highly suitable for segregating cluster members of open clusters based on high-precision multi-dimensional astrometric data such as Gaia data.
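
    A minimal sketch of the general approach, not the authors' pipeline: spectral clustering applied to a synthetic five-dimensional data set in which a compact "member" clump is embedded in a dispersed field population; all parameters and data are illustrative.

      import numpy as np
      from sklearn.cluster import SpectralClustering

      rng = np.random.default_rng(5)
      members = rng.normal(loc=[10.0, -5.0, 7.0, 0.0, 0.0], scale=0.3, size=(60, 5))
      field = rng.normal(loc=[0.0, 0.0, 3.0, 0.0, 0.0], scale=5.0, size=(600, 5))
      X = np.vstack([members, field])          # 5D astrometric-style toy data

      sc = SpectralClustering(n_clusters=2, affinity="nearest_neighbors",
                              n_neighbors=10, random_state=0)
      labels = sc.fit_predict(X)
      cluster_label = np.argmin(np.bincount(labels))      # the smaller, compact group
      recovered = (labels[:60] == cluster_label).mean()
      print(recovered)    # fraction of the injected "members" assigned to the clump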

  9. Hybrid charge division multiplexing method for silicon photomultiplier based PET detectors

    NASA Astrophysics Data System (ADS)

    Park, Haewook; Ko, Guen Bae; Lee, Jae Sung

    2017-06-01

    Silicon photomultiplier (SiPM) is widely utilized in various positron emission tomography (PET) detectors and systems. However, the individual recording of SiPM output signals is still challenging owing to the high granularity of the SiPM; thus, charge division multiplexing is commonly used in PET detectors. The resistive charge division method is well established for reducing the number of output channels in conventional multi-channel photosensors, but it degrades the timing performance of SiPM-based PET detectors by yielding a large resistor-capacitor (RC) constant. The capacitive charge division method, on the other hand, yields a small RC constant and provides a faster timing response than the resistive method, but it suffers from an output signal undershoot. Therefore, in this study, we propose a hybrid charge division method which can be implemented by cascading the parallel combination of a resistor and a capacitor throughout the multiplexing network. In order to compare the performance of the proposed method with the conventional methods, a 16-channel Hamamatsu SiPM (S11064-050P) was coupled with a 4 × 4 LGSO crystal block (3 × 3 × 20 mm3) and a 9 × 9 LYSO crystal block (1.2 × 1.2 × 10 mm3). In addition, we tested a time-over-threshold (TOT) readout using the digitized position signals to further demonstrate the feasibility of the time-based readout of multiplexed signals based on the proposed method. The results indicated that the proposed method exhibited good energy and timing performance, thus inheriting only the advantages of the conventional resistive and capacitive methods. Moreover, the proposed method showed excellent pulse shape uniformity that does not depend on the position of the interacted crystal. Accordingly, we can conclude that the hybrid charge division method is useful for effectively reducing the number of output channels of the SiPM array.
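
    The decoding arithmetic that any charge-division network (resistive, capacitive or hybrid) ultimately relies on can be shown with an idealised linear charge split; the network itself is analogue hardware and is not modelled here.

      def corner_signals(x, y, energy=1.0):
          # Idealised linear charge split of one event at normalised (x, y) in [0, 1]^2.
          a = energy * (1 - x) * (1 - y)   # corner at (0, 0)
          b = energy * x * (1 - y)         # corner at (1, 0)
          c = energy * (1 - x) * y         # corner at (0, 1)
          d = energy * x * y               # corner at (1, 1)
          return a, b, c, d

      def decode(a, b, c, d):
          # Anger-style decoding of position and total charge from the four corners.
          s = a + b + c + d
          return (b + d) / s, (c + d) / s, s

      a, b, c, d = corner_signals(0.25, 0.75, energy=511.0)
      print(decode(a, b, c, d))    # (0.25, 0.75, 511.0)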

  10. The attitude inversion method of geostationary satellites based on unscented particle filter

    NASA Astrophysics Data System (ADS)

    Du, Xiaoping; Wang, Yang; Hu, Heng; Gou, Ruixin; Liu, Hao

    2018-04-01

    The attitude information of geostationary satellites is difficult to obtain because they appear as non-resolved images on ground-based observation equipment used for space object surveillance. In this paper, an attitude inversion method for geostationary satellites based on the Unscented Particle Filter (UPF) and ground photometric data is presented. The UPF-based inversion algorithm is proposed to deal with the strongly non-linear character of photometric data inversion for satellite attitude, and combines the advantages of the Unscented Kalman Filter (UKF) and the Particle Filter (PF). The update method improves particle selection by using the UKF idea to redesign the importance density function. Moreover, it uses the RMS-UKF to partially correct the prediction covariance matrix, which improves the applicability of UKF-based attitude inversion and mitigates the particle degradation and dilution of PF-based attitude inversion. This paper describes the main principles and steps of the algorithm in detail, and the correctness, accuracy, stability and applicability of the method are verified by simulation and scaling experiments. The results show that the proposed method can effectively solve the problems of particle degradation and depletion in PF-based attitude inversion and of UKF being unsuitable for strongly non-linear attitude inversion; furthermore, its inversion accuracy is clearly superior to UKF and PF, and even with large attitude errors it can invert the attitude with few particles and high precision.

  11. Advanced image based methods for structural integrity monitoring: Review and prospects

    NASA Astrophysics Data System (ADS)

    Farahani, Behzad V.; Sousa, Pedro José; Barros, Francisco; Tavares, Paulo J.; Moreira, Pedro M. G. P.

    2018-02-01

    There is a growing trend in engineering to develop methods for structural integrity monitoring and characterization of in-service mechanical behaviour of components. The fast growth in recent years of image processing techniques and image-based sensing for experimental mechanics, brought about a paradigm change in phenomena sensing. Hence, several widely applicable optical approaches are playing a significant role in support of experiment. The current review manuscript describes advanced image based methods for structural integrity monitoring, and focuses on methods such as Digital Image Correlation (DIC), Thermoelastic Stress Analysis (TSA), Electronic Speckle Pattern Interferometry (ESPI) and Speckle Pattern Shearing Interferometry (Shearography). These non-contact full-field techniques rely on intensive image processing methods to measure mechanical behaviour, and evolve even as reviews such as this are being written, which justifies a special effort to keep abreast of this progress.

  12. An adaptive block-based fusion method with LUE-SSIM for multi-focus images

    NASA Astrophysics Data System (ADS)

    Zheng, Jianing; Guo, Yongcai; Huang, Yukun

    2016-09-01

    Because of the lenses' limited depth of field, digital cameras are incapable of acquiring an all-in-focus image of objects at varying distances in a scene. Multi-focus image fusion techniques can effectively solve this problem. For block-based multi-focus image fusion methods, blocking artifacts often occur. An adaptive block-based fusion method based on lifting undistorted-edge structural similarity (LUE-SSIM) is put forward. In this method, the image quality metric LUE-SSIM is first proposed; it utilizes characteristics of the human visual system (HVS) and structural similarity (SSIM) to make the metric consistent with human visual perception. A particle swarm optimization (PSO) algorithm that takes LUE-SSIM as its objective function is used to optimize the block size for constructing the fused image. Experimental results on the LIVE image database show that LUE-SSIM outperforms SSIM in assessing the quality of Gaussian defocus blur images. In addition, a multi-focus image fusion experiment is carried out to verify the proposed fusion method in terms of visual and quantitative evaluation. The results show that the proposed method performs better than some other block-based methods, especially in reducing blocking artifacts in the fused image, and that it can effectively preserve undistorted edge details in the focused regions of the source images.

  13. Likelihood-based methods for evaluating principal surrogacy in augmented vaccine trials.

    PubMed

    Liu, Wei; Zhang, Bo; Zhang, Hui; Zhang, Zhiwei

    2017-04-01

    There is growing interest in assessing immune biomarkers, which are quick to measure and potentially predictive of long-term efficacy, as surrogate endpoints in randomized, placebo-controlled vaccine trials. This can be done under a principal stratification approach, with principal strata defined using a subject's potential immune responses to vaccine and placebo (the latter may be assumed to be zero). In this context, principal surrogacy refers to the extent to which vaccine efficacy varies across principal strata. Because a placebo recipient's potential immune response to vaccine is unobserved in a standard vaccine trial, augmented vaccine trials have been proposed to produce the information needed to evaluate principal surrogacy. This article reviews existing methods based on an estimated likelihood and a pseudo-score (PS) and proposes two new methods based on a semiparametric likelihood (SL) and a pseudo-likelihood (PL), for analyzing augmented vaccine trials. Unlike the PS method, the SL method does not require a model for missingness, which can be advantageous when immune response data are missing by happenstance. The SL method is shown to be asymptotically efficient, and it performs similarly to the PS and PL methods in simulation experiments. The PL method appears to have a computational advantage over the PS and SL methods.

  14. Continental-Scale Validation of Modis-Based and LEDAPS Landsat ETM + Atmospheric Correction Methods

    NASA Technical Reports Server (NTRS)

    Ju, Junchang; Roy, David P.; Vermote, Eric; Masek, Jeffrey; Kovalskyy, Valeriy

    2012-01-01

    The potential of Landsat data processing to provide systematic continental scale products has been demonstrated by several projects including the NASA Web-enabled Landsat Data (WELD) project. The recent free availability of Landsat data increases the need for robust and efficient atmospheric correction algorithms applicable to large volume Landsat data sets. This paper compares the accuracy of two Landsat atmospheric correction methods: a MODIS-based method and the Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) method. Both methods are based on the 6SV radiative transfer code but have different atmospheric characterization approaches. The MODIS-based method uses the MODIS Terra derived dynamic aerosol type, aerosol optical thickness, and water vapor to atmospherically correct ETM+ acquisitions in each coincident orbit. The LEDAPS method uses aerosol characterizations derived independently from each Landsat acquisition and assumes a fixed continental aerosol type and uses ancillary water vapor. Validation results are presented comparing ETM+ atmospherically corrected data generated using these two methods with AERONET corrected ETM+ data for 95 10 km × 10 km 30 m subsets, a total of nearly 8 million 30 m pixels, located across the conterminous United States. The results indicate that the MODIS-based method has better accuracy than the LEDAPS method for the ETM+ red and longer wavelength bands.

  15. Sea Ice Detection Based on an Improved Similarity Measurement Method Using Hyperspectral Data.

    PubMed

    Han, Yanling; Li, Jue; Zhang, Yun; Hong, Zhonghua; Wang, Jing

    2017-05-15

    Hyperspectral remote sensing technology can acquire nearly continuous spectrum information and rich sea ice image information, thus providing an important means of sea ice detection. However, the correlation and redundancy among hyperspectral bands reduce the accuracy of traditional sea ice detection methods. Based on the spectral characteristics of sea ice, this study presents an improved similarity measurement method based on linear prediction (ISMLP) to detect sea ice. First, the first original band with a large amount of information is determined based on mutual information theory. Subsequently, a second original band with the least similarity is chosen by the spectral correlation measuring method. Finally, subsequent bands are selected through the linear prediction method, and a support vector machine classifier model is applied to classify sea ice. In experiments performed on images of Baffin Bay and Bohai Bay, comparative analyses were conducted to compare the proposed method and traditional sea ice detection methods. Our proposed ISMLP method achieved the highest classification accuracies (91.18% and 94.22%) in both experiments. From these results the ISMLP method exhibits better performance overall than other methods and can be effectively applied to hyperspectral sea ice detection.
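
    A hedged sketch of linear-prediction band selection in the spirit of ISMLP: the seeding step is simplified to "largest variance" rather than the paper's mutual-information criterion, and each further band is the one least well predicted, in the least-squares sense, by the bands already chosen.

      import numpy as np

      def select_bands(cube, n_select=5):
          # cube: (n_pixels, n_bands) matrix; returns indices of the selected bands.
          n_pixels, n_bands = cube.shape
          X = cube - cube.mean(axis=0)
          selected = [int(np.argmax(X.var(axis=0)))]        # seed: most informative band
          while len(selected) < n_select:
              A = X[:, selected]                             # predictors: chosen bands
              residuals = np.full(n_bands, -np.inf)
              for b in range(n_bands):
                  if b in selected:
                      continue
                  coef, *_ = np.linalg.lstsq(A, X[:, b], rcond=None)
                  residuals[b] = np.sum((X[:, b] - A @ coef) ** 2)
              selected.append(int(np.argmax(residuals)))     # least predictable band next
          return selected

      # Toy "hyperspectral" cube: 2000 pixels, 50 strongly correlated bands.
      rng = np.random.default_rng(6)
      cube = rng.normal(size=(2000, 5)) @ rng.normal(size=(5, 50))
      cube += 0.05 * rng.normal(size=(2000, 50))
      print(select_bands(cube, n_select=5))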

  16. Sea Ice Detection Based on an Improved Similarity Measurement Method Using Hyperspectral Data

    PubMed Central

    Han, Yanling; Li, Jue; Zhang, Yun; Hong, Zhonghua; Wang, Jing

    2017-01-01

    Hyperspectral remote sensing technology can acquire nearly continuous spectrum information and rich sea ice image information, thus providing an important means of sea ice detection. However, the correlation and redundancy among hyperspectral bands reduce the accuracy of traditional sea ice detection methods. Based on the spectral characteristics of sea ice, this study presents an improved similarity measurement method based on linear prediction (ISMLP) to detect sea ice. First, the first original band with a large amount of information is determined based on mutual information theory. Subsequently, a second original band with the least similarity is chosen by the spectral correlation measuring method. Finally, subsequent bands are selected through the linear prediction method, and a support vector machine classifier model is applied to classify sea ice. In experiments performed on images of Baffin Bay and Bohai Bay, comparative analyses were conducted to compare the proposed method and traditional sea ice detection methods. Our proposed ISMLP method achieved the highest classification accuracies (91.18% and 94.22%) in both experiments. From these results the ISMLP method exhibits better performance overall than other methods and can be effectively applied to hyperspectral sea ice detection. PMID:28505135

  17. Mixed ionic-electronic conductor-based radiation detectors and methods of fabrication

    DOEpatents

    Conway, Adam; Beck, Patrick R; Graff, Robert T; Nelson, Art; Nikolic, Rebecca J; Payne, Stephen A; Voss, Lars; Kim, Hadong

    2015-04-07

    A mixed ionic-electronic conductor (MIEC, e.g. TlBr)-based radiation detector having halide-treated surfaces, and associated methods of fabrication, in which polarization of the MIEC material is controlled to improve stability and operational lifetime.

  18. High-speed engine/component performance assessment using exergy and thrust-based methods

    NASA Technical Reports Server (NTRS)

    Riggins, D. W.

    1996-01-01

    This investigation summarizes a comparative study of two high-speed engine performance assessment techniques based on energy (available work) and thrust potential (thrust availability). Simple flow-fields utilizing Rayleigh heat addition and one-dimensional flow with friction are used to demonstrate the fundamental inability of conventional energy techniques to predict engine component performance, aid in component design, or accurately assess flow losses. The use of the thrust-based method on these same examples demonstrates its ability to yield useful information in all these categories. Energy and thrust are related and discussed from the standpoint of their fundamental thermodynamic and fluid dynamic definitions in order to explain the differences in information obtained using the two methods. The conventional definition of energy is shown to include work which is inherently unavailable to an aerospace Brayton engine. An engine-based energy is then developed which accurately accounts for this inherently unavailable work; performance parameters based on this quantity are then shown to yield design and loss information equivalent to the thrust-based method.

  19. A Micromechanics-Based Method for Multiscale Fatigue Prediction

    NASA Astrophysics Data System (ADS)

    Moore, John Allan

    An estimated 80% of all structural failures are due to mechanical fatigue, often resulting in catastrophic, dangerous and costly failure events. However, an accurate model to predict fatigue remains an elusive goal. One of the major challenges is that fatigue is intrinsically a multiscale process, which is dependent on a structure's geometric design as well as its material's microscale morphology. The following work begins with a microscale study of fatigue nucleation around non-metallic inclusions. Based on this analysis, a novel multiscale method for fatigue predictions is developed. This method simulates macroscale geometries explicitly while concurrently calculating the simplified response of microscale inclusions, thus providing adequate detail on multiple scales for accurate fatigue life predictions. The methods herein provide insight into the multiscale nature of fatigue, while also developing a tool to aid in geometric design and material optimization for fatigue critical devices such as biomedical stents and artificial heart valves.

  20. An Inquiry-Based Approach to Teaching Research Methods in Information Studies

    ERIC Educational Resources Information Center

    Albright, Kendra; Petrulis, Robert; Vasconcelos, Ana; Wood, Jamie

    2012-01-01

    This paper presents the results of a project that aimed at restructuring the delivery of research methods training at the Information School at the University of Sheffield, UK, based on an Inquiry-Based Learning (IBL) approach. The purpose of this research was to implement inquiry-based learning that would allow customization of research methods…

  1. Bioanalytical method transfer considerations of chromatographic-based assays.

    PubMed

    Williard, Clark V

    2016-07-01

    Bioanalysis is an important part of the modern drug development process. The business practice of outsourcing and transferring bioanalytical methods from laboratory to laboratory has increasingly become a crucial strategy for successful and efficient delivery of therapies to the market. This chapter discusses important considerations when transferring various types of chromatographic-based assays in today's pharmaceutical research and development environment.

  2. Accuracy of Dual-Energy Virtual Monochromatic CT Numbers: Comparison between the Single-Source Projection-Based and Dual-Source Image-Based Methods.

    PubMed

    Ueguchi, Takashi; Ogihara, Ryota; Yamada, Sachiko

    2018-03-21

    To investigate the accuracy of dual-energy virtual monochromatic computed tomography (CT) numbers obtained by two typical hardware and software implementations: the single-source projection-based method and the dual-source image-based method. A phantom with different tissue equivalent inserts was scanned with both single-source and dual-source scanners. A fast kVp-switching feature was used on the single-source scanner, whereas a tin filter was used on the dual-source scanner. Virtual monochromatic CT images of the phantom at energy levels of 60, 100, and 140 keV were obtained by both projection-based (on the single-source scanner) and image-based (on the dual-source scanner) methods. The accuracy of virtual monochromatic CT numbers for all inserts was assessed by comparing measured values to their corresponding true values. Linear regression analysis was performed to evaluate the dependency of measured CT numbers on tissue attenuation, method, and their interaction. Root mean square values of systematic error over all inserts at 60, 100, and 140 keV were approximately 53, 21, and 29 Hounsfield unit (HU) with the single-source projection-based method, and 46, 7, and 6 HU with the dual-source image-based method, respectively. Linear regression analysis revealed that the interaction between the attenuation and the method had a statistically significant effect on the measured CT numbers at 100 and 140 keV. There were attenuation-, method-, and energy level-dependent systematic errors in the measured virtual monochromatic CT numbers. CT number reproducibility was comparable between the two scanners, and CT numbers had better accuracy with the dual-source image-based method at 100 and 140 keV. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  3. DRAINWAT--Based Methods For Estimating Nitrogen Transport in Poorly Drained Watersheds

    Treesearch

    Devendra M. Amatya; George M. Chescheir; Glenn P. Fernandez; R. Wayne Skaggs; J.W. Gilliam

    2004-01-01

    Methods are needed to quantify effects of land use and management practices on nutrient and sediment loads at the watershed scale. Two methods were used to apply a DRAINMOD-based watershed-scale model (DRAINWAT) to estimate total nitrogen (N) transport from a poorly drained, forested watershed. In both methods, in-stream retention or losses of N were calculated with a...

  4. Determining surface areas of marine alga cells by acid-base titration method.

    PubMed

    Wang, X; Ma, Y; Su, Y

    1997-09-01

    A new method for determining the surface area of living marine alga cells was described. The method uses acid-base titration to measure the surface acid/base amount on the surface of alga cells and uses the BET (Brunauer, Emmett, and Teller) equation to estimate the maximum surface acid/base amount, assuming that hydrous cell walls have carbohydrates or other structural compounds which can behave like surface Brönsted acid-base sites due to coordination of environmental H2O molecules. The method was applied to 18 diverse alga species (including 7 diatoms, 2 flagellates, 8 green algae and 1 red alga) maintained in seawater cultures. For the species examined, the surface areas of individual cells ranged from 2.8 × 10^-8 m^2 for Nannochloropsis oculata to 690 × 10^-8 m^2 for Dunaliella viridis, and specific surface areas from 1,030 m^2 g^-1 for Dunaliella salina to 28,900 m^2 g^-1 for Pyramidomonas sp. Measurement accuracy was 15.2%. Preliminary studies show that the method may be more promising and accurate than light/electron microscopic measurements for coarse estimation of the surface area of living algae.
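
    For reference, the BET relation the abstract invokes is, in its standard linearized gas-adsorption form (symbols here are generic and not taken from the paper; the authors adapt the same formalism to surface acid/base site saturation measured by titration):

    \[
      \frac{p}{v\,(p_0 - p)} \;=\; \frac{1}{v_m c} \;+\; \frac{c-1}{v_m c}\,\frac{p}{p_0},
    \]

    where v is the amount adsorbed at pressure p, p_0 is the saturation pressure, v_m is the monolayer (maximum) capacity and c is a constant; plotting the left-hand side against p/p_0 gives v_m from the slope and intercept.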

  5. Human swallowing simulation based on videofluorography images using Hamiltonian MPS method

    NASA Astrophysics Data System (ADS)

    Kikuchi, Takahiro; Michiwaki, Yukihiro; Kamiya, Tetsu; Toyama, Yoshio; Tamai, Tasuku; Koshizuka, Seiichi

    2015-09-01

    In developed nations, swallowing disorders and aspiration pneumonia have become serious problems. We developed a method to simulate the behavior of the organs involved in swallowing to clarify the mechanisms of swallowing and aspiration. The shape model is based on anatomically realistic geometry, and the motion model utilizes forced displacements based on realistic dynamic images to reflect the mechanisms of human swallowing. The soft tissue organs are modeled as nonlinear elastic material using the Hamiltonian MPS method. This method allows for stable simulation of the complex swallowing movement. A penalty method using metaballs is employed to simulate contact between organ walls and smooth sliding along the walls. We performed four numerical simulations under different analysis conditions to represent four cases of swallowing, including a healthy volunteer and a patient with a swallowing disorder. The simulation results were compared to examine the epiglottic downfolding mechanism, which strongly influences the risk of aspiration.
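
    The contact treatment mentioned above can be illustrated with a generic penalty force: a node that penetrates an organ wall is pushed back in proportion to its penetration depth. The Python sketch below uses a planar wall and a hypothetical stiffness value for illustration, whereas the paper represents the walls with metaballs inside the Hamiltonian MPS framework.

    import numpy as np

    def penalty_contact_force(node_pos, wall_point, wall_normal, stiffness=1.0e4):
        """Generic penalty method: repel a node that has crossed a wall.

        The signed distance along the (unit) wall normal is negative once the
        node has penetrated; the restoring force grows linearly with depth.
        """
        penetration = np.dot(node_pos - wall_point, wall_normal)
        if penetration < 0.0:
            return -stiffness * penetration * wall_normal
        return np.zeros_like(wall_normal)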

  6. Enhanced dual-frequency pattern scheme based on spatial-temporal fringes method

    NASA Astrophysics Data System (ADS)

    Wang, Minmin; Zhou, Canlin; Si, Shuchun; Lei, Zhenkun; Li, Xiaolei; Li, Hui; Li, YanJie

    2018-07-01

    One of the major challenges of employing a dual-frequency phase-shifting algorithm for phase retrieval is its sensitivity to noise. Yun et al. proposed a dual-frequency method based on Fourier transform profilometry, but its low-frequency lobes are too close to each other for accurate band-pass filtering. To address this problem, a novel dual-frequency pattern based on the spatial-temporal fringes (STF) method is developed in this paper. Three fringe patterns with two different frequencies are required. The low-frequency phase is obtained from two low-frequency fringe patterns by the STF method, so the signal lobes can be extracted accurately as they are far away from each other. The high-frequency phase is retrieved from another fringe pattern without the impact of the DC component. Simulations and experiments are conducted to demonstrate the excellent precision of the proposed method.

  7. A novel three-stage distance-based consensus ranking method

    NASA Astrophysics Data System (ADS)

    Aghayi, Nazila; Tavana, Madjid

    2018-05-01

    In this study, we propose a three-stage weighted sum method for identifying the group ranks of alternatives. In the first stage, a rank matrix, similar to the cross-efficiency matrix, is obtained by computing the individual rank position of each alternative based on importance weights. In the second stage, a secondary goal is defined to limit the vector of weights since the vector of weights obtained in the first stage is not unique. Finally, in the third stage, the group rank position of alternatives is obtained based on a distance of individual rank positions. The third stage determines a consensus solution for the group so that the ranks obtained have a minimum distance from the ranks acquired by each alternative in the previous stage. A numerical example is presented to demonstrate the applicability and exhibit the efficacy of the proposed method and algorithms.
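
    The third stage described above amounts to an assignment problem: choose one distinct group position per alternative so that the total distance to the individual rank positions is minimal. The Python sketch below covers only that stage; the rank matrix is assumed to come from the first two stages, the absolute-value distance is an illustrative choice, and the function name is hypothetical.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def consensus_ranking(rank_matrix):
        """rank_matrix[i, k]: rank of alternative i under weight vector k.
        Returns a group ranking minimizing the total distance to those ranks."""
        n = rank_matrix.shape[0]
        positions = np.arange(1, n + 1)
        # cost[i, r]: total distance if alternative i is assigned position r + 1
        cost = np.abs(rank_matrix[:, None, :] - positions[None, :, None]).sum(axis=2)
        rows, cols = linear_sum_assignment(cost)
        group_rank = np.empty(n, dtype=int)
        group_rank[rows] = positions[cols]
        return group_rank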

  8. Powder-based adsorbents having high adsorption capacities for recovering dissolved metals and methods thereof

    DOEpatents

    Janke, Christopher J.; Dai, Sheng; Oyola, Yatsandra

    2016-05-03

    A powder-based adsorbent and a related method of manufacture are provided. The powder-based adsorbent includes polymer powder with grafted side chains and an increased surface area per unit weight to increase the adsorption of dissolved metals, for example uranium, from aqueous solutions. A method for forming the powder-based adsorbent includes irradiating polymer powder, grafting with polymerizable reactive monomers, reacting with hydroxylamine, and conditioning with an alkaline solution. Powder-based adsorbents formed according to the present method demonstrated a significantly improved uranium adsorption capacity per unit weight over existing adsorbents.

  9. Foam-based adsorbents having high adsorption capacities for recovering dissolved metals and methods thereof

    DOEpatents

    Janke, Christopher J.; Dai, Sheng; Oyola, Yatsandra

    2015-06-02

    Foam-based adsorbents and a related method of manufacture are provided. The foam-based adsorbents include polymer foam with grafted side chains and an increased surface area per unit weight to increase the adsorption of dissolved metals, for example uranium, from aqueous solutions. A method for forming the foam-based adsorbents includes irradiating polymer foam, grafting with polymerizable reactive monomers, reacting with hydroxylamine, and conditioning with an alkaline solution. Foam-based adsorbents formed according to the present method demonstrated a significantly improved uranium adsorption capacity per unit weight over existing adsorbents.

  10. An atlas-based multimodal registration method for 2D images with discrepancy structures.

    PubMed

    Lv, Wenchao; Chen, Houjin; Peng, Yahui; Li, Yanfeng; Li, Jupeng

    2018-06-04

    An atlas-based multimodal registration method for 2-dimensional images with discrepancy structures is proposed in this paper. An atlas is utilized to complement the discrepancy structure information in multimodal medical images. The scheme includes three steps: floating image to atlas registration, atlas to reference image registration, and field-based deformation. To evaluate the performance, a frame model, a brain model, and clinical images were employed in registration experiments. We measured the registration performance by the squared sum of intensity differences. Results indicate that this method is robust and performs better than direct registration for multimodal images with discrepancy structures. We conclude that the proposed method is suitable for multimodal images with discrepancy structures. Graphical abstract: schematic diagram of the atlas-based multimodal registration method.
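
    The evaluation metric quoted above (the squared sum of intensity differences) is straightforward to compute; a minimal sketch, with a hypothetical function name, is:

    import numpy as np

    def squared_sum_of_differences(reference, registered):
        """Registration error as the squared sum of intensity differences."""
        ref = np.asarray(reference, dtype=float)
        reg = np.asarray(registered, dtype=float)
        return float(np.sum((ref - reg) ** 2))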

  11. Genetics-based methods for detection of Salmonella spp. in foods.

    PubMed

    Mozola, Mark A

    2006-01-01

    Genetic methods are now at the forefront of foodborne pathogen testing. The sensitivity, specificity, and inclusivity advantages offered by deoxyribonucleic acid (DNA) probe technology have driven an intense effort in methods development over the past 20 years. DNA probe-based methods for Salmonella spp. and other pathogens have progressed from time-consuming procedures involving the use of radioisotopes to simple, high throughput, automated assays. The analytical sensitivity of nucleic acid amplification technology has facilitated a reduction in analysis time by allowing enriched samples to be tested for previously undetectable quantities of analyte. This article will trace the evolution of the development of genetic methods for detection of Salmonella in foods, review the basic assay formats and their advantages and limitations, and discuss method performance characteristics and considerations for selection of methods.

  12. A joint tracking method for NSCC based on WLS algorithm

    NASA Astrophysics Data System (ADS)

    Luo, Ruidan; Xu, Ying; Yuan, Hong

    2017-12-01

    The navigation signal based on compound carrier (NSCC) has a flexible multi-carrier scheme and configurable scheme parameters, which give it significant navigation-augmentation advantages over legacy navigation signals in terms of spectral efficiency, tracking accuracy, multipath mitigation and anti-jamming capability. Meanwhile, its characteristic scheme structure can provide auxiliary information for signal synchronization algorithm design. Based on the characteristics of NSCC, this paper proposes a joint tracking method utilizing the Weighted Least Squares (WLS) algorithm. In this method, a least-squares estimator jointly estimates each sub-carrier frequency shift using the linear relationship between the Doppler shift and the known sub-carrier frequencies. In addition, the weighting matrix is set adaptively according to the sub-carrier power to ensure estimation accuracy. Both theoretical analysis and simulation results show that the tracking accuracy and sensitivity of this method outperform the single-carrier algorithm at low SNR.
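
    As a rough illustration of the weighted least-squares idea described above, assume each measured sub-carrier frequency shift is proportional to its known sub-carrier frequency (the Doppler relationship) and weight each measurement by its sub-carrier power. The one-parameter fit below is a sketch under those assumptions, not the paper's full tracking loop; the names are hypothetical.

    import numpy as np

    def wls_doppler_scale(subcarrier_freqs, measured_shifts, subcarrier_powers):
        """Weighted least-squares fit of k in shift_i ~ k * f_i."""
        f = np.asarray(subcarrier_freqs, dtype=float)
        y = np.asarray(measured_shifts, dtype=float)
        w = np.asarray(subcarrier_powers, dtype=float)   # power-based weights
        k = np.sum(w * f * y) / np.sum(w * f * f)
        return k, k * f    # common Doppler scale and smoothed per-carrier shifts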

  13. Tactile objects based on an amplitude disturbed diffraction pattern method

    NASA Astrophysics Data System (ADS)

    Liu, Yuan; Nikolovski, Jean-Pierre; Mechbal, Nazih; Hafez, Moustapha; Vergé, Michel

    2009-12-01

    Tactile sensing is becoming widely used in human-computer interfaces. Recent advances in acoustic approaches have demonstrated the possibility of transforming ordinary solid objects into interactive interfaces. This letter proposes a static finger-contact localization process using an amplitude disturbed diffraction pattern method. The localization method is based on the following physical phenomenon: a finger contact modifies the energy distribution of an acoustic wave in a solid, and these variations depend on the wave frequency and the contact position. The presented method first excites the object with an acoustic signal containing multiple frequency components. In a second step, the measured acoustic signal is compared with prerecorded values to deduce the contact position. This position is then used for human-machine interaction (e.g., finger tracking on a computer screen). The selection of excitation signals is discussed and a frequency-choice criterion based on a contrast value is proposed. Tests on a sandwich plate (a liquid crystal display screen) demonstrate the simplicity and ease of applying the process to various solids.
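
    The comparison step described above can be sketched as template matching against the prerecorded calibration spectra; the Euclidean distance used here is an illustrative stand-in for the paper's contrast-based criterion, and the names are hypothetical.

    import numpy as np

    def locate_contact(measured_amplitudes, calibration_templates):
        """calibration_templates[p]: amplitude at each excitation frequency,
        recorded for calibration position p. Returns the best-matching position."""
        meas = np.asarray(measured_amplitudes, dtype=float)
        templates = np.asarray(calibration_templates, dtype=float)
        distances = np.linalg.norm(templates - meas, axis=1)
        return int(np.argmin(distances)), distances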

  14. Properties of a Formal Method to Model Emergence in Swarm-Based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.

  15. Intelligent Gearbox Diagnosis Methods Based on SVM, Wavelet Lifting and RBR

    PubMed Central

    Gao, Lixin; Ren, Zhiqiang; Tang, Wenliang; Wang, Huaqing; Chen, Peng

    2010-01-01

    Given the problems in intelligent gearbox diagnosis methods, it is difficult to obtain the desired information and a large enough sample size to study; therefore, we propose the application of various methods for gearbox fault diagnosis, including wavelet lifting, a support vector machine (SVM) and rule-based reasoning (RBR). In a complex field environment, it is less likely for machines to have the same fault; moreover, the fault features can also vary. Therefore, a SVM could be used for the initial diagnosis. First, gearbox vibration signals were processed with wavelet packet decomposition, and the signal energy coefficients of each frequency band were extracted and used as input feature vectors in SVM for normal and faulty pattern recognition. Second, precision analysis using wavelet lifting could successfully filter out the noisy signals while maintaining the impulse characteristics of the fault; thus effectively extracting the fault frequency of the machine. Lastly, the knowledge base was built based on the field rules summarized by experts to identify the detailed fault type. Results have shown that SVM is a powerful tool to accomplish gearbox fault pattern recognition when the sample size is small, whereas the wavelet lifting scheme can effectively extract fault features, and rule-based reasoning can be used to identify the detailed fault type. Therefore, a method that combines SVM, wavelet lifting and rule-based reasoning ensures effective gearbox fault diagnosis. PMID:22399894

  16. Intelligent gearbox diagnosis methods based on SVM, wavelet lifting and RBR.

    PubMed

    Gao, Lixin; Ren, Zhiqiang; Tang, Wenliang; Wang, Huaqing; Chen, Peng

    2010-01-01

    Given the problems in intelligent gearbox diagnosis methods, it is difficult to obtain the desired information and a large enough sample size to study; therefore, we propose the application of various methods for gearbox fault diagnosis, including wavelet lifting, a support vector machine (SVM) and rule-based reasoning (RBR). In a complex field environment, it is less likely for machines to have the same fault; moreover, the fault features can also vary. Therefore, a SVM could be used for the initial diagnosis. First, gearbox vibration signals were processed with wavelet packet decomposition, and the signal energy coefficients of each frequency band were extracted and used as input feature vectors in SVM for normal and faulty pattern recognition. Second, precision analysis using wavelet lifting could successfully filter out the noisy signals while maintaining the impulse characteristics of the fault; thus effectively extracting the fault frequency of the machine. Lastly, the knowledge base was built based on the field rules summarized by experts to identify the detailed fault type. Results have shown that SVM is a powerful tool to accomplish gearbox fault pattern recognition when the sample size is small, whereas the wavelet lifting scheme can effectively extract fault features, and rule-based reasoning can be used to identify the detailed fault type. Therefore, a method that combines SVM, wavelet lifting and rule-based reasoning ensures effective gearbox fault diagnosis.
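
    The first stage of the diagnosis chain described in the two records above (wavelet-packet band energies fed to an SVM) can be sketched as follows; the wavelet, decomposition level and classifier settings are illustrative assumptions, and the wavelet-lifting and rule-based-reasoning stages are not shown.

    import numpy as np
    import pywt
    from sklearn.svm import SVC

    def wavelet_packet_energy(signal, wavelet="db4", level=3):
        """Normalized energy of each terminal wavelet-packet node (feature vector)."""
        wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, mode="symmetric",
                                maxlevel=level)
        nodes = wp.get_level(level, order="natural")
        energies = np.array([np.sum(node.data ** 2) for node in nodes])
        return energies / energies.sum()

    # Training on labelled vibration records (placeholders for real data):
    # X = np.array([wavelet_packet_energy(s) for s in training_signals])
    # clf = SVC(kernel="rbf").fit(X, training_labels)
    # prediction = clf.predict([wavelet_packet_energy(new_signal)])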

  17. Real reproduction and evaluation of color based on BRDF method

    NASA Astrophysics Data System (ADS)

    Qin, Feng; Yang, Weiping; Yang, Jia; Li, Hongning; Luo, Yanlin; Long, Hongli

    2013-12-01

    It is difficult to faithfully reproduce the original color of targets under different illumination environments using traditional methods. A function that can reconstruct the reflection characteristics of every point on the surface of a target is therefore urgently required to improve the authenticity of color reproduction; this function is known as the Bidirectional Reflectance Distribution Function (BRDF). A color reproduction method based on BRDF measurement is introduced in this paper. Radiometry is combined with colorimetric theory to measure the irradiance and radiance of the GretagMacbeth 24-patch ColorChecker using a PR-715 Radiation Spectrophotometer (Photo Research, Inc., USA). The BRDF and BRF (Bidirectional Reflectance Factor) values of every color patch relative to the reference area are calculated from the irradiance and radiance, and the color tristimulus values of the 24 ColorChecker patches are reconstructed. The results reconstructed by the BRDF method are compared with values calculated from the reflectance measured with the PR-715; finally, the chromaticity coordinates in color space and the color differences between the two are analyzed. The experimental results show that the average color difference and sample standard deviation between the method proposed in this paper and the traditional reconstruction method based on reflectance are 2.567 and 1.3049, respectively. The theoretical and experimental analysis indicates that color reproduction based on BRDF describes the color information of an object in hemispherical space more effectively than reflectance alone. The method proposed in this paper is effective and feasible for chromaticity reproduction research.
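
    At its core, the reconstruction described above divides measured radiance by measured irradiance for each color patch; the bidirectional reflectance factor then follows as pi times the BRDF (the ratio to an ideal Lambertian reflector). The sketch below shows only that step for a single measurement geometry, with hypothetical names; the paper's full procedure goes on to reconstruct tristimulus values and color differences.

    import numpy as np

    def brdf_and_brf(radiance, irradiance):
        """Per-patch BRDF (L / E) and BRF (pi * BRDF) for one geometry."""
        L = np.asarray(radiance, dtype=float)     # measured radiance, W/(m^2 sr)
        E = np.asarray(irradiance, dtype=float)   # measured irradiance, W/m^2
        brdf = L / E
        return brdf, np.pi * brdf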

  18. An Exact Model-Based Method for Near-Field Sources Localization with Bistatic MIMO System.

    PubMed

    Singh, Parth Raj; Wang, Yide; Chargé, Pascal

    2017-03-30

    In this paper, we propose an exact model-based method for near-field sources localization with a bistatic multiple input, multiple output (MIMO) radar system, and compare it with an approximated model-based method. The aim of this paper is to propose an efficient way to use the exact model of the received signals of near-field sources in order to eliminate the systematic error introduced by the use of approximated model in most existing near-field sources localization techniques. The proposed method uses parallel factor (PARAFAC) decomposition to deal with the exact model. Thanks to the exact model, the proposed method has better precision and resolution than the compared approximated model-based method. The simulation results show the performance of the proposed method.

  19. A Multi-level Fuzzy Evaluation Method for Smart Distribution Network Based on Entropy Weight

    NASA Astrophysics Data System (ADS)

    Li, Jianfang; Song, Xiaohui; Gao, Fei; Zhang, Yu

    2017-05-01

    The smart distribution network is considered the future trend of distribution networks. In order to comprehensively evaluate the construction level of smart distribution networks and give guidance to the practice of smart distribution construction, a multi-level fuzzy evaluation method based on entropy weight is proposed. Firstly, focusing on both the conventional characteristics of distribution networks and new characteristics of smart distribution networks such as self-healing and interaction, a multi-level evaluation index system covering power supply capability, power quality, economy, reliability and interaction is established. Then, a combination weighting method based on the Delphi method and the entropy weight method is put forward, which takes into account not only the importance of each evaluation index in the experts' subjective view, but also the objective information contained in the index values. Thirdly, a multi-level evaluation method based on fuzzy theory is put forward. Lastly, an example is conducted based on the statistical data of several cities' distribution networks, and the evaluation method is shown to be effective and rational.
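
    The objective half of the combination weighting described above (the entropy weight method) is a standard calculation; a minimal sketch is given below, with the Delphi (subjective) weights and the fuzzy evaluation itself omitted. Names are hypothetical.

    import numpy as np

    def entropy_weights(X):
        """X[i, j]: value of evaluation index j for sample i (non-negative).
        Returns the entropy-based objective weight of each index."""
        X = np.asarray(X, dtype=float)
        P = X / X.sum(axis=0)                          # proportion of each sample per index
        m = X.shape[0]
        plogp = P * np.log(np.where(P > 0, P, 1.0))    # treat 0*log(0) as 0
        e = -plogp.sum(axis=0) / np.log(m)             # entropy of each index
        d = 1.0 - e                                    # divergence degree
        return d / d.sum()                             # entropy weights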

  20. Selection of Construction Methods: A Knowledge-Based Approach

    PubMed Central

    Skibniewski, Miroslaw

    2013-01-01

    The appropriate selection of construction methods to be used during the execution of a construction project is a major determinant of high productivity, but sometimes this selection process is performed without the care and the systematic approach that it deserves, bringing negative consequences. This paper proposes a knowledge management approach that will enable the intelligent use of corporate experience and information and help to improve the selection of construction methods for a project. Then a knowledge-based system to support this decision-making process is proposed and described. To define and design the system, semistructured interviews were conducted within three construction companies with the purpose of studying the way the method selection process is carried out in practice and the knowledge associated with it. A prototype of a Construction Methods Knowledge System (CMKS) was developed and then validated with construction industry professionals. As a conclusion, the CMKS was perceived as a valuable tool for construction method selection, by helping companies to generate a corporate memory on this issue, reducing the reliance on individual knowledge and also the subjectivity of the decision-making process. The benefits provided by the system support better performance of construction projects. PMID:24453925

  1. A new graph-based method for pairwise global network alignment

    PubMed Central

    Klau, Gunnar W

    2009-01-01

    Background In addition to component-based comparative approaches, network alignments provide the means to study conserved network topology such as common pathways and more complex network motifs. Yet, unlike in classical sequence alignment, the comparison of networks becomes computationally more challenging, as most meaningful assumptions instantly lead to NP-hard problems. Most previous algorithmic work on network alignments is heuristic in nature. Results We introduce the graph-based maximum structural matching formulation for pairwise global network alignment. We relate the formulation to previous work and prove NP-hardness of the problem. Based on the new formulation we build upon recent results in computational structural biology and present a novel Lagrangian relaxation approach that, in combination with a branch-and-bound method, computes provably optimal network alignments. The Lagrangian algorithm alone is a powerful heuristic method, which produces solutions that are often near-optimal and – unlike those computed by pure heuristics – come with a quality guarantee. Conclusion Computational experiments on the alignment of protein-protein interaction networks and on the classification of metabolic subnetworks demonstrate that the new method is reasonably fast and has advantages over pure heuristics. Our software tool is freely available as part of the LISA library. PMID:19208162

  2. Radial-based tail methods for Monte Carlo simulations of cylindrical interfaces

    NASA Astrophysics Data System (ADS)

    Goujon, Florent; Bêche, Bruno; Malfreyt, Patrice; Ghoufi, Aziz

    2018-03-01

    In this work, we implement for the first time the radial-based tail methods for Monte Carlo simulations of cylindrical interfaces. The efficiency of this method is then evaluated through the calculation of surface tension and coexistence properties. We show that the inclusion of tail corrections during the course of the Monte Carlo simulation impacts the coexistence and interfacial properties. We establish that the long-range corrections to the surface tension are of the same order of magnitude as those obtained for a planar interface. We show that the slab-based tail method does not alter the location of the Gibbs equimolar dividing surface. Additionally, the surface tension exhibits a non-monotonic behavior as a function of the radius of the equimolar dividing surface.

  3. A brain-region-based meta-analysis method utilizing the Apriori algorithm.

    PubMed

    Niu, Zhendong; Nie, Yaoxin; Zhou, Qian; Zhu, Linlin; Wei, Jieyao

    2016-05-18

    Brain network connectivity modeling is a crucial method for studying the brain's cognitive functions. Meta-analyses can unearth reliable results from individual studies. Meta-analytic connectivity modeling is a connectivity analysis method based on regions of interest (ROIs) which showed that meta-analyses could be used to discover brain network connectivity. In this paper, we propose a new meta-analysis method that can be used to find network connectivity models based on the Apriori algorithm, which has the potential to derive brain network connectivity models from activation information in the literature, without requiring ROIs. This method first extracts activation information from experimental studies that use cognitive tasks of the same category, and then maps the activation information to corresponding brain areas by using the automated anatomical labeling (AAL) atlas, after which the activation rate of these brain areas is calculated. Finally, using these brain areas, a potential brain network connectivity model is calculated based on the Apriori algorithm. The present study used this method to conduct a mining analysis on the citations in a language review article by Price (Neuroimage 62(2):816-847, 2012). The results showed that the obtained network connectivity model was consistent with that reported by Price. The proposed method is helpful to find brain network connectivity by mining the co-activation relationships among brain regions. Furthermore, results of the co-activation relationship analysis can be used as a priori knowledge for the corresponding dynamic causal modeling analysis, possibly achieving a significant dimension-reducing effect, thus increasing the efficiency of the dynamic causal modeling analysis.
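
    As a simplified illustration of the mining step described above, the sketch below finds brain-region pairs whose co-activation rate across studies reaches a minimum support, using the Apriori pruning idea (pairs are built only from individually frequent regions). The input format, threshold and function name are illustrative assumptions; the paper works from activation coordinates mapped to AAL regions.

    from itertools import combinations

    def frequent_coactivations(studies, min_support=0.3):
        """studies: list of sets of activated region labels, one set per study.
        Returns region pairs whose co-activation rate >= min_support."""
        n = len(studies)
        counts = {}
        for s in studies:                       # frequent single regions first
            for region in s:
                counts[region] = counts.get(region, 0) + 1
        frequent = {r for r, c in counts.items() if c / n >= min_support}
        pair_counts = {}
        for s in studies:                       # candidate pairs from frequent items only
            for a, b in combinations(sorted(frequent & s), 2):
                pair_counts[(a, b)] = pair_counts.get((a, b), 0) + 1
        return {p: c / n for p, c in pair_counts.items() if c / n >= min_support}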

  4. Clustering Scientific Publications Based on Citation Relations: A Systematic Comparison of Different Methods.

    PubMed

    Šubelj, Lovro; van Eck, Nees Jan; Waltman, Ludo

    2016-01-01

    Clustering methods are applied regularly in the bibliometric literature to identify research areas or scientific fields. These methods are for instance used to group publications into clusters based on their relations in a citation network. In the network science literature, many clustering methods, often referred to as graph partitioning or community detection techniques, have been developed. Focusing on the problem of clustering the publications in a citation network, we present a systematic comparison of the performance of a large number of these clustering methods. Using a number of different citation networks, some of them relatively small and others very large, we extensively study the statistical properties of the results provided by different methods. In addition, we also carry out an expert-based assessment of the results produced by different methods. The expert-based assessment focuses on publications in the field of scientometrics. Our findings seem to indicate that there is a trade-off between different properties that may be considered desirable for a good clustering of publications. Overall, map equation methods appear to perform best in our analysis, suggesting that these methods deserve more attention from the bibliometric community.

  5. Clustering Scientific Publications Based on Citation Relations: A Systematic Comparison of Different Methods

    PubMed Central

    Šubelj, Lovro; van Eck, Nees Jan; Waltman, Ludo

    2016-01-01

    Clustering methods are applied regularly in the bibliometric literature to identify research areas or scientific fields. These methods are for instance used to group publications into clusters based on their relations in a citation network. In the network science literature, many clustering methods, often referred to as graph partitioning or community detection techniques, have been developed. Focusing on the problem of clustering the publications in a citation network, we present a systematic comparison of the performance of a large number of these clustering methods. Using a number of different citation networks, some of them relatively small and others very large, we extensively study the statistical properties of the results provided by different methods. In addition, we also carry out an expert-based assessment of the results produced by different methods. The expert-based assessment focuses on publications in the field of scientometrics. Our findings seem to indicate that there is a trade-off between different properties that may be considered desirable for a good clustering of publications. Overall, map equation methods appear to perform best in our analysis, suggesting that these methods deserve more attention from the bibliometric community. PMID:27124610
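
    A minimal end-to-end illustration of clustering a citation network is sketched below. It uses the modularity-based community detection shipped with networkx purely because it is readily available; the study itself finds that map-equation (Infomap-style) methods perform best, so this is a stand-in, not the recommended method, and the function name is hypothetical.

    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    def cluster_publications(citation_edges):
        """citation_edges: iterable of (citing_id, cited_id) pairs.
        Returns a dict mapping each publication id to a cluster index."""
        G = nx.Graph()
        G.add_edges_from(citation_edges)        # direction ignored for clustering
        communities = greedy_modularity_communities(G)
        return {node: i for i, members in enumerate(communities) for node in members}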

  6. Modeling method of time sequence model based grey system theory and application proceedings

    NASA Astrophysics Data System (ADS)

    Wei, Xuexia; Luo, Yaling; Zhang, Shiqiang

    2015-12-01

    This article presents a modeling method for the grey system GM(1,1) model based on information reuse and grey system theory. This method not only greatly enhances the fitting and predicting accuracy of the GM(1,1) model, but also retains the conventional approach's merit of simple computation. On this basis, we present a syphilis trend forecasting method based on information reuse and the grey system GM(1,1) model.
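
    For context, the classical GM(1,1) model on which the article builds can be sketched as follows; the information-reuse refinement itself is not reproduced, and the function name is hypothetical.

    import numpy as np

    def gm11_forecast(x0, horizon=1):
        """Fit a classical GM(1,1) grey model to the series x0 and forecast ahead."""
        x0 = np.asarray(x0, dtype=float)
        x1 = np.cumsum(x0)                               # accumulated generating operation
        z1 = 0.5 * (x1[1:] + x1[:-1])                    # mean sequence
        B = np.column_stack((-z1, np.ones_like(z1)))
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0] # development coefficient, grey input
        n = len(x0)
        k = np.arange(n + horizon)
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
        x0_hat = np.concatenate(([x1_hat[0]], np.diff(x1_hat)))
        return x0_hat[n:]                                # forecast values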

  7. MRI-based methods for quantification of the cerebral metabolic rate of oxygen

    PubMed Central

    Rodgers, Zachary B; Detre, John A

    2016-01-01

    The brain depends almost entirely on oxidative metabolism to meet its significant energy requirements. As such, the cerebral metabolic rate of oxygen (CMRO2) represents a key measure of brain function. Quantification of CMRO2 has helped elucidate brain functional physiology and holds potential as a clinical tool for evaluating neurological disorders including stroke, brain tumors, Alzheimer’s disease, and obstructive sleep apnea. In recent years, a variety of magnetic resonance imaging (MRI)-based CMRO2 quantification methods have emerged. Unlike positron emission tomography – the current “gold standard” for measurement and mapping of CMRO2 – MRI is non-invasive, relatively inexpensive, and ubiquitously available in modern medical centers. All MRI-based CMRO2 methods are based on modeling the effect of paramagnetic deoxyhemoglobin on the magnetic resonance signal. The various methods can be classified in terms of the MRI contrast mechanism used to quantify CMRO2: T2*, T2′, T2, or magnetic susceptibility. This review article provides an overview of MRI-based CMRO2 quantification techniques. After a brief historical discussion motivating the need for improved CMRO2 methodology, current state-of-the-art MRI-based methods are critically appraised in terms of their respective tradeoffs between spatial resolution, temporal resolution, and robustness, all of critical importance given the spatially heterogeneous and temporally dynamic nature of brain energy requirements. PMID:27089912
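
    The MRI approaches surveyed above generally rest on the Fick principle, combining a flow measurement with an estimate of the oxygen extraction fraction inferred from deoxyhemoglobin's effect on the MR signal (stated here for orientation; the review details how each contrast mechanism estimates the venous term):

    \[
      \mathrm{CMRO_2} \;=\; \mathrm{CBF}\,\bigl(C_a\mathrm{O_2} - C_v\mathrm{O_2}\bigr) \;=\; \mathrm{CBF}\cdot\mathrm{OEF}\cdot C_a\mathrm{O_2},
    \]

    where CBF is cerebral blood flow, CaO2 and CvO2 are the arterial and venous oxygen contents, and OEF is the oxygen extraction fraction.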

  8. Moving target detection method based on improved Gaussian mixture model

    NASA Astrophysics Data System (ADS)

    Ma, J. Y.; Jie, F. R.; Hu, Y. J.

    2017-07-01

    Gaussian Mixture Models are often employed to build the background model in background-difference methods for moving target detection. This paper puts forward an adaptive moving target detection algorithm based on an improved Gaussian Mixture Model. According to the gray-level convergence of each pixel, the number of Gaussian distributions used to learn and update the background model is chosen adaptively. A morphological reconstruction method is adopted to eliminate shadows. Experiments proved that the proposed method not only has good robustness and detection performance, but also good adaptability. Even in special cases, such as when the grayscale changes greatly, the proposed method still performs well.
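
    A minimal background-difference pipeline along these lines can be put together with OpenCV's stock GMM background subtractor; this is a stand-in for the paper's improved adaptive model (which chooses the number of Gaussians per pixel), and the morphological opening below only crudely replaces the morphological reconstruction step used for shadow removal.

    import cv2

    def detect_moving_targets(video_path):
        """Foreground masks per frame via MOG2 background subtraction."""
        cap = cv2.VideoCapture(video_path)
        subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                        detectShadows=True)
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        masks = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            fg = subtractor.apply(frame)                             # 255 fg, 127 shadow
            fg = cv2.threshold(fg, 200, 255, cv2.THRESH_BINARY)[1]   # discard shadow label
            fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, kernel)        # remove speckle noise
            masks.append(fg)
        cap.release()
        return masks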

  9. Optimization-based limiters for the spectral element method

    NASA Astrophysics Data System (ADS)

    Guba, Oksana; Taylor, Mark; St-Cyr, Amik

    2014-06-01

    We introduce a new family of optimization based limiters for the h-p spectral element method. The native spectral element advection operator is oscillatory, but due to its mimetic properties it is locally conservative and has a monotone property with respect to element averages. We exploit this property to construct locally conservative quasimonotone and sign-preserving limiters. The quasimonotone limiter prevents all overshoots and undershoots at the element level, but is not strictly non-oscillatory. It also maintains quasimonotonicity even with the addition of a dissipation term such as viscosity or hyperviscosity. The limiters are based on a least-squares formulation with equality and inequality constraints and are local to each element. We evaluate the new limiters using a deformational flow test case for advection on the surface of the sphere. We focus on mesh refinement for moderate (p=3) and high order (p=6) elements. As expected, the spectral element method obtains its formal order of accuracy for smooth problems without limiters. For advection of fields with cusps and discontinuities, the high order convergence is lost, but in all cases, p=6 outperforms p=3 for the same degrees of freedom.
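
    The element-local optimization described above (a weighted least-squares projection onto bound constraints with the element average preserved) can be sketched as a small constrained problem per element. This is an illustrative formulation using a generic solver, not the authors' implementation, and it assumes the element average already lies within the prescribed bounds.

    import numpy as np
    from scipy.optimize import minimize

    def quasimonotone_limit(q, w, qmin, qmax):
        """Closest nodal values (weighted least squares) satisfying the bounds
        while conserving the weighted element average."""
        q = np.asarray(q, dtype=float)
        w = np.asarray(w, dtype=float)            # quadrature weights
        target = np.dot(w, q)                     # conserved element integral
        res = minimize(lambda x: np.dot(w, (x - q) ** 2),
                       x0=np.clip(q, qmin, qmax),
                       method="SLSQP",
                       bounds=[(qmin, qmax)] * q.size,
                       constraints=[{"type": "eq",
                                     "fun": lambda x: np.dot(w, x) - target}])
        return res.x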

  10. A novel energy conversion based method for velocity correction in molecular dynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Hanhui; Collaborative Innovation Center of Advanced Aero-Engine, Hangzhou 310027; Liu, Ningning

    2017-05-01

    Molecular dynamics (MD) simulation has become an important tool for studying micro- or nano-scale dynamics and the statistical properties of fluids and solids. In MD simulations, there are mainly two approaches: equilibrium and non-equilibrium molecular dynamics (EMD and NEMD). In this paper, a new energy conversion based correction (ECBC) method for MD is developed. Unlike the traditional systematic correction based on macroscopic parameters, the ECBC method is developed strictly based on the physical interaction processes between the pair of molecules or atoms. The developed ECBC method can be applied to EMD and NEMD directly. While using MD with this method, the difference between EMD and NEMD is eliminated, and no macroscopic parameters such as externally imposed potentials or coefficients are needed. With this method, many limits on using MD are lifted. The application scope of MD is greatly extended.

  11. Innovative design method of automobile profile based on Fourier descriptor

    NASA Astrophysics Data System (ADS)

    Gao, Shuyong; Fu, Chaoxing; Xia, Fan; Shen, Wei

    2017-10-01

    Aiming at innovation in automobile side contours, this paper presents an innovative design method for the vehicle side profile based on the Fourier descriptor. The design flow of the method is: pre-processing, coordinate extraction, standardization, discrete Fourier transform, simplified Fourier descriptor, descriptor-exchange innovation, and inverse Fourier transform to obtain the outline of the innovative design. Innovative concepts of gene exchange within the same species and gene exchange among different species are presented, and the contours of the corresponding innovative designs are obtained separately. A three-dimensional model of a car is obtained by referring to the profile curve produced by exchanging xenogeneic genes. The feasibility of the method proposed in this paper is verified in various respects.
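
    The transform-exchange-reconstruct flow described above can be sketched with the discrete Fourier transform of the contour treated as a complex signal; the band of descriptors that is swapped, and the function names, are illustrative assumptions.

    import numpy as np

    def fourier_descriptors(contour):
        """Closed contour of shape (N, 2) -> complex Fourier descriptors."""
        z = contour[:, 0] + 1j * contour[:, 1]
        return np.fft.fft(z)

    def exchange_and_rebuild(desc_a, desc_b, low=2, high=10):
        """Swap a symmetric band of low-order descriptors of profile B into
        profile A (both of the same length), then reconstruct the new profile."""
        mixed = desc_a.copy()
        for k in range(low, high):
            mixed[k], mixed[-k] = desc_b[k], desc_b[-k]
        pts = np.fft.ifft(mixed)
        return np.column_stack((pts.real, pts.imag))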

  12. A phase match based frequency estimation method for sinusoidal signals

    NASA Astrophysics Data System (ADS)

    Shen, Yan-Lin; Tu, Ya-Qing; Chen, Lin-Jun; Shen, Ting-Ao

    2015-04-01

    Accurate frequency estimation affects the ranging precision of linear frequency modulated continuous wave (LFMCW) radars significantly. To improve the ranging precision of LFMCW radars, a phase match based frequency estimation method is proposed. To obtain the frequency estimate, the linear prediction property, autocorrelation, and cross correlation of sinusoidal signals are utilized. The analysis of computational complexity shows that the computational load of the proposed method is smaller than those of two-stage autocorrelation (TSA) and maximum likelihood. Simulations and field experiments are performed to validate the proposed method, and the results demonstrate that the proposed method has better performance in terms of frequency estimation precision than Pisarenko harmonic decomposition, modified covariance, and TSA methods, which contributes to effectively improving the ranging precision of LFMCW radars.
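
    For orientation, a simple correlation/phase-based frequency estimator in the same spirit (though not the paper's exact phase-match algorithm) reads the frequency off the phase of the lag-one autocorrelation of the analytic signal:

    import numpy as np
    from scipy.signal import hilbert

    def estimate_frequency(x, fs):
        """Frequency (Hz) of a noisy real sinusoid sampled at fs."""
        z = hilbert(np.asarray(x, dtype=float))      # analytic signal
        r1 = np.mean(z[1:] * np.conj(z[:-1]))        # lag-1 autocorrelation
        return np.angle(r1) * fs / (2.0 * np.pi)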

  13. Evaluation of two outlier-detection-based methods for detecting tissue-selective genes from microarray data.

    PubMed

    Kadota, Koji; Konishi, Tomokazu; Shimizu, Kentaro

    2007-05-01

    Large-scale expression profiling using DNA microarrays enables identification of tissue-selective genes for which expression is considerably higher and/or lower in some tissues than in others. Among numerous possible methods, only two outlier-detection-based methods (an AIC-based method and Sprent's non-parametric method) can treat equally various types of selective patterns, but they produce substantially different results. We investigated the performance of these two methods for different parameter settings and for a reduced number of samples. We focused on their ability to detect selective expression patterns robustly. We applied them to public microarray data collected from 36 normal human tissue samples and analyzed the effects of both changing the parameter settings and reducing the number of samples. The AIC-based method was more robust in both cases. The findings confirm that the use of the AIC-based method in the recently proposed ROKU method for detecting tissue-selective expression patterns is correct and that Sprent's method is not suitable for ROKU.

  14. Method for 3D profilometry measurement based on contouring moire fringe

    NASA Astrophysics Data System (ADS)

    Shi, Zhiwei; Lin, Juhua

    2007-12-01

    3D shape measurement has recently been one of the most active branches of optical research. A method of 3D profilometry measurement combining the moiré projection method and phase-shifting technology under SCM (Single Chip Microcomputer) control is presented in this paper. By applying this method, automatic measurement of 3D surface profiles can be carried out with high speed and high precision.
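
    The phase-shifting step mentioned above is commonly realized with the standard four-step formula (four fringe images shifted by quarter periods); the paper's exact projection and control configuration may differ, so this is only a reference sketch.

    import numpy as np

    def four_step_phase(I1, I2, I3, I4):
        """Wrapped phase from four fringe images shifted by 0, pi/2, pi, 3*pi/2."""
        return np.arctan2(I4 - I2, I1 - I3)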

  15. Nearest neighbor-density-based clustering methods for large hyperspectral images

    NASA Astrophysics Data System (ADS)

    Cariou, Claude; Chehdi, Kacem

    2017-10-01

    We address the problem of hyperspectral image (HSI) pixel partitioning using nearest neighbor - density-based (NN-DB) clustering methods. NN-DB methods are able to cluster objects without specifying the number of clusters to be found. Within the NN-DB approach, we focus on deterministic methods, e.g. ModeSeek, knnClust, and GWENN (standing for Graph WatershEd using Nearest Neighbors). These methods only require the availability of a k-nearest neighbor (kNN) graph based on a given distance metric. Recently, a new DB clustering method, called Density Peak Clustering (DPC), has received much attention, and kNN versions of it have quickly followed and shown their efficiency. However, NN-DB methods still suffer from the difficulty of obtaining the kNN graph due to the quadratic complexity with respect to the number of pixels. This is why GWENN was embedded into a multiresolution (MR) scheme to bypass the computation of the full kNN graph over the image pixels. In this communication, we propose to extend the MR-GWENN scheme in three aspects. Firstly, similarly to knnClust, the original labeling rule of GWENN is modified to account for local density values, in addition to the labels of previously processed objects. Secondly, we set up a modified NN search procedure within the MR scheme, in order to stabilize the number of clusters found from the coarsest to the finest spatial resolution. Finally, we show that these extensions can be easily adapted to the three other NN-DB methods (ModeSeek, knnClust, knnDPC) for pixel clustering in large HSIs. Experiments are conducted to compare the four NN-DB methods for pixel clustering in HSIs. We show that NN-DB methods can outperform a classical clustering method such as fuzzy c-means (FCM), in terms of classification accuracy, relevance of found clusters, and clustering speed. Finally, we demonstrate the feasibility and evaluate the performances of NN-DB methods on a very large image acquired by our AISA Eagle hyperspectral
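
    A compact illustration of the nearest-neighbor density-based idea (in the spirit of ModeSeek/knnDPC, without GWENN's ordering or the multiresolution scheme) is sketched below; the density estimate and parameters are illustrative assumptions.

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def nn_density_clustering(X, k=20):
        """Attach each point to its nearest denser k-neighbor; points with no
        denser neighbor become modes. The number of clusters is not preset."""
        dist, idx = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
        density = 1.0 / (dist[:, 1:].mean(axis=1) + 1e-12)   # kNN density estimate
        parent = np.arange(len(X))
        for i in range(len(X)):
            neigh, dn = idx[i, 1:], dist[i, 1:]
            denser = density[neigh] > density[i]
            if denser.any():
                parent[i] = neigh[denser][np.argmin(dn[denser])]
        labels = parent.copy()
        for i in range(len(X)):                  # follow links up to each mode
            while labels[i] != labels[labels[i]]:
                labels[i] = labels[labels[i]]
        return labels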

  16. Evaluating user reputation in online rating systems via an iterative group-based ranking method

    NASA Astrophysics Data System (ADS)

    Gao, Jian; Zhou, Tao

    2017-05-01

    Reputation is a valuable asset in online social lives and it has drawn increased attention. Due to the existence of noisy ratings and spamming attacks, how to evaluate user reputation in online rating systems is especially significant. However, most of the previous ranking-based methods either follow a debatable assumption or have unsatisfactory robustness. In this paper, we propose an iterative group-based ranking method by introducing an iterative reputation-allocation process into the original group-based ranking method. More specifically, the reputation of users is calculated based on the weighted sizes of the user rating groups after grouping all users by their rating similarities, and the high reputation users' ratings have larger weights in dominating the corresponding user rating groups. The reputation of users and the user rating group sizes are iteratively updated until they become stable. Results on two real data sets with artificial spammers suggest that the proposed method has better performance than the state-of-the-art methods and its robustness is considerably improved compared with the original group-based ranking method. Our work highlights the positive role of considering users' grouping behaviors towards a better online user reputation evaluation.

  17. Method for Manufacturing Bulk Metallic Glass-Based Strain Wave Gear Components

    NASA Technical Reports Server (NTRS)

    Hofmann, Douglas C. (Inventor); Wilcox, Brian H. (Inventor)

    2017-01-01

    Systems and methods in accordance with embodiments of the invention implement bulk metallic glass-based strain wave gears and strain wave gear components. In one embodiment, a method of fabricating a strain wave gear includes: shaping a BMG-based material using a mold in conjunction with one of a thermoplastic forming technique and a casting technique; where the BMG-based material is shaped into one of: a wave generator plug, an inner race, an outer race, a rolling element, a flexspline, a flexspline without a set of gear teeth, a circular spline, a circular spline without a set of gear teeth, a set of gear teeth to be incorporated within a flexspline, and a set of gear teeth to be incorporated within a circular spline.

  18. A Hybrid Wavelet-Based Method for the Peak Detection of Photoplethysmography Signals.

    PubMed

    Li, Suyi; Jiang, Shanqing; Jiang, Shan; Wu, Jiang; Xiong, Wenji; Diao, Shu

    2017-01-01

    The noninvasive peripheral oxygen saturation (SpO2) and the pulse rate can be extracted from photoplethysmography (PPG) signals. However, the accuracy of the extraction is directly affected by the quality of the signal obtained and the peak of the signal identified; therefore, a hybrid wavelet-based method is proposed in this study. Firstly, we suppressed the partial motion artifacts and corrected the baseline drift by using a wavelet method based on the principle of wavelet multiresolution. And then, we designed a quadratic spline wavelet modulus maximum algorithm to identify the PPG peaks automatically. To evaluate this hybrid method, a reflective pulse oximeter was used to acquire ten subjects' PPG signals under sitting, raising hand, and gently walking postures, and the peak recognition results on the raw signal and on the corrected signal were compared, respectively. The results showed that the hybrid method not only corrected the morphologies of the signal well but also optimized the peaks identification quality, subsequently elevating the measurement accuracy of SpO2 and the pulse rate. As a result, our hybrid wavelet-based method profoundly optimized the evaluation of respiratory function and heart rate variability analysis.

  19. A Hybrid Wavelet-Based Method for the Peak Detection of Photoplethysmography Signals

    PubMed Central

    Jiang, Shanqing; Jiang, Shan; Wu, Jiang; Xiong, Wenji

    2017-01-01

    The noninvasive peripheral oxygen saturation (SpO2) and the pulse rate can be extracted from photoplethysmography (PPG) signals. However, the accuracy of the extraction is directly affected by the quality of the signal obtained and the peak of the signal identified; therefore, a hybrid wavelet-based method is proposed in this study. Firstly, we suppressed the partial motion artifacts and corrected the baseline drift by using a wavelet method based on the principle of wavelet multiresolution. And then, we designed a quadratic spline wavelet modulus maximum algorithm to identify the PPG peaks automatically. To evaluate this hybrid method, a reflective pulse oximeter was used to acquire ten subjects' PPG signals under sitting, raising hand, and gently walking postures, and the peak recognition results on the raw signal and on the corrected signal were compared, respectively. The results showed that the hybrid method not only corrected the morphologies of the signal well but also optimized the peaks identification quality, subsequently elevating the measurement accuracy of SpO2 and the pulse rate. As a result, our hybrid wavelet-based method profoundly optimized the evaluation of respiratory function and heart rate variability analysis. PMID:29250135
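
    The two stages described in the records above (wavelet-based baseline/artifact suppression, then peak picking) can be approximated with off-the-shelf tools; scipy's find_peaks below is only a stand-in for the quadratic-spline modulus-maximum detector, and the wavelet, decomposition level and refractory distance are illustrative assumptions.

    import numpy as np
    import pywt
    from scipy.signal import find_peaks

    def detect_ppg_peaks(ppg, fs):
        """Remove baseline drift via the DWT, then pick systolic peaks."""
        ppg = np.asarray(ppg, dtype=float)
        wavelet = pywt.Wavelet("db4")
        level = min(8, pywt.dwt_max_level(len(ppg), wavelet))
        coeffs = pywt.wavedec(ppg, wavelet, level=level)
        coeffs[0] = np.zeros_like(coeffs[0])         # drop coarse approximation (baseline)
        detrended = pywt.waverec(coeffs, wavelet)[: len(ppg)]
        peaks, _ = find_peaks(detrended, distance=int(0.4 * fs))   # >= 0.4 s between beats
        return peaks, detrended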

  20. Comparison of sequencing-based methods to profile DNA methylation and identification of monoallelic epigenetic modifications

    USDA-ARS?s Scientific Manuscript database

    Analysis of DNA methylation patterns relies increasingly on sequencing-based profiling methods. The four most frequently used sequencing-based technologies are the bisulfite-based methods MethylC-seq and reduced representation bisulfite sequencing (RRBS), and the enrichment-based techniques methylat...