Sample records for modified CHNRI methodology

  1. The legacy of the Child Health and Nutrition Research Initiative (CHNRI).

    PubMed

    Black, Robert E

    2016-06-01

    Under the Global Forum for Health Research, the Child Health and Nutrition Research Initiative (CHNRI) began its operations in 1999 and became a Swiss foundation in 2006. The vision of CHNRI was to improve the health and nutrition of all children in low- and middle-income countries (LMIC) through research that informs health policy and practice. Specific objectives included expanding global knowledge on childhood disease burden and the cost-effectiveness of interventions, promoting priority setting in research, ensuring the inclusion of institutions and scientists in LMIC in setting priorities, promoting capacity development in LMIC, and stimulating donors and countries to increase resources for research. CHNRI created a knowledge network, funded research through multiple rounds of a global competitive process and published research papers and policy briefs. A signature effort was the development of a systematic methodology for prioritizing health and nutrition research investments. The "CHNRI method" has been extensively applied to global health problems and is now the most commonly used method for prioritizing health research questions.

  2. Setting research priorities for maternal, newborn, child health and nutrition in India by engaging experts from 256 indigenous institutions contributing over 4000 research ideas: a CHNRI exercise by ICMR and INCLEN.

    PubMed

    Arora, Narendra K; Mohapatra, Archisman; Gopalan, Hema S; Wazny, Kerri; Thavaraj, Vasantha; Rasaily, Reeta; Das, Manoj K; Maheshwari, Meenu; Bahl, Rajiv; Qazi, Shamim A; Black, Robert E; Rudan, Igor

    2017-06-01

    Health research in low- and middle-income countries (LMICs) is often driven by donor priorities rather than by the needs of the countries where the research takes place. This lack of alignment of donors' priorities with local research needs may be one of the reasons why countries fail to achieve set goals for population health and nutrition. India has a high burden of morbidity and mortality in women, children and infants. In order to look forward toward the Sustainable Development Goals, the Indian Council of Medical Research (ICMR) and the INCLEN Trust International (INCLEN) employed the Child Health and Nutrition Research Initiative's (CHNRI) research priority setting method for maternal, neonatal, child health and nutrition with a timeline of 2016-2025. The exercise was the largest use of the CHNRI methodology to date, both in terms of participants and ideas generated, and also expanded on the methodology. CHNRI is a crowdsourcing-based exercise that involves using the collective intelligence of a group of stakeholders, usually researchers, to generate and score research options against a set of criteria. This paper reports on a large umbrella CHNRI that was divided into four theme-specific CHNRIs (maternal, newborn, child health and nutrition). A National Steering Group oversaw the exercise and four theme-specific Research Sub-Committees technically supported finalizing the scoring criteria and refinement of research ideas for the respective thematic areas. The exercise engaged participants from 256 institutions across India: 4003 research ideas were generated by 498 experts and consolidated into 373 research options (maternal health: 122; newborn health: 56; child health: 101; nutrition: 94); 893 experts scored these against five criteria (answerability, relevance, equity, innovation and out-of-box thinking, investment on research). Relative weights for the criteria were assigned by 79 members of the Larger Reference Group. 
    Given India's diversity, priorities were identified at the national level and at three regional levels: (i) the Empowered Action Group (EAG) and North-Eastern States; (ii) States and Union territories in Northern India (including West Bengal); and (iii) States and Union territories in Southern and Western parts of India. The exercise leveraged the inherent flexibility of the CHNRI method in multiple ways. It expanded on the CHNRI methodology, enabling analyses for the identification of research priorities at national and regional levels. However, the prioritization of research options is only valuable if it is put to use, and we hope that donors will take advantage of this prioritized list of research options.
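    The weighting step described in this record can be sketched in code. This is a minimal illustration with hypothetical criterion scores (collective optimism per criterion, on a 0-1 scale) and hypothetical weights; the actual weights assigned by the Larger Reference Group are not reported in this record.

```python
# Sketch of a weighted CHNRI research priority score.
# Criterion scores are the fraction of experts endorsing the research
# option on each criterion (0-1); the weights below are illustrative only.

def weighted_rps(criterion_scores, weights):
    """Weighted mean of criterion scores, scaled to 0-100."""
    if len(criterion_scores) != len(weights):
        raise ValueError("one weight per criterion is required")
    total = sum(w * s for w, s in zip(weights, criterion_scores))
    return 100 * total / sum(weights)

# One hypothetical research option scored on the five ICMR/INCLEN criteria:
# answerability, relevance, equity, innovation and out-of-box thinking,
# investment on research.
scores = [0.90, 0.85, 0.70, 0.60, 0.75]
weights = [1.2, 1.0, 1.1, 0.8, 0.9]
print(round(weighted_rps(scores, weights), 1))  # → 77.1
```

    Research options are then ranked by this score; with all weights equal, the formula reduces to the plain mean of the criterion scores.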

  3. Setting health research priorities using the CHNRI method: VII. A review of the first 50 applications of the CHNRI method.

    PubMed

    Rudan, Igor; Yoshida, Sachiyo; Chan, Kit Yee; Sridhar, Devi; Wazny, Kerri; Nair, Harish; Sheikh, Aziz; Tomlinson, Mark; Lawn, Joy E; Bhutta, Zulfiqar A; Bahl, Rajiv; Chopra, Mickey; Campbell, Harry; El Arifeen, Shams; Black, Robert E; Cousens, Simon

    2017-06-01

    Several recent reviews of the methods used to set research priorities have identified the CHNRI method (acronym derived from the "Child Health and Nutrition Research Initiative") as an approach that has clearly become popular and widely used over the past decade. In this paper we review the first 50 examples of application of the CHNRI method, published between 2007 and 2016, and summarize the most important messages that emerged from those experiences. We conducted a literature review to identify the first 50 examples of application of the CHNRI method in chronological order. We searched Google Scholar, PubMed and the so-called grey literature. Initially, between 2007 and 2011, the CHNRI method was mainly used for setting research priorities to address global child health issues, although the first cases of application outside this field (eg, mental health, disabilities and zoonoses) were also recorded. Since 2012, the CHNRI method has been used more widely, expanding into topics such as adolescent health, dementia, national health policy and education. The majority of the exercises focused on issues relevant only to low- and middle-income countries, and national-level applications are on the rise. The first CHNRI-based articles adhered to the five recommended priority-setting criteria, but by 2016 more than two-thirds of all conducted exercises departed from the recommendations, modifying the CHNRI method to suit each particular exercise. This was done not only by changing the number of criteria used, but also by introducing some entirely new criteria (eg, "low cost", "sustainability", "acceptability", "feasibility", "relevance" and others). The popularity of the CHNRI method in setting health research priorities can be attributed to several key conceptual advances that have addressed common concerns. The method is systematic in nature, offering an acceptable framework for handling many research questions. 
It is also transparent and replicable, because it clearly defines the context and priority-setting criteria. It is democratic, as it relies on "crowd-sourcing". It is inclusive, fostering "ownership" of the results by ensuring that various groups invest in the process. It is very flexible and adjustable to many different contexts and needs. Finally, it is simple and relatively inexpensive to conduct, which we believe is one of the main reasons for its uptake by many groups globally, particularly those in low- and middle-income countries.

  4. Setting research priorities for maternal, newborn, child health and nutrition in India by engaging experts from 256 indigenous institutions contributing over 4000 research ideas: a CHNRI exercise by ICMR and INCLEN

    PubMed Central

    Arora, Narendra K; Mohapatra, Archisman; Gopalan, Hema S; Wazny, Kerri; Thavaraj, Vasantha; Rasaily, Reeta; Das, Manoj K; Maheshwari, Meenu; Bahl, Rajiv; Qazi, Shamim A; Black, Robert E; Rudan, Igor

    2017-01-01

    Background Health research in low- and middle-income countries (LMICs) is often driven by donor priorities rather than by the needs of the countries where the research takes place. This lack of alignment of donors' priorities with local research needs may be one of the reasons why countries fail to achieve set goals for population health and nutrition. India has a high burden of morbidity and mortality in women, children and infants. In order to look forward toward the Sustainable Development Goals, the Indian Council of Medical Research (ICMR) and the INCLEN Trust International (INCLEN) employed the Child Health and Nutrition Research Initiative's (CHNRI) research priority setting method for maternal, neonatal, child health and nutrition with a timeline of 2016-2025. The exercise was the largest use of the CHNRI methodology to date, both in terms of participants and ideas generated, and also expanded on the methodology. Methods CHNRI is a crowdsourcing-based exercise that involves using the collective intelligence of a group of stakeholders, usually researchers, to generate and score research options against a set of criteria. This paper reports on a large umbrella CHNRI that was divided into four theme-specific CHNRIs (maternal, newborn, child health and nutrition). A National Steering Group oversaw the exercise and four theme-specific Research Sub-Committees technically supported finalizing the scoring criteria and refinement of research ideas for the respective thematic areas. The exercise engaged participants from 256 institutions across India: 4003 research ideas were generated by 498 experts and consolidated into 373 research options (maternal health: 122; newborn health: 56; child health: 101; nutrition: 94); 893 experts scored these against five criteria (answerability, relevance, equity, innovation and out-of-box thinking, investment on research). Relative weights for the criteria were assigned by 79 members of the Larger Reference Group. 
    Given India's diversity, priorities were identified at the national level and at three regional levels: (i) the Empowered Action Group (EAG) and North-Eastern States; (ii) States and Union territories in Northern India (including West Bengal); and (iii) States and Union territories in Southern and Western parts of India. Conclusions The exercise leveraged the inherent flexibility of the CHNRI method in multiple ways. It expanded on the CHNRI methodology, enabling analyses for the identification of research priorities at national and regional levels. However, the prioritization of research options is only valuable if it is put to use, and we hope that donors will take advantage of this prioritized list of research options. PMID:28686749

  5. Setting health research priorities using the CHNRI method: VII. A review of the first 50 applications of the CHNRI method

    PubMed Central

    Rudan, Igor; Yoshida, Sachiyo; Chan, Kit Yee; Sridhar, Devi; Wazny, Kerri; Nair, Harish; Sheikh, Aziz; Tomlinson, Mark; Lawn, Joy E; Bhutta, Zulfiqar A; Bahl, Rajiv; Chopra, Mickey; Campbell, Harry; El Arifeen, Shams; Black, Robert E; Cousens, Simon

    2017-01-01

    Background Several recent reviews of the methods used to set research priorities have identified the CHNRI method (acronym derived from the "Child Health and Nutrition Research Initiative") as an approach that has clearly become popular and widely used over the past decade. In this paper we review the first 50 examples of application of the CHNRI method, published between 2007 and 2016, and summarize the most important messages that emerged from those experiences. Methods We conducted a literature review to identify the first 50 examples of application of the CHNRI method in chronological order. We searched Google Scholar, PubMed and the so-called grey literature. Results Initially, between 2007 and 2011, the CHNRI method was mainly used for setting research priorities to address global child health issues, although the first cases of application outside this field (eg, mental health, disabilities and zoonoses) were also recorded. Since 2012, the CHNRI method has been used more widely, expanding into topics such as adolescent health, dementia, national health policy and education. The majority of the exercises focused on issues relevant only to low- and middle-income countries, and national-level applications are on the rise. The first CHNRI-based articles adhered to the five recommended priority-setting criteria, but by 2016 more than two-thirds of all conducted exercises departed from the recommendations, modifying the CHNRI method to suit each particular exercise. This was done not only by changing the number of criteria used, but also by introducing some entirely new criteria (eg, "low cost", "sustainability", "acceptability", "feasibility", "relevance" and others). Conclusions The popularity of the CHNRI method in setting health research priorities can be attributed to several key conceptual advances that have addressed common concerns. The method is systematic in nature, offering an acceptable framework for handling many research questions. 
    It is also transparent and replicable, because it clearly defines the context and priority-setting criteria. It is democratic, as it relies on "crowd-sourcing". It is inclusive, fostering "ownership" of the results by ensuring that various groups invest in the process. It is very flexible and adjustable to many different contexts and needs. Finally, it is simple and relatively inexpensive to conduct, which we believe is one of the main reasons for its uptake by many groups globally, particularly those in low- and middle-income countries. PMID:28685049

  6. Research priorities for adolescent health in low- and middle-income countries: A mixed-methods synthesis of two separate exercises.

    PubMed

    Nagata, Jason M; Hathi, Sejal; Ferguson, B Jane; Hindin, Michele J; Yoshida, Sachiyo; Ross, David A

    2018-06-01

    In order to clarify priorities and stimulate research in adolescent health in low- and middle-income countries (LMICs), the World Health Organization (WHO) conducted two priority-setting exercises based on the Child Health and Nutrition Research Initiative (CHNRI) methodology related to 1) adolescent sexual and reproductive health and 2) eight areas of adolescent health including communicable diseases prevention and management, injuries and violence, mental health, non-communicable diseases management, nutrition, physical activity, substance use, and health policy. Although the CHNRI methodology has been utilized in over 50 separate research priority-setting exercises, none have qualitatively synthesized the ultimate findings across studies. The purpose of this study was to conduct a mixed-methods synthesis of two research priority-setting exercises for adolescent health in LMICs based on the CHNRI methodology and to situate the priority questions within the current global health agenda. All of the 116 top-ranked questions presented in each exercise were analyzed by two independent reviewers. Word clouds were generated based on keywords from the top-ranked questions. Questions were coded and content analysis was conducted based on type of delivery platform, vulnerable populations, and the Survive, Thrive, and Transform framework from the United Nations Global Strategy for Women's, Children's, and Adolescents' Health, 2016-2030. Within the 53 top-ranked intervention-related questions that specified a delivery platform, the platforms specified were schools (n = 17), primary care (n = 12), community (n = 11), parenting (n = 6), virtual media (n = 5), and peers (n = 2). 
Twenty questions specifically focused on vulnerable adolescents, including those living with HIV, tuberculosis, mental illness, or neurodevelopmental disorders; victims of gender-based violence; refugees; young persons who inject drugs; sex workers; slum dwellers; out-of-school youth; and youth in armed conflict. A majority of the top-ranked questions (108/116) aligned with one or a combination of the Survive (n = 39), Thrive (n = 67), and Transform (n = 28) agendas. This study advances the CHNRI methodology by conducting the first mixed-methods synthesis of multiple research priority-setting exercises by analyzing keywords (using word clouds) and themes (using content analysis).

  7. Research priorities for adolescent health in low- and middle-income countries: A mixed-methods synthesis of two separate exercises

    PubMed Central

    Nagata, Jason M; Hathi, Sejal; Ferguson, B Jane; Hindin, Michele J; Yoshida, Sachiyo; Ross, David A

    2018-01-01

    Background In order to clarify priorities and stimulate research in adolescent health in low- and middle-income countries (LMICs), the World Health Organization (WHO) conducted two priority-setting exercises based on the Child Health and Nutrition Research Initiative (CHNRI) methodology related to 1) adolescent sexual and reproductive health and 2) eight areas of adolescent health including communicable diseases prevention and management, injuries and violence, mental health, non-communicable diseases management, nutrition, physical activity, substance use, and health policy. Although the CHNRI methodology has been utilized in over 50 separate research priority-setting exercises, none have qualitatively synthesized the ultimate findings across studies. The purpose of this study was to conduct a mixed-methods synthesis of two research priority-setting exercises for adolescent health in LMICs based on the CHNRI methodology and to situate the priority questions within the current global health agenda. Methods All of the 116 top-ranked questions presented in each exercise were analyzed by two independent reviewers. Word clouds were generated based on keywords from the top-ranked questions. Questions were coded and content analysis was conducted based on type of delivery platform, vulnerable populations, and the Survive, Thrive, and Transform framework from the United Nations Global Strategy for Women's, Children's, and Adolescents' Health, 2016-2030. Findings Within the 53 top-ranked intervention-related questions that specified a delivery platform, the platforms specified were schools (n = 17), primary care (n = 12), community (n = 11), parenting (n = 6), virtual media (n = 5), and peers (n = 2). 
Twenty questions specifically focused on vulnerable adolescents, including those living with HIV, tuberculosis, mental illness, or neurodevelopmental disorders; victims of gender-based violence; refugees; young persons who inject drugs; sex workers; slum dwellers; out-of-school youth; and youth in armed conflict. A majority of the top-ranked questions (108/116) aligned with one or a combination of the Survive (n = 39), Thrive (n = 67), and Transform (n = 28) agendas. Conclusions This study advances the CHNRI methodology by conducting the first mixed-methods synthesis of multiple research priority-setting exercises by analyzing keywords (using word clouds) and themes (using content analysis). PMID:29497507

  8. Setting health research priorities using the CHNRI method: IV. Key conceptual advances

    PubMed Central

    Rudan, Igor

    2016-01-01

    Introduction The Child Health and Nutrition Research Initiative (CHNRI) started as an initiative of the Global Forum for Health Research in Geneva, Switzerland. Its aim was to develop a method that could assist priority setting in health research investments. The first version of the CHNRI method was published in 2007-2008. The aim of this paper was to summarize the history of the development of the CHNRI method and its key conceptual advances. Methods The guiding principle of the CHNRI method is to expose the potential of many competing health research ideas to reduce disease burden and inequities that exist in the population in a feasible and cost-effective way. Results The CHNRI method introduced three key conceptual advances that led to its increased popularity in comparison to other priority-setting methods and processes. First, it proposed a systematic approach to listing a large number of possible research ideas, using the "4D" framework (description, delivery, development and discovery research) and a well-defined "depth" of proposed research ideas (research instruments, avenues, options and questions). Second, it proposed a systematic approach for discriminating between many proposed research ideas based on a well-defined context and criteria. The five "standard" components of the context are the population of interest, the disease burden of interest, geographic limits, time scale and the preferred style of investing with respect to risk. The five "standard" criteria proposed for prioritization between research ideas are answerability, effectiveness, deliverability, maximum potential for disease burden reduction and the effect on equity. However, both the context and the criteria can be flexibly changed to meet the specific needs of each priority-setting exercise. Third, it facilitated consensus development through measuring collective optimism on each component of each research idea among a larger group of experts using a simple scoring system. 
    This enabled the use of the knowledge of many experts in the field, "visualising" their collective opinion and presenting the list of many research ideas with their ranks, based on an intuitive score that ranges between 0 and 100. Conclusions Two recent reviews showed that the CHNRI method, an approach essentially based on "crowdsourcing", has become the dominant approach to setting health research priorities in the global biomedical literature over the past decade. With more than 50 published examples of implementation to date, it is now widely used in many international organisations for collective decision-making on health research priorities. The applications have been helpful in promoting a better balance between investments in fundamental research, translation research and implementation research. PMID:27418959

  9. Setting health research priorities using the CHNRI method: IV. Key conceptual advances.

    PubMed

    Rudan, Igor

    2016-06-01

    The Child Health and Nutrition Research Initiative (CHNRI) started as an initiative of the Global Forum for Health Research in Geneva, Switzerland. Its aim was to develop a method that could assist priority setting in health research investments. The first version of the CHNRI method was published in 2007-2008. The aim of this paper was to summarize the history of the development of the CHNRI method and its key conceptual advances. The guiding principle of the CHNRI method is to expose the potential of many competing health research ideas to reduce disease burden and inequities that exist in the population in a feasible and cost-effective way. The CHNRI method introduced three key conceptual advances that led to its increased popularity in comparison to other priority-setting methods and processes. First, it proposed a systematic approach to listing a large number of possible research ideas, using the "4D" framework (description, delivery, development and discovery research) and a well-defined "depth" of proposed research ideas (research instruments, avenues, options and questions). Second, it proposed a systematic approach for discriminating between many proposed research ideas based on a well-defined context and criteria. The five "standard" components of the context are the population of interest, the disease burden of interest, geographic limits, time scale and the preferred style of investing with respect to risk. The five "standard" criteria proposed for prioritization between research ideas are answerability, effectiveness, deliverability, maximum potential for disease burden reduction and the effect on equity. However, both the context and the criteria can be flexibly changed to meet the specific needs of each priority-setting exercise. Third, it facilitated consensus development through measuring collective optimism on each component of each research idea among a larger group of experts using a simple scoring system. 
This enabled the use of the knowledge of many experts in the field, "visualising" their collective opinion and presenting the list of many research ideas with their ranks, based on an intuitive score that ranges between 0 and 100. Two recent reviews showed that the CHNRI method, an approach essentially based on "crowdsourcing", has become the dominant approach to setting health research priorities in the global biomedical literature over the past decade. With more than 50 published examples of implementation to date, it is now widely used in many international organisations for collective decision-making on health research priorities. The applications have been helpful in promoting better balance between investments in fundamental research, translation research and implementation research.

  10. An evaluation of emerging vaccines for childhood meningococcal disease

    PubMed Central

    2011-01-01

    Background Meningococcal meningitis is a major cause of disease worldwide, with frequent epidemics particularly affecting an area of sub-Saharan Africa known as the "meningitis belt". Neisseria meningitidis group A (MenA) is responsible for major epidemics in Africa. Recently, W-135 has emerged as an important pathogen. Currently, the strategy for control of such outbreaks is emergency use of meningococcal (MC) polysaccharide vaccines, but these have a limited ability to induce herd immunity and to elicit an adequate immune response in infants and young children. More recently, initiatives have been taken to introduce meningococcal conjugate vaccines in these African countries. Currently there are two different types of MC conjugate vaccines at late stages of development covering serogroups A and W-135: a multivalent MC conjugate vaccine against serogroups A, C, Y and W-135; and a monovalent conjugate vaccine against serogroup A. We aimed to perform a structured assessment of these emerging meningococcal vaccines as a means of reducing the global meningococcal disease burden among children under 5 years of age. Methods We used a modified CHNRI methodology for setting priorities in health research investments. This was done in two stages. In the first stage we systematically reviewed the literature related to emerging MC vaccines relevant to 12 criteria of interest. In Stage II, we conducted an expert opinion exercise by inviting 20 experts (leading basic scientists, international public health researchers, international policy makers and representatives of pharmaceutical companies). They answered questions from the CHNRI framework and their "collective optimism" towards each criterion was documented on a scale from 0 to 100%. Results For the MenA conjugate vaccine the experts showed a very high level of optimism (~90% or more) for 7 out of the 12 criteria. The experts felt that the likelihood of efficacy on meningitis was very high (~90%). 
    Deliverability, acceptability to health workers and end users, and the effect on equity were all seen as highly likely (~90%). In terms of the maximum potential impact on meningitis disease burden, the median potential effectiveness of the vaccines in reducing overall meningitis mortality was estimated to be 20% (interquartile range 20-40%; min 8%, max 50%). For the multivalent meningococcal vaccines the experts had similar optimism for most of the 12 CHNRI criteria, with slightly lower optimism on the answerability and low development cost criteria. The main concern was expressed over the cost of the product, its affordability and the cost of implementation. Conclusions With increasing recognition of the burden of meningococcal meningitis, especially during epidemics in Africa, it is vitally important that strategies are implemented to reduce the morbidity and mortality attributable to this disease. Improved MC vaccines are a promising investment that could substantially contribute to the reduction of child meningitis mortality worldwide. PMID:21501447

  11. An evaluation of the emerging vaccines against influenza in children

    PubMed Central

    2013-01-01

    Background Influenza is an under-appreciated cause of acute lower respiratory infections (ALRI) in children. It is estimated to cause approximately 20 million new episodes of ALRI in children annually, 97% of these occurring in developing countries. It is also estimated to result in 28000 to 112000 deaths annually in young children. Apart from hospitalisations and deaths, influenza has significant economic consequences. The current egg-based inactivated influenza vaccines have several limitations: the need for annual vaccination, high production costs, and an inability to respond adequately to demand during pandemics. Methods We used a modified CHNRI methodology for setting priorities in health research investments. This was done in two stages. In Stage I, we systematically reviewed the literature related to emerging cross-protective vaccines against influenza relevant to several criteria of interest: answerability; cost of development, production and implementation; efficacy and effectiveness; deliverability, affordability and sustainability; maximum potential impact on disease burden reduction; acceptability to the end users and health workers; and effect on equity. In Stage II, we conducted an expert opinion exercise by inviting 20 experts (leading basic scientists, international public health researchers, international policy makers and representatives of pharmaceutical companies). They answered questions from the CHNRI framework and their "collective optimism" towards each criterion was documented on a scale from 0 to 100%. Results The experts expressed a very high level of optimism for deliverability, impact on equity, and acceptability to health workers and end users. However, they expressed concerns over the criteria of answerability, low development cost, low product cost, low implementation cost, affordability and, to a lesser extent, sustainability. 
    In addition, they felt that the vaccine would have higher efficacy and impact on disease burden reduction for overall influenza-associated disease rather than specifically for influenza-associated pneumonia. Conclusion Although the landscape of emerging influenza vaccines shows several promising candidates, it is unlikely that advancements in the newer vaccine technologies will progress through to large-scale production in the near future. The combined effects of continued investment in researching new vaccines and improving available vaccines will hopefully shorten the time needed to develop an effective seasonal and pandemic influenza vaccine suitable for large-scale production. PMID:24564565

  12. An evaluation of the emerging vaccines against influenza in children.

    PubMed

    Nair, Harish; Lau, Eva; Brooks, W; Seong, Ang; Theodoratou, Evropi; Zgaga, Lina; Huda, Tanvir; Jadhav, Suresh S; Rudan, Igor; Campbell, Harry

    2013-01-01

    Influenza is an under-appreciated cause of acute lower respiratory infections (ALRI) in children. It is estimated to cause approximately 20 million new episodes of ALRI in children annually, 97% of these occurring in developing countries. It is also estimated to result in 28000 to 112000 deaths annually in young children. Apart from hospitalisations and deaths, influenza has significant economic consequences. The current egg-based inactivated influenza vaccines have several limitations: the need for annual vaccination, high production costs, and an inability to respond adequately to demand during pandemics. We used a modified CHNRI methodology for setting priorities in health research investments. This was done in two stages. In Stage I, we systematically reviewed the literature related to emerging cross-protective vaccines against influenza relevant to several criteria of interest: answerability; cost of development, production and implementation; efficacy and effectiveness; deliverability, affordability and sustainability; maximum potential impact on disease burden reduction; acceptability to the end users and health workers; and effect on equity. In Stage II, we conducted an expert opinion exercise by inviting 20 experts (leading basic scientists, international public health researchers, international policy makers and representatives of pharmaceutical companies). They answered questions from the CHNRI framework and their "collective optimism" towards each criterion was documented on a scale from 0 to 100%. The experts expressed a very high level of optimism for deliverability, impact on equity, and acceptability to health workers and end users. However, they expressed concerns over the criteria of answerability, low development cost, low product cost, low implementation cost, affordability and, to a lesser extent, sustainability. 
In addition, they felt that the vaccine would have greater efficacy against, and a greater impact on, overall influenza-associated disease than on influenza-associated pneumonia specifically. Although the landscape of emerging influenza vaccines shows several promising candidates, it is unlikely that these newer vaccine technologies will progress to large-scale production in the near future. The combined effects of continued investment in researching new vaccines and improving available vaccines will hopefully shorten the time needed to develop an effective seasonal and pandemic influenza vaccine suitable for large-scale production.
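The "collective optimism" aggregation used in this and other modified CHNRI exercises can be sketched in a few lines. This is a minimal illustration assuming the weighting convention commonly described for CHNRI scoring (Yes = 1, No = 0, Not sure = 0.5, "Don't know" unscored); the weights and the toy panel are assumptions, not values from this record:

```python
# Assumed answer weights (illustrative; "don't know" answers are excluded).
WEIGHTS = {"yes": 1.0, "no": 0.0, "not sure": 0.5}

def collective_optimism(answers):
    """Aggregate one criterion's categorical answers into a 0-100% score."""
    scored = [WEIGHTS[a] for a in answers if a in WEIGHTS]
    if not scored:
        return None  # no informative answers for this criterion
    return 100.0 * sum(scored) / len(scored)

# Toy panel of 20 experts answering one CHNRI criterion question
panel = ["yes"] * 12 + ["not sure"] * 4 + ["no"] * 2 + ["don't know"] * 2
optimism = collective_optimism(panel)  # about 77.8%
```

Under this scheme, scores near 100% would correspond to the "very high levels of optimism" reported for some criteria, and scores near 50% to the criteria the experts were concerned about.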

  13. Setting priorities for zinc-related health research to reduce children's disease burden worldwide: an application of the Child Health and Nutrition Research Initiative's research priority-setting method.

    PubMed

    Brown, Kenneth H; Hess, Sonja Y; Boy, Erick; Gibson, Rosalind S; Horton, Susan; Osendarp, Saskia J; Sempertegui, Fernando; Shrimpton, Roger; Rudan, Igor

    2009-03-01

To make the best use of limited resources for supporting health-related research to reduce child mortality, it is necessary to apply a suitable method to rank competing research options. The Child Health and Nutrition Research Initiative (CHNRI) developed a new methodology for setting health research priorities. To broaden experience with this priority-setting technique, we applied the method to rank possible research priorities concerning the control of Zn deficiency. Although Zn deficiency is not generally recognized as a direct cause of child mortality, recent research indicates that it predisposes children to an increased incidence and severity of several of the major direct causes of morbidity and mortality. Leading experts in the field of Zn research in child health were identified and invited to participate in a technical working group (TWG) to establish research priorities. The individuals were chosen to represent a wide range of expertise in Zn nutrition. The seven TWG members submitted a total of ninety research options, which were then consolidated into a final list of thirty-one research options categorized by the type of resulting intervention. The identified priorities were dominated by research investment options targeting Zn supplementation, followed by research on Zn fortification, general aspects of Zn nutrition, dietary modification and other new interventions. In general, research options that aim to improve the efficiency of an already existing intervention strategy received higher priority scores. Challenges identified during the implementation of the methodology and suggestions to modify the priority-setting procedures are discussed.

  14. Setting Priorities in Global Child Health Research Investments: Guidelines for Implementation of the CHNRI Method

    PubMed Central

    Rudan, Igor; Gibson, Jennifer L.; Ameratunga, Shanthi; El Arifeen, Shams; Bhutta, Zulfiqar A.; Black, Maureen; Black, Robert E.; Brown, Kenneth H.; Campbell, Harry; Carneiro, Ilona; Chan, Kit Yee; Chandramohan, Daniel; Chopra, Mickey; Cousens, Simon; Darmstadt, Gary L.; Gardner, Julie Meeks; Hess, Sonja Y.; Hyder, Adnan A.; Kapiriri, Lydia; Kosek, Margaret; Lanata, Claudio F.; Lansang, Mary Ann; Lawn, Joy; Tomlinson, Mark; Tsai, Alexander C.; Webster, Jayne

    2008-01-01

This article provides detailed guidelines for the implementation of the systematic method for setting priorities in health research investments that was recently developed by the Child Health and Nutrition Research Initiative (CHNRI). The target audience for the proposed method comprises international agencies, large research funding donors, and national governments and policy-makers. The process has the following steps: (i) selecting the managers of the process; (ii) specifying the context and risk management preferences; (iii) discussing criteria for setting health research priorities; (iv) choosing a limited set of the most useful and important criteria; (v) developing means to assess the likelihood that proposed health research options will satisfy the selected criteria; (vi) systematic listing of a large number of proposed health research options; (vii) pre-scoring check of all competing health research options; (viii) scoring of health research options using the chosen set of criteria; (ix) calculating intermediate scores for each health research option; (x) obtaining further input from the stakeholders; (xi) adjusting intermediate scores taking into account the values of stakeholders; (xii) calculating overall priority scores and assigning ranks; (xiii) performing an analysis of agreement between the scorers; (xiv) linking computed research priority scores with investment decisions; (xv) feedback and revision. The CHNRI method is a flexible process that enables prioritization of health research investments at any level: institutional, regional, national, international, or global. PMID:19090596
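Steps (ix) through (xii) of the guidelines, computing intermediate scores and adjusting them by stakeholder-assigned weights into an overall priority score, can be sketched as follows. The criterion names, scores and weights below are hypothetical, chosen only to illustrate the arithmetic:

```python
def priority_score(criterion_scores, stakeholder_weights):
    """Weight each criterion's intermediate score by the normalised
    stakeholder weight and sum into one overall priority score."""
    total = sum(stakeholder_weights.values())
    return sum(criterion_scores[c] * w / total
               for c, w in stakeholder_weights.items())

# One hypothetical research option scored against three criteria
option = {"answerability": 0.9, "effectiveness": 0.7, "equity": 0.6}
weights = {"answerability": 1.0, "effectiveness": 2.0, "equity": 1.5}
score = priority_score(option, weights)  # about 0.711
```

Options would then be ranked by descending score to assign priorities, before the agreement analysis and investment-decision steps.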

  15. An evaluation of respiratory administration of measles vaccine for prevention of acute lower respiratory infections in children.

    PubMed

    Higginson, Daisy; Theodoratou, Evropi; Nair, Harish; Huda, Tanvir; Zgaga, Lina; Jadhav, Suresh S; Omer, Saad B; Rudan, Igor; Campbell, Harry

    2011-04-13

Measles was responsible for an estimated 100,000 deaths worldwide in 2008. Despite being a vaccine-preventable disease, measles remains a major cause of morbidity and mortality in young children. Although a safe and effective injectable measles vaccine has been available for over 50 years, it has not been possible to achieve the uniformly high levels of coverage (required to achieve measles eradication) in most parts of the developing world. Aerosolised measles vaccines are now under development with the hope of overcoming the delivery factors that currently limit the coverage of the existing vaccine. We used a modified CHNRI methodology for setting priorities in health research investments to assess the strengths and weaknesses of this emerging intervention to decrease the burden of childhood pneumonia. This was done in two stages. In Stage I, we systematically reviewed the literature related to emerging aerosol vaccines against measles relevant to several criteria of interest. Although there are a number of different aerosol vaccine approaches under development, for the purpose of this exercise, all were considered as one intervention. The criteria of interest were: answerability; cost of development, production and implementation; efficacy and effectiveness; deliverability, affordability and sustainability; maximum potential impact on disease burden reduction; acceptability to the end users and health workers; and effect on equity. In Stage II, we conducted an expert opinion exercise by inviting 20 experts (leading basic scientists, international public health researchers, international policy makers and representatives of pharmaceutical companies). The policy makers and industry representatives accepted our invitation on the condition of anonymity, due to the sensitive nature of their involvement in such exercises. They answered questions from the CHNRI framework and their "collective optimism" towards each criterion was documented on a scale from 0 to 100%. 
The panel of experts expressed mixed feelings about an aerosol measles vaccine. The group expressed low levels of optimism regarding the criteria of likelihood of efficacy and low cost of development (scores around 50%); moderate levels of optimism regarding answerability, low cost of production, low cost of implementation and affordability (scores around 60%); and high levels of optimism regarding deliverability, impact on equity and acceptability to health workers and end-users (scores over 80%). Finally, the experts felt that this intervention would have a modest but nevertheless important impact on reducing the burden of disease due to childhood pneumonia (median: 5%, interquartile range 1-15%, minimum 0%, maximum 45%). Aerosol measles vaccine is at an advanced stage of development, with evidence of good immunogenicity. This new intervention will be presented as a feasible candidate strategy in the campaign for global elimination of measles. It also presents a unique opportunity to decrease the overall burden of disease due to severe pneumonia in young children.

  16. An evaluation of the emerging interventions against Respiratory Syncytial Virus (RSV)-associated acute lower respiratory infections in children.

    PubMed

    Nair, Harish; Verma, Vasundhara R; Theodoratou, Evropi; Zgaga, Lina; Huda, Tanvir; Simões, Eric A F; Wright, Peter F; Rudan, Igor; Campbell, Harry

    2011-04-13

    Respiratory Syncytial Virus (RSV) is the leading cause of acute lower respiratory infections (ALRI) in children. It is estimated to cause approximately 33.8 million new episodes of ALRI in children annually, 96% of these occurring in developing countries. It is also estimated to result in about 53,000 to 199,000 deaths annually in young children. Currently there are several vaccine and immunoprophylaxis candidates against RSV in the developmental phase targeting active and passive immunization. We used a modified CHNRI methodology for setting priorities in health research investments. This was done in two stages. In Stage I, we systematically reviewed the literature related to emerging vaccines against RSV relevant to 12 criteria of interest. In Stage II, we conducted an expert opinion exercise by inviting 20 experts (leading basic scientists, international public health researchers, international policy makers and representatives of pharmaceutical companies). The policy makers and industry representatives accepted our invitation on the condition of anonymity, due to the sensitive nature of their involvement in such exercises. They answered questions from the CHNRI framework and their "collective optimism" towards each criterion was documented on a scale from 0 to 100%. In the case of candidate vaccines for active immunization of infants against RSV, the experts expressed very low levels of optimism for low product cost, affordability and low cost of development; moderate levels of optimism regarding the criteria of answerability, likelihood of efficacy, deliverability, sustainability and acceptance to end users for the interventions; and high levels of optimism regarding impact on equity and acceptance to health workers. 
While considering the candidate vaccines targeting pregnant women, the panel expressed low levels of optimism for low product cost, affordability, answerability and low development cost; moderate levels of optimism for likelihood of efficacy, deliverability, sustainability and impact on equity; and high levels of optimism regarding acceptance to end users and health workers. The group also evaluated immunoprophylaxis against RSV using monoclonal antibodies and expressed no optimism towards low product cost; very low levels of optimism regarding deliverability, affordability, sustainability, low implementation cost and impact on equity; moderate levels of optimism regarding the criteria of answerability, likelihood of efficacy, and acceptance to end-users and health workers; and high levels of optimism regarding low development cost. They felt that either of these vaccines would have a high impact on reducing the burden of childhood ALRI due to RSV and would reduce the overall childhood ALRI burden by a maximum of about 10%. Although monoclonal antibodies have proven to be effective in providing protection to high-risk infants, their introduction in resource-poor settings might be limited by the high costs associated with them. Candidate vaccines for active immunization of infants against RSV hold the greatest promise. Introduction of a low-cost vaccine against RSV would reduce the inequitable distribution of the burden of childhood ALRI and would most likely have a high impact on morbidity and mortality due to severe ALRI.

  17. Setting health research priorities using the CHNRI method: VI. Quantitative properties of human collective opinion

    PubMed Central

    Yoshida, Sachiyo; Rudan, Igor; Cousens, Simon

    2016-01-01

Introduction Crowdsourcing has become an increasingly important tool for addressing many problems, from government elections in democracies and stock market prices to modern online tools such as TripAdvisor or the Internet Movie Database (IMDB). The CHNRI method (the acronym for the Child Health and Nutrition Research Initiative) for setting health research priorities has crowdsourcing as its major component, which it uses to generate, assess and prioritize between many competing health research ideas. Methods We conducted a series of analyses using data from a group of 91 scorers to explore the quantitative properties of their collective opinion. We were interested in the stability of their collective opinion as the sample size increases from 15 to 90. From a pool of 91 scorers who took part in a previous CHNRI exercise, we used sampling with replacement to generate multiple random samples of different sizes. First, for each sample generated, we identified the top 20 ranked research ideas, among the 205 that were proposed and scored, and calculated the concordance with the ranking generated by the 91 original scorers. Second, we used rank correlation coefficients to compare the ranks assigned to all 205 proposed research ideas when samples of different sizes are used. We also analysed the original pool of 91 scorers to look for evidence of scoring variations based on scorers' characteristics. Results The sample sizes investigated ranged from 15 to 90. The concordance for the top 20 scored research ideas increased with sample sizes up to about 55 experts. At this point, the median level of concordance stabilized at 15/20 top ranked questions (75%), with the interquartile range also generally stable (14–16). There was little further increase in overlap when the sample size increased from 55 to 90. 
When analysing the ranking of all 205 ideas, the rank correlation coefficient increased as the sample size increased, with a median correlation of 0.95 reached at the sample size of 45 experts (median of the rank correlation coefficient = 0.95; IQR 0.94–0.96). Conclusions Our analyses suggest that the collective opinion of an expert group on a large number of research ideas, expressed through categorical variables (Yes/No/Not Sure/Don't know), stabilises relatively quickly in terms of identifying the ideas with the most support. In this exercise, a high degree of reproducibility of the identified research priorities was achieved with as few as 45–55 experts. PMID:27350874
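The resampling design this study describes (bootstrap samples of scorers, with each sample's top-20 ideas compared against the full panel's top-20) can be sketched on synthetic data. Everything below, the random score matrix, the sample sizes and the trial counts, is an illustrative stand-in for the real 91-scorer, 205-idea data set:

```python
import random

random.seed(1)
N_SCORERS, N_IDEAS, K = 91, 205, 20

# Synthetic stand-in for a CHNRI score matrix: one row per scorer.
scores = [[random.random() for _ in range(N_IDEAS)] for _ in range(N_SCORERS)]

def top_k(rows, k=K):
    """Rank ideas by mean score over the given scorers; return the top-k ids."""
    means = [sum(r[i] for r in rows) / len(rows) for i in range(N_IDEAS)]
    return set(sorted(range(N_IDEAS), key=lambda i: -means[i])[:k])

reference = top_k(scores)  # ranking from the full panel of 91 scorers

def median_concordance(sample_size, trials=100):
    """Median top-K overlap between bootstrap samples and the full panel."""
    overlaps = sorted(
        len(top_k([random.choice(scores) for _ in range(sample_size)]) & reference)
        for _ in range(trials))
    return overlaps[trials // 2]

small, large = median_concordance(15), median_concordance(55)
```

Even on this pure-noise data the overlap rises with sample size, because larger samples share more scorers with the full panel; on real scores, where ideas genuinely differ in support, the stabilisation reported above appears as a plateau in the same curve.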

  18. Setting health research priorities using the CHNRI method: VI. Quantitative properties of human collective opinion.

    PubMed

    Yoshida, Sachiyo; Rudan, Igor; Cousens, Simon

    2016-06-01

Crowdsourcing has become an increasingly important tool for addressing many problems, from government elections in democracies and stock market prices to modern online tools such as TripAdvisor or the Internet Movie Database (IMDB). The CHNRI method (the acronym for the Child Health and Nutrition Research Initiative) for setting health research priorities has crowdsourcing as its major component, which it uses to generate, assess and prioritize between many competing health research ideas. We conducted a series of analyses using data from a group of 91 scorers to explore the quantitative properties of their collective opinion. We were interested in the stability of their collective opinion as the sample size increases from 15 to 90. From a pool of 91 scorers who took part in a previous CHNRI exercise, we used sampling with replacement to generate multiple random samples of different sizes. First, for each sample generated, we identified the top 20 ranked research ideas, among the 205 that were proposed and scored, and calculated the concordance with the ranking generated by the 91 original scorers. Second, we used rank correlation coefficients to compare the ranks assigned to all 205 proposed research ideas when samples of different sizes are used. We also analysed the original pool of 91 scorers to look for evidence of scoring variations based on scorers' characteristics. The sample sizes investigated ranged from 15 to 90. The concordance for the top 20 scored research ideas increased with sample sizes up to about 55 experts. At this point, the median level of concordance stabilized at 15/20 top ranked questions (75%), with the interquartile range also generally stable (14-16). There was little further increase in overlap when the sample size increased from 55 to 90. 
When analysing the ranking of all 205 ideas, the rank correlation coefficient increased as the sample size increased, with a median correlation of 0.95 reached at the sample size of 45 experts (median of the rank correlation coefficient = 0.95; IQR 0.94-0.96). Our analyses suggest that the collective opinion of an expert group on a large number of research ideas, expressed through categorical variables (Yes/No/Not Sure/Don't know), stabilises relatively quickly in terms of identifying the ideas with the most support. In this exercise, a high degree of reproducibility of the identified research priorities was achieved with as few as 45-55 experts.

  19. Setting research priorities to reduce global mortality from preterm birth and low birth weight by 2015.

    PubMed

    Bahl, Rajiv; Martines, Jose; Bhandari, Nita; Biloglav, Zrinka; Edmond, Karen; Iyengar, Sharad; Kramer, Michael; Lawn, Joy E; Manandhar, D S; Mori, Rintaro; Rasmussen, Kathleen M; Sachdev, H P S; Singhal, Nalini; Tomlinson, Mark; Victora, Cesar; Williams, Anthony F; Chan, Kit Yee; Rudan, Igor

    2012-06-01

This paper aims to identify health research priorities that could improve the rate of progress in reducing global neonatal mortality from preterm birth and low birth weight (PB/LBW), as set out in the UN's Millennium Development Goal 4. We applied the Child Health and Nutrition Research Initiative (CHNRI) methodology for setting priorities in health research investments. In the process, coordinated by the World Health Organization in 2007-2008, 21 researchers with an interest in child, maternal and newborn health suggested 82 research ideas that spanned the broad spectrum of epidemiological research, health policy and systems research, improvement of existing interventions and development of new interventions. The 82 research questions were then assessed for answerability, effectiveness, deliverability, maximum potential for mortality reduction and the effect on equity using the CHNRI method. The top 10 identified research priorities were dominated by health systems and policy research questions (eg, identification of LBW infants born at home within 24-48 hours of birth for additional care; approaches to improve quality of care of LBW infants in health facilities; identification of barriers to optimal home care practices including care seeking; and approaches to increase the use of antenatal corticosteroids in preterm labor and to improve access to hospital care for LBW infants). These were followed by priorities for improvement of the existing interventions (eg, early initiation of breastfeeding, including feeding mode and techniques for those unable to suckle directly from the breast; improved cord care, such as chlorhexidine application; and alternative methods to Kangaroo Mother Care (KMC) to keep LBW infants warm in community settings). The highest-ranked epidemiological question suggested improving criteria for identifying LBW infants who need to be cared for in a hospital. 
Among the new interventions, the greatest support was shown for the development of new simple and effective interventions for providing thermal care to LBW infants if KMC is not acceptable to the mother. The context for this exercise was set within MDG4, requiring urgent and rapid progress in mortality reduction from low birth weight, rather than the identification of long-term strategic solutions with the greatest potential. In this short-term context, health policy and systems research to improve access to and coverage of the existing interventions, coupled with further research to improve the effectiveness, deliverability and acceptance of existing interventions, and epidemiological research to address the key gaps in knowledge, were all highlighted as research priorities.

  20. Setting global research priorities for developmental disabilities, including intellectual disabilities and autism

    PubMed Central

    Tomlinson, Mark; Yasamy, M. Taghi; Emerson, Eric; Officer, Alana; Richler, Diane; Saxena, Shekhar

    2015-01-01

Objectives The prevalence of intellectual disabilities (ID) has been estimated at 10.4/1000 worldwide, with higher rates among children and adolescents in lower income countries. The objective of this paper is to address research priorities for developmental disabilities, notably intellectual disabilities and autism, at the global level and to propose the more rational use of scarce funds in addressing this under-investigated area. Methods An expert group was identified and invited to systematically list and score research questions. They applied the priority setting methodology of the Child Health and Nutrition Research Initiative (CHNRI) to generate research questions and to evaluate them using a set of five criteria: answerability, feasibility, applicability and impact, support within the context and equity. Findings The results of this process clearly indicated that the important priorities for future research related to the need for effective and efficient approaches to early intervention, to the empowerment of families supporting a person with a developmental disability, and to addressing preventable causes of poor health in people with ID and autism. Conclusions For the public health and other systems to become more effective in delivering appropriate support to persons with developmental disabilities, greater (and more targeted) investment in research is required to produce evidence of what works consistent with international human rights standards. PMID:24397279

  1. Setting priorities for a research agenda to combat drug-resistant tuberculosis in children.

    PubMed

    Velayutham, B; Nair, D; Ramalingam, S; Perez-Velez, C M; Becerra, M C; Swaminathan, S

    2015-12-21

Numerous knowledge gaps hamper the prevention and treatment of childhood drug-resistant tuberculosis (TB). Identifying research priorities is vital to inform and develop strategies to address this neglected problem. To systematically identify and rank research priorities in childhood drug-resistant TB. Adapting the Child Health and Nutrition Research Initiative (CHNRI) methodology, we compiled 53 research questions in four research areas, then classified the questions into three research types. We invited experts in childhood drug-resistant TB to score these questions through an online survey. A total of 81 respondents participated in the survey. The top-ranked research question was to identify the best combination of existing diagnostic tools for early diagnosis. Highly ranked treatment-related questions centred on the reasons for, and interventions to improve, treatment outcomes, adverse effects of drugs and optimal treatment duration. The prevalence of drug-resistant TB was the highest-ranked question in the epidemiology area. The development-type questions that ranked highest focused on interventions for optimal diagnosis, treatment and modalities for treatment delivery. This is the first effort to identify and rank research priorities for childhood drug-resistant TB. The result is a resource to guide research to improve the prevention and treatment of drug-resistant TB in children.

  2. Setting global research priorities for developmental disabilities, including intellectual disabilities and autism.

    PubMed

    Tomlinson, M; Yasamy, M T; Emerson, E; Officer, A; Richler, D; Saxena, S

    2014-12-01

The prevalence of intellectual disabilities (ID) has been estimated at 10.4/1000 worldwide, with higher rates among children and adolescents in lower income countries. The objective of this paper is to address research priorities for developmental disabilities, notably ID and autism, at the global level and to propose the more rational use of scarce funds in addressing this under-investigated area. An expert group was identified and invited to systematically list and score research questions. They applied the priority setting methodology of the Child Health and Nutrition Research Initiative (CHNRI) to generate research questions and to evaluate them using a set of five criteria: answerability, feasibility, applicability and impact, support within the context and equity. The results of this process clearly indicated that the important priorities for future research related to the need for effective and efficient approaches to early intervention, to the empowerment of families supporting a person with a developmental disability, and to addressing preventable causes of poor health in people with ID and autism. For the public health and other systems to become more effective in delivering appropriate support to persons with developmental disabilities, greater (and more targeted) investment in research is required to produce evidence of what works consistent with international human rights standards. © 2014 MENCAP and International Association of the Scientific Study of Intellectual and Developmental Disabilities and John Wiley & Sons Ltd.

  3. Setting health research priorities using the CHNRI method: V. Quantitative properties of human collective knowledge.

    PubMed

    Rudan, Igor; Yoshida, Sachiyo; Wazny, Kerri; Chan, Kit Yee; Cousens, Simon

    2016-06-01

    The CHNRI method for setting health research priorities has crowdsourcing as the major component. It uses the collective opinion of a group of experts to generate, assess and prioritize between many competing health research ideas. It is difficult to compare the accuracy of human individual and collective opinions in predicting uncertain future outcomes before the outcomes are known. However, this limitation does not apply to existing knowledge, which is an important component underlying opinion. In this paper, we report several experiments to explore the quantitative properties of human collective knowledge and discuss their relevance to the CHNRI method. We conducted a series of experiments in groups of about 160 (range: 122-175) undergraduate Year 2 medical students to compare their collective knowledge to their individual knowledge. We asked them to answer 10 questions on each of the following: (i) an area in which they have a degree of expertise (undergraduate Year 1 medical curriculum); (ii) an area in which they likely have some knowledge (general knowledge); and (iii) an area in which they are not expected to have any knowledge (astronomy). We also presented them with 20 pairs of well-known celebrities and asked them to identify the older person of the pair. In all these experiments our goal was to examine how the collective answer compares to the distribution of students' individual answers. When answering the questions in their own area of expertise, the collective answer (the median) was in the top 20.83% of the most accurate individual responses; in general knowledge, it was in the top 11.93%; and in an area with no expertise, the group answer was in the top 7.02%. However, the collective answer based on mean values fared much worse, ranging from top 75.60% to top 95.91%. 
Also, when asked to guess the older of two celebrities, the collective response was correct in 18/20 cases (90%), while the 8 most successful individuals among the students had 19/20 correct answers (95%). However, under a system in which students who were unsure of the correct answer could either accept half a point for each such instance or withdraw from responding, in order to improve the collective score, the collective was correct in 19/20 cases (95%), while the 3 most successful individuals were correct in 17/20 cases (85%). Our experiments showed that the collective knowledge of a group with expertise in the subject should always be very close to the true value. In most cases and under most assumptions, the collective knowledge will be more accurate than the knowledge of an "average" individual, but there always seems to be a small group of individuals who manage to out-perform the collective. The accuracy of collective prediction may be enhanced by allowing individuals with low confidence in their answer to withdraw from answering.
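The gap between median-based and mean-based collective answers that drives these results is easy to reproduce: the median ignores a few wild guesses that drag the mean far from the truth. A toy illustration with invented numbers (not the study's data):

```python
import statistics

true_value = 100.0
# Nine individual guesses at the quantity, two of them wild over-estimates
guesses = [80, 90, 95, 100, 105, 110, 120, 250, 400]

collective_median = statistics.median(guesses)  # 105: robust to the outliers
collective_mean = statistics.mean(guesses)      # 150.0: dragged up by them
```

This mirrors the finding above that the median-based collective answer landed among the most accurate individual responses, while the mean-based answer fared much worse.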

  4. Neonatal survival in complex humanitarian emergencies: setting an evidence-based research agenda

    PubMed Central

    2014-01-01

Background Over 40% of all deaths among children under 5 are neonatal deaths (0–28 days), and this proportion is increasing. In 2012, 2.9 million newborns died, with 99% occurring in low- and middle-income countries. Many of the countries with the highest neonatal mortality rates globally are currently, or have recently been, affected by complex humanitarian emergencies. Despite the global burden of neonatal morbidity and mortality and the risks inherent in complex emergency situations, research investments are not commensurate with the burden, and little is known about the epidemiology or best practices for neonatal survival in these settings. Methods We used the Child Health and Nutrition Research Initiative (CHNRI) methodology to prioritize research questions on neonatal health in complex humanitarian emergencies. Experts evaluated 35 questions using four criteria (answerability, feasibility, relevance, equity) with three subcomponents per criterion. Using SAS 9.2, a research prioritization score (RPS) and average expert agreement score (AEA) were calculated for each question. Results Twenty-eight experts evaluated all 35 questions. The RPS ranged from 0.846 to 0.679 and the AEA ranged from 0.667 to 0.411. The top ten research priorities covered a range of issues but generally fell into two categories: epidemiologic and programmatic components of neonatal health. The highest-ranked question in this survey was "What strategies are effective in increasing demand for, and use of, skilled attendance?" Conclusions In this study, a diverse group of experts used the CHNRI methodology to systematically identify and determine research priorities for neonatal health and survival in complex humanitarian emergencies. The priorities included the need to better understand the magnitude of the disease burden and interventions to improve neonatal health in complex humanitarian emergencies. 
The findings from this study will provide guidance to researchers and program implementers in the neonatal and complex humanitarian fields on the research priorities needed to save the lives most at risk. PMID:24959198
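The two summary statistics this record reports, a research prioritization score (RPS) and an average expert agreement score (AEA), can be sketched as below. The exact CHNRI formulas may differ in detail, so treat both functions and the toy data as illustrative assumptions (answers coded 1 = yes, 0 = no, 0.5 = undecided):

```python
from collections import Counter

def research_priority_score(answers_per_criterion):
    """Mean over criteria of the average coded answer per criterion."""
    per_criterion = [sum(a) / len(a) for a in answers_per_criterion]
    return sum(per_criterion) / len(per_criterion)

def average_expert_agreement(answers_per_criterion):
    """Mean over criteria of the fraction of scorers giving the modal answer."""
    fractions = [Counter(a).most_common(1)[0][1] / len(a)
                 for a in answers_per_criterion]
    return sum(fractions) / len(fractions)

# One research question: four criteria, each scored by six experts (toy data)
answers = [
    [1, 1, 1, 0.5, 1, 0],
    [1, 1, 0.5, 0.5, 1, 1],
    [0, 0, 0.5, 0, 1, 0],
    [1, 1, 1, 1, 1, 1],
]
rps = research_priority_score(answers)   # about 0.708
aea = average_expert_agreement(answers)  # 0.75
```

A high RPS with a low AEA would flag a question that scored well overall but on which the experts disagreed, which is why the study reports both.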

  5. Setting health research priorities using the CHNRI method: V. Quantitative properties of human collective knowledge

    PubMed Central

    Rudan, Igor; Yoshida, Sachiyo; Wazny, Kerri; Chan, Kit Yee; Cousens, Simon

    2016-01-01

Introduction The CHNRI method for setting health research priorities has crowdsourcing as the major component. It uses the collective opinion of a group of experts to generate, assess and prioritize between many competing health research ideas. It is difficult to compare the accuracy of human individual and collective opinions in predicting uncertain future outcomes before the outcomes are known. However, this limitation does not apply to existing knowledge, which is an important component underlying opinion. In this paper, we report several experiments to explore the quantitative properties of human collective knowledge and discuss their relevance to the CHNRI method. Methods We conducted a series of experiments in groups of about 160 (range: 122–175) undergraduate Year 2 medical students to compare their collective knowledge to their individual knowledge. We asked them to answer 10 questions on each of the following: (i) an area in which they have a degree of expertise (undergraduate Year 1 medical curriculum); (ii) an area in which they likely have some knowledge (general knowledge); and (iii) an area in which they are not expected to have any knowledge (astronomy). We also presented them with 20 pairs of well-known celebrities and asked them to identify the older person of the pair. In all these experiments our goal was to examine how the collective answer compares to the distribution of students’ individual answers. Results When answering the questions in their own area of expertise, the collective answer (the median) was in the top 20.83% of the most accurate individual responses; in general knowledge, it was in the top 11.93%; and in an area with no expertise, the group answer was in the top 7.02%. However, the collective answer based on mean values fared much worse, ranging from top 75.60% to top 95.91%. 
Also, when asked to guess the older of two celebrities, the collective response was correct in 18/20 cases (90%), while the 8 most successful individuals among the students had 19/20 correct answers (95%). However, under a system in which students who were unsure of the correct answer could either accept half a point for each such instance or withdraw from responding, in order to improve the collective's score, the collective was correct in 19/20 cases (95%), while the 3 most successful individuals were correct in 17/20 cases (85%). Conclusions Our experiments showed that the collective knowledge of a group with expertise in the subject should always be very close to the true value. In most cases and under most assumptions, the collective knowledge will be more accurate than the knowledge of an "average" individual, but there always seems to be a small group of individuals who manage to outperform the collective. The accuracy of collective prediction may be enhanced by allowing individuals with low confidence in their answer to withdraw from answering. PMID:27350873
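The gap between median- and mean-based aggregation reported in this record can be illustrated with a small simulated experiment; the group size and the right-skewed error distribution below are assumptions for illustration, not the study's data:

```python
import random
import statistics

def collective_estimates(true_value, n_students=160, seed=42):
    """Simulate individual numeric guesses with right-skewed error:
    a handful of wildly high guesses drags the mean off target,
    while the median stays robust to them."""
    rng = random.Random(seed)
    guesses = [true_value * rng.lognormvariate(0.0, 0.5) for _ in range(n_students)]
    return statistics.median(guesses), statistics.mean(guesses)

median_est, mean_est = collective_estimates(true_value=100.0)
# The median-based collective answer lands closer to the true value
# than the mean-based one under this skewed-error assumption.
print(abs(median_est - 100.0) < abs(mean_est - 100.0))
```

Under these assumptions the median outperforms the mean, mirroring the finding that the median-based collective answer ranked far higher among individual responses than the mean-based one.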

  6. An evaluation of oxygen systems for treatment of childhood pneumonia

    PubMed Central

    2011-01-01

Background Oxygen therapy is recommended for all of the 1.5–2.7 million young children who consult health services with hypoxemic pneumonia each year, and for the many more with other serious conditions. However, oxygen supplies are intermittent throughout the developing world. Although oxygen is well established as a treatment for hypoxemic pneumonia, quantitative evidence for its effect is lacking. This review aims to assess the utility of oxygen systems as a method for reducing childhood mortality from pneumonia. Methods Aiming to improve priority-setting methods, the Child Health and Nutrition Research Initiative (CHNRI) has developed a common framework for scoring competing interventions in child health. That framework involves the assessment of 12 different criteria upon which interventions can be compared. This report follows the proposed framework, using a semi-systematic literature review and the results of a structured exercise gathering opinion from experts (leading basic scientists, international public health researchers, international policy makers and representatives of pharmaceutical companies), who scored each criterion to express their "collective optimism" toward it on a scale from 0 to 100%. Results A rough estimate from an analysis of the literature suggests that global strengthening of oxygen systems could save the lives of up to 122,000 children from pneumonia annually. Across the 12 CHNRI criteria, the experts expressed very high levels of optimism (over 80%) for answerability, low development cost and low product cost; high levels of optimism (60–80%) for low implementation cost, likelihood of efficacy, deliverability, and acceptance by end users and health workers; and moderate levels of optimism (40–60%) for impact on equity, affordability and sustainability. The median estimate of the potential effectiveness of oxygen systems in reducing overall childhood pneumonia mortality was ~20% (interquartile range: 10–35%, min. 0%, max. 50%).
However, problems with oxygen systems in terms of affordability, sustainability and impact on equity were noted both in the expert opinion scores and in the review. Conclusion Oxygen systems are likely to be an effective intervention for combating childhood mortality from pneumonia. However, a number of gaps in the evidence base exist that should be addressed to complete the investment case, and research addressing these issues merits greater funding attention. PMID:21501446
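The scoring scheme behind these figures can be sketched as a weighted average of per-criterion "collective optimism" scores; the criterion names, scores and equal default weights below are illustrative assumptions, not the study's data:

```python
def chnri_score(optimism, weights=None):
    """Combine per-criterion collective-optimism scores (fractions in
    [0, 1], i.e. 0-100%) into one priority score for an intervention."""
    if weights is None:
        weights = {criterion: 1.0 for criterion in optimism}  # equal weights
    total = sum(weights[c] for c in optimism)
    return sum(optimism[c] * weights[c] for c in optimism) / total

# Hypothetical optimism scores for a subset of the 12 criteria.
scores = {
    "answerability": 0.85,
    "likelihood_of_efficacy": 0.70,
    "deliverability": 0.65,
    "impact_on_equity": 0.50,
    "affordability": 0.45,
}
print(round(chnri_score(scores), 2))  # unweighted mean of the five scores
```

Passing explicit weights reproduces the variant of the method in which a reference group assigns relative importance to each criterion.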

  7. Research Priorities on the Relationship between Wasting and Stunting

    PubMed Central

    Khara, Tanya; Dolan, Carmel; Berkley, James A.

    2016-01-01

    Background Wasting and stunting are global public health problems that frequently co-exist. However, they are usually separated in terms of policy, guidance, programming and financing. Though both wasting and stunting are manifestations of undernutrition caused by disease and poor diet, there are critical gaps in our understanding of the physiological relationship between them, and how interventions for one may affect the other. The aim of this exercise was to establish research priorities in the relationships between wasting and stunting to guide future research investments. Methods and Findings We used the CHNRI (Child Health and Nutrition Research Initiative) methodology for setting research priorities in health. We utilised a group of experts in nutrition, growth and child health to prioritise 30 research questions against three criteria (answerability, usefulness and impact) using an online survey. Eighteen of 25 (72%) experts took part and prioritised research directly related to programming, particularly at the public health level. The highest-rated questions were: “Can interventions outside of the 1000 days, e.g. pre-school, school age and adolescence, lead to catch-up in height and in other developmental markers?”; “What timely interventions work to mitigate seasonal peaks in both wasting and stunting?”; and “What is the optimal formulation of ready-to-use foods to promote optimal ponderal growth and also support linear growth during and after recovery from severe acute malnutrition?” There was a high level of agreement between experts, particularly for the highest ranking questions. Conclusions Increased commitment to rigorous evaluations of treatment and prevention interventions at the public health level, addressing questions of the timing of intervention, and the extent to which impacts for both wasting and stunting can be achieved, is needed to inform global efforts to tackle undernutrition and its consequences. PMID:27159235

  8. Homeland Security Collaboration: Catch Phrase or Preeminent Organizational Construct?

    DTIC Science & Technology

    2009-09-01

collaborative effort? C. RESEARCH METHODOLOGY: This research project utilized a modified case study methodology. The traditional case study method ... discussing the research method, offering smart practices, and culminating with findings and recommendations. Chapter II, Homeland Security Collaboration ... Chapter III, Research Methodology: Modified Case Study Method ...

  9. Influence of Parameters of a Reactive Interatomic Potential on the Properties of Saturated Hydrocarbons

    DTIC Science & Technology

    2017-01-01

Methodology: 2.1 Modified Embedded-Atom Method Theory; 2.1.1 Embedding Energy Function; 2.1.2 Screening Factor; 2.1.3 Modified Embedded-Atom... Simulation Methodology, 2.1 Modified Embedded-Atom Method Theory: In the EAM and MEAM formalisms, the total energy of a system of atoms (Etot) is... An interatomic potential for saturated hydrocarbons using the modified embedded-atom method (MEAM), a semiempirical many-body potential based on...

  10. Modified Method of Simplest Equation Applied to the Nonlinear Schrödinger Equation

    NASA Astrophysics Data System (ADS)

    Vitanov, Nikolay K.; Dimitrova, Zlatinka I.

    2018-03-01

We consider an extension of the modified method of simplest equation to the case where two simplest equations are used. The extended methodology is applied to obtain exact solutions of a model nonlinear partial differential equation for deep-water waves: the nonlinear Schrödinger equation. It is shown that the methodology also works for other equations of the nonlinear Schrödinger kind.

  11. Approaches, tools and methods used for setting priorities in health research in the 21st century

    PubMed Central

    Yoshida, Sachiyo

    2016-01-01

Background Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. Methods To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001–2014. Results A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (<1%). About 3% of studies reported no clear process and provided very little information on how priorities were set. A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. Conclusion The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital.
With the development of new tools and methods which have a well-defined structure – such as the CHNRI method, the James Lind Alliance method and the Combined Approach Matrix – it is likely that the Delphi method and non-replicable consultation processes will gradually be replaced by these emerging tools, which offer more transparency and replicability. It is too early to say whether any single method can address the needs of most exercises conducted at different levels, or whether better results may perhaps be achieved through a combination of components of several methods. PMID:26401271

  12. Approaches, tools and methods used for setting priorities in health research in the 21(st) century.

    PubMed

    Yoshida, Sachiyo

    2016-06-01

    Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (<1%). About 3% of studies reported no clear process and provided very little information on how priorities were set. A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. 
With the development of new tools and methods which have a well-defined structure - such as the CHNRI method, James Lind Alliance Method and Combined Approach Matrix - it is likely that the Delphi method and non-replicable consultation processes will gradually be replaced by these emerging tools, which offer more transparency and replicability. It is too early to say whether any single method can address the needs of most exercises conducted at different levels, or if better results may perhaps be achieved through combination of components of several methods.

  13. A modified eco-efficiency framework and methodology for advancing the state of practice of sustainability analysis as applied to green infrastructure

    EPA Science Inventory

    We propose a modified eco-efficiency (EE) framework and novel sustainability analysis methodology for green infrastructure (GI) practices used in water resource management. Green infrastructure practices such as rainwater harvesting (RWH), rain gardens, porous pavements, and gree...

  14. Research Priorities for the Intersection of Alcohol and HIV/AIDS in Low and Middle Income Countries: A Priority Setting Exercise.

    PubMed

    Gordon, Sara; Rotheram-Borus, Mary Jane; Skeen, Sarah; Perry, Charles; Bryant, Kendall; Tomlinson, Mark

    2017-11-01

The harmful use of alcohol is a component cause of more than 200 diseases. The association between alcohol consumption, risk-taking behavior and a range of infectious diseases such as HIV/AIDS is well established. The prevalence of both HIV/AIDS and harmful alcohol use in low and middle income countries is high. Alcohol has been identified as a modifiable risk factor in the prevention and treatment of HIV/AIDS. The objective of this paper is to define research priorities for the interaction of alcohol and HIV/AIDS in low and middle income countries. The Child Health and Nutrition Research Initiative (CHNRI) priority setting methodology was applied to assess research priorities on the interaction of alcohol and HIV/AIDS. A group of 171 global and local experts in the field of alcohol- and/or HIV/AIDS-related research were identified and invited to generate research questions. This resulted in 205 research questions, which were categorized and refined by senior researchers into 48 research questions to be evaluated against five criteria: answerability, effectiveness, feasibility, applicability and impact, as well as equity. A total of 59 experts participated independently in the voluntary scoring exercise (a 34% response rate). There was substantial consensus among experts on priorities for research on alcohol and HIV. These tended to fall into two categories: those focusing on better understanding the nexus between alcohol and HIV, and those directed toward informing practical interventions to reduce the impact of alcohol use on HIV treatment outcomes, replicating what Bryant (Subst Use Misuse 41:1465-1507, 2006) and Parry et al. (Addiction 108:1-2, 2012) found. Responses from experts were stratified by location in order to determine any differences between groups. On average, experts in LMICs gave higher scores than experts in HICs.
Recent research has shown the causal link between alcohol consumption and the incidence of HIV/AIDS, including a better understanding of the pathways through which alcohol use affects ARV adherence (and adherence to other medications used to treat opportunistic infections) and CD4 counts. The results of this process clearly indicated that the important priorities for future research relate to the development and assessment of interventions addressing alcohol and HIV/AIDS, addressing and exploring the impact of HIV risk and comorbid alcohol use, and exploring the risk and protective factors in the field of alcohol and HIV/AIDS. The findings from this priority setting exercise could guide the international research agenda and make research funding more effective in addressing research on the intersection of alcohol and HIV/AIDS.

  15. Design of feedback control systems for stable plants with saturating actuators

    NASA Technical Reports Server (NTRS)

    Kapasouris, Petros; Athans, Michael; Stein, Gunter

    1988-01-01

    A systematic control design methodology is introduced for multi-input/multi-output stable open loop plants with multiple saturations. This new methodology is a substantial improvement over previous heuristic single-input/single-output approaches. The idea is to introduce a supervisor loop so that when the references and/or disturbances are sufficiently small, the control system operates linearly as designed. For signals large enough to cause saturations, the control law is modified in such a way as to ensure stability and to preserve, to the extent possible, the behavior of the linear control design. Key benefits of the methodology are: the modified compensator never produces saturating control signals, integrators and/or slow dynamics in the compensator never windup, the directional properties of the controls are maintained, and the closed loop system has certain guaranteed stability properties. The advantages of the new design methodology are illustrated in the simulation of an academic example and the simulation of the multivariable longitudinal control of a modified model of the F-8 aircraft.

  16. Design for performance enhancement in feedback control systems with multiple saturating nonlinearities. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Kapasouris, Petros

    1988-01-01

A systematic control design methodology is introduced for multi-input/multi-output systems with multiple saturations. The methodology can be applied to stable and unstable open loop plants with magnitude and/or rate control saturations, and to systems in which state limitations are desired. This new methodology is a substantial improvement over previous heuristic single-input/single-output approaches. The idea is to introduce a supervisor loop so that when the references and/or disturbances are sufficiently small, the control system operates linearly as designed. For signals large enough to cause saturations, the control law is modified in such a way as to ensure stability and to preserve, to the extent possible, the behavior of the linear control design. Key benefits of this methodology are: the modified compensator never produces saturating control signals, integrators and/or slow dynamics in the compensator never wind up, the directional properties of the controls are maintained, and the closed loop system has certain guaranteed stability properties. The advantages of the new design methodology are illustrated by numerous simulations, including the multivariable longitudinal control of modified models of the F-8 (stable) and F-16 (unstable) aircraft.

  17. A More Flexible Approach to Valuing Flexibility

    DTIC Science & Technology

    2011-04-01

remaining life of the program? Almost certainly. Next is the cost assessment step. This is executed in the context of whatever design options we ... methodology is essentially a modification of the current life cycle model and is premised on the notion that the need for capability changes in a program ... valuing the inherent ability of a system or design to accommodate change. The proposed methodology is essentially a modification of the current life ...

  18. Fuzzy Linear Programming and its Application in Home Textile Firm

    NASA Astrophysics Data System (ADS)

    Vasant, P.; Ganesan, T.; Elamvazuthi, I.

    2011-06-01

In this paper, a new fuzzy linear programming (FLP) based methodology using a specific membership function, the modified logistic membership function, is proposed. The modified logistic membership function is first formulated, and its flexibility in accommodating vagueness in parameters is established by an analytical approach. The developed FLP methodology provides confidence in its application to a real-life industrial production planning problem. This approach to solving industrial production planning problems allows feedback among the decision maker, the implementer and the analyst.
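An S-shaped logistic membership function of the kind used in such FLP formulations maps a fuzzy constraint's value to a degree of satisfaction between 1 (fully acceptable) and 0 (unacceptable). The parameterization below, including the steepness value gamma=13.8, is a generic sketch and an assumption, not necessarily the exact modified function from the paper:

```python
import math

def logistic_membership(x, lower, upper, gamma=13.8):
    """S-shaped logistic membership for a fuzzy 'at most' constraint:
    1.0 at or below the lower bound (comfortably satisfied), 0.0 at or
    above the upper bound; gamma sets the steepness (i.e. how sharply
    vagueness resolves) across the tolerance interval in between."""
    if x <= lower:
        return 1.0
    if x >= upper:
        return 0.0
    t = (x - lower) / (upper - lower)  # normalize position to (0, 1)
    return 1.0 / (1.0 + math.exp(gamma * (t - 0.5)))

# The midpoint of the tolerance interval gets membership exactly 0.5.
print(logistic_membership(150.0, lower=100.0, upper=200.0))  # 0.5
```

In a fuzzy LP, such membership values let the solver trade off constraint satisfaction smoothly instead of treating each bound as hard.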

  19. MS-based analytical methodologies to characterize genetically modified crops.

    PubMed

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need of new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMOs analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview on genetically modified crops development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, and include "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.

  20. In situ characterization of organo-modified and unmodified montmorillonite aqueous suspensions by UV-visible spectroscopy.

    PubMed

    Alin, Jonas; Rubino, Maria; Auras, Rafael

    2015-10-15

UV-visible (UV-Vis) spectroscopy (Tyndall spectra) was applied and tested for its ability to characterize organo-modified and unmodified montmorillonite (MMT) clays in aqueous suspensions. A full factorial design of experiments was used to study the influence of pH, NaCl concentration and clay concentration on the average particle size of the clay agglomerates. The methodology was evaluated by checking its results against previous research on the unmodified clay's behavior in aqueous suspensions; the results corresponded to accepted theories, indicating that the methodology is precise enough to distinguish the effects of the studied factors on these clay suspensions. The effect of clay concentration was related to the amount of ions per clay particle for the unmodified clay, but was not significant for the organo-modified MMT. The average particle size of the organo-modified MMT in suspension was significantly larger than that of the unmodified clay. The size of the organo-modified MMT agglomerates in suspension decreased in the presence of NaCl and at both high and low pH; this behavior was opposite to that of the unmodified clay. These results demonstrate that the UV-Vis methodology is well suited for characterizing clay particle size in aqueous suspensions. The technique is also simple, rapid and low-cost. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.

    ERIC Educational Resources Information Center

    Bose, Anindya

    The dissertation attempts to demonstrate that the program evaluation and review technique (PERT)/Critical Path Method (CPM) or some modified version thereof can be developed into an information system design methodology. The methodology utilizes PERT/CPM which isolates the basic functional units of a system and sets them in a dynamic time/cost…

  2. Methodological Adaptations for Investigating the Perceptions of Language-Impaired Adolescents Regarding the Relative Importance of Selected Communication Skills

    ERIC Educational Resources Information Center

    Reed, Vicki A.; Brammall, Helen

    2006-01-01

    This article describes the systematic and detailed processes undertaken to modify a research methodology for use with language-impaired adolescents. The original methodology had been used previously with normally achieving adolescents and speech pathologists to obtain their opinions about the relative importance of selected communication skills…

  3. Can we predict the outcome for people with patellofemoral pain? A systematic review on prognostic factors and treatment effect modifiers.

    PubMed

    Matthews, M; Rathleff, M S; Claus, A; McPoil, T; Nee, R; Crossley, K; Vicenzino, B

    2017-12-01

Patellofemoral pain (PFP) is a multifactorial and often persistent knee condition. One strategy to enhance patient outcomes is to use clinically assessable patient characteristics to predict outcome and match a specific treatment to an individual. A systematic review was conducted to determine which baseline patient characteristics were (1) associated with patient outcome (prognosis); or (2) modified patient outcome from a specific treatment (treatment effect modifiers). Six electronic databases were searched (July 2016) for studies evaluating the association between the characteristics of those with PFP and outcome. All studies were appraised using the Epidemiological Appraisal Instrument. Studies that aimed to identify treatment effect modifiers were additionally assessed with a methodological quality checklist. The 24 included studies evaluated 180 participant characteristics; 12 studies investigated prognosis, and 12 investigated potential treatment effect modifiers. Important methodological limitations were identified: some prognostic studies used a retrospective design, and studies aiming to identify treatment effect modifiers often analysed too many variables for the limited sample size and typically failed to use a control or comparator treatment group. Sixteen factors were reported to be associated with a poor outcome, with longer duration of symptoms (>4 months) the most frequently reported. Preliminary evidence suggests increased midfoot mobility may predict those who have a successful outcome with foot orthoses. Current evidence can identify those at increased risk of a poor outcome, but methodological limitations make it difficult to predict outcome after one specific treatment compared with another. Adequately designed randomised trials are needed to identify treatment effect modifiers.

  4. An evaluation of total starch and starch gelatinization methodologies in pelleted animal feed.

    PubMed

    Zhu, L; Jones, C; Guo, Q; Lewis, L; Stark, C R; Alavi, S

    2016-04-01

    The quantification of total starch content (TS) or degree of starch gelatinization (DG) in animal feed is always challenging because of the potential interference from other ingredients. In this study, the differences in TS or DG measurement in pelleted swine feed due to variations in analytical methodology were quantified. Pelleted swine feed was used to create 6 different diets manufactured with various processing conditions in a 2 × 3 factorial design (2 conditioning temperatures, 77 or 88°C, and 3 conditioning retention times, 15, 30, or 60 s). Samples at each processing stage (cold mash, hot mash, hot pelletized feed, and final cooled pelletized feed) were collected for each of the 6 treatments and analyzed for TS and DG. Two different methodologies were evaluated for TS determination (the AOAC International method 996.11 vs. the modified glucoamylase method) and DG determination (the modified glucoamylase method vs. differential scanning calorimetry [DSC]). For TS determination, the AOAC International method 996.11 measured lower TS values in cold pellets compared with the modified glucoamylase method. The AOAC International method resulted in lower TS in cold mash than cooled pelletized feed, whereas the modified glucoamylase method showed no significant differences in TS content before or after pelleting. For DG, the modified glucoamylase method demonstrated increased DG with each processing step. Furthermore, increasing the conditioning temperature and time resulted in a greater DG when evaluated by the modified glucoamylase method. However, results demonstrated that DSC is not suitable as a quantitative tool for determining DG in multicomponent animal feeds due to interferences from nonstarch transformations, such as protein denaturation.

  5. Tularosa Basin Play Fairway Analysis: Methodology Flow Charts

    DOE Data Explorer

    Adam Brandt

    2015-11-15

These images show the comprehensive methodology used to create a Play Fairway Analysis (PFA) exploring the geothermal resource potential of the Tularosa Basin, New Mexico. The deterministic methodology originated in the petroleum industry but was custom-modified to function as a knowledge-based geothermal exploration tool. The stochastic PFA flow chart uses weights of evidence and is data-driven.
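Weights of evidence, the Bayesian log-odds technique named in the stochastic branch, combines binary evidence layers into a posterior probability of resource occurrence. A minimal generic sketch follows; the prior probability, layer names and weight values are invented purely for illustration:

```python
import math

def weights_of_evidence(prior_prob, layers):
    """Update the prior log-odds of occurrence with one weight per
    evidence layer: W+ is added where the evidence is present,
    W- where it is absent; the result is mapped back to a probability."""
    logit = math.log(prior_prob / (1.0 - prior_prob))
    for present, w_plus, w_minus in layers:
        logit += w_plus if present else w_minus
    odds = math.exp(logit)
    return odds / (1.0 + odds)

# Hypothetical evidence layers: (evidence present?, W+, W-).
layers = [
    (True, 1.2, -0.4),   # e.g. proximity to a mapped fault
    (True, 0.8, -0.2),   # e.g. elevated heat-flow anomaly
    (False, 0.5, -0.6),  # e.g. hot-spring occurrence (absent here)
]
posterior = weights_of_evidence(prior_prob=0.05, layers=layers)
print(round(posterior, 3))
```

In practice the W+ and W- weights are estimated from training data as log-ratios of how often each evidence pattern coincides with known occurrences, which is what makes the approach data-driven.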

  6. An Enzyme-Mediated Methodology for the Site-Specific Radiolabeling of Antibodies Based on Catalyst-Free Click Chemistry

    PubMed Central

    Zeglis, Brian M.; Davis, Charles B.; Aggeler, Robert; Kang, Hee Chol; Chen, Aimei; Agnew, Brian J.; Lewis, Jason S.

    2013-01-01

An enzyme- and click chemistry-mediated methodology for the site-selective radiolabeling of antibodies on the heavy chain glycans has been developed and validated. To this end, a model system based on the prostate specific membrane antigen-targeting antibody J591, the positron-emitting radiometal 89Zr, and the chelator desferrioxamine has been employed. The methodology consists of four steps: (1) the removal of sugars on the heavy chain region of the antibody to expose terminal N-acetylglucosamine residues; (2) the incorporation of azide-modified N-acetylgalactosamine monosaccharides into the glycans of the antibody; (3) the catalyst-free click conjugation of desferrioxamine-modified dibenzocyclooctynes to the azide-bearing sugars; and (4) the radiolabeling of the chelator-modified antibody with 89Zr. The site-selective labeling methodology has proven facile, reproducible, and robust, producing 89Zr-labeled radioimmunoconjugates that display high stability and immunoreactivity in vitro (>95%) in addition to high selective tumor uptake (67.5 ± 5.0 %ID/g) and tumor-to-background contrast in athymic nude mice bearing PSMA-expressing subcutaneous LNCaP xenografts. Ultimately, this strategy could play a critical role in the development of novel well-defined and highly immunoreactive radioimmunoconjugates for both the laboratory and clinic. PMID:23688208

  7. Improving health aid for a better planet: The planning, monitoring and evaluation tool (PLANET)

    PubMed Central

    Sridhar, Devi; Car, Josip; Chopra, Mickey; Campbell, Harry; Woods, Ngaire; Rudan, Igor

    2015-01-01

Background International development assistance for health (DAH) quadrupled between 1990 and 2012, from US$ 5.6 billion to US$ 28.1 billion. This generates an increasing need for transparent and replicable tools that could be used to set investment priorities, monitor the distribution of funding in real time, and evaluate the impact of those investments. Methods In this paper we present a methodology that addresses these three challenges. We call this approach PLANET, which stands for planning, monitoring and evaluation tool. Fundamentally, PLANET is based on a crowdsourcing approach to obtaining information relevant to the deployment of large-scale programs. Information is contributed in real time by a diverse group of participants involved in the program delivery. Findings PLANET relies on real-time information from three levels of participants in large-scale programs: funders, managers and recipients. At each level, information is solicited to assess five key risks that are most relevant to that level of operations. The risks at the level of funders involve systematic neglect of certain areas, focus on donors' interests over those of program recipients, ineffective co-ordination between donors, questionable mechanisms of delivery and excessive loss of funding to "middle men". At the level of managers, the risks are corruption, lack of capacity and/or competence, lack of information and/or communication, undue avoidance of governmental structures / preference for non-governmental organizations, and exclusion of local expertise. At the level of primary recipients, the risks are corruption, parallel operations / "verticalization", misalignment with local priorities and lack of community involvement, issues with ethics, equity and/or acceptability, and low likelihood of sustainability beyond the end of the program's implementation.
Interpretation PLANET is intended as an additional tool available to policy–makers to prioritize, monitor and evaluate large–scale development programs. In this, it should complement tools such as LiST (for health care/interventions), EQUIST (for health care/interventions) and CHNRI (for health research), which also rely on information from local experts and on local context to set priorities in a transparent, user–friendly, replicable, quantifiable and specific, algorithmic–like manner. PMID:26322228

  8. Improving health aid for a better planet: The planning, monitoring and evaluation tool (PLANET).

    PubMed

    Sridhar, Devi; Car, Josip; Chopra, Mickey; Campbell, Harry; Woods, Ngaire; Rudan, Igor

    2015-12-01

International development assistance for health (DAH) quadrupled between 1990 and 2012, from US$ 5.6 billion to US$ 28.1 billion. This generates an increasing need for transparent and replicable tools that could be used to set investment priorities, monitor the distribution of funding in real time, and evaluate the impact of those investments. In this paper we present a methodology that addresses these three challenges. We call this approach PLANET, which stands for planning, monitoring and evaluation tool. Fundamentally, PLANET is based on a crowdsourcing approach to obtaining information relevant to the deployment of large-scale programs. Information is contributed in real time by a diverse group of participants involved in the program delivery. PLANET relies on real-time information from three levels of participants in large-scale programs: funders, managers and recipients. At each level, information is solicited to assess the five key risks that are most relevant to that level of operations. The risks at the level of funders involve systematic neglect of certain areas, focus on donors' interests over those of program recipients, ineffective co-ordination between donors, questionable mechanisms of delivery and excessive loss of funding to "middle men". At the level of managers, the risks are corruption, lack of capacity and/or competence, lack of information and/or communication, undue avoidance of governmental structures / preference for non-governmental organizations and exclusion of local expertise. At the level of primary recipients, the risks are corruption, parallel operations / "verticalization", misalignment with local priorities and lack of community involvement, issues with ethics, equity and/or acceptability, and low likelihood of sustainability beyond the end of the program's implementation. PLANET is intended as an additional tool available to policy-makers to prioritize, monitor and evaluate large-scale development programs. 
In this, it should complement tools such as LiST (for health care/interventions), EQUIST (for health care/interventions) and CHNRI (for health research), which also rely on information from local experts and on local context to set priorities in a transparent, user-friendly, replicable, quantifiable and specific, algorithmic-like manner.
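
The risk-assessment core of PLANET lends itself to a simple sketch: participants at each level rate the five level-specific risks, and per-risk means flag weak points of a program. A minimal illustration in Python, with invented levels, risk labels, and scores; the paper does not specify an aggregation formula, so the mean here is an assumption:

```python
from statistics import mean

# Hypothetical PLANET-style aggregation: participants at each level
# (funders, managers, recipients) rate level-specific risks 0-100;
# the per-risk mean flags the weakest points of a program.
def aggregate_risk_scores(ratings):
    """ratings: {level: {risk: [individual scores]}} -> {level: {risk: mean}}"""
    return {level: {risk: mean(scores) for risk, scores in risks.items()}
            for level, risks in ratings.items()}

ratings = {
    "funders": {"neglect of areas": [40, 55, 60], "donor-driven focus": [70, 80, 75]},
    "managers": {"corruption": [20, 30, 25], "lack of capacity": [50, 45, 55]},
}
summary = aggregate_risk_scores(ratings)
# highest-scoring (worst) risk per level
worst = {lvl: max(risks, key=risks.get) for lvl, risks in summary.items()}
```

In a real deployment each score would arrive in real time from a program participant rather than from a static dictionary.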

  9. Modified teaching approach for an enhanced medical physics graduate education experience

    PubMed Central

    Rutel, IB

    2011-01-01

    Lecture-based teaching promotes a passive interaction with students. Opportunities to modify this format are available to enhance the overall learning experience for both students and instructors. The description for a discussion-based learning format is presented as it applies to a graduate curriculum with technical (formal mathematical derivation) topics. The presented hybrid method involves several techniques, including problem-based learning, modeling, and online lectures, eliminating didactic lectures. The results from an end-of-course evaluation show that the students appear to prefer the modified format over the more traditional methodology of “lecture only” contact time. These results are motivation for further refinement and continued implementation of the described methodology in the current course and potentially other courses within the department graduate curriculum. PMID:22279505

  10. Fracture toughness of irradiated modified 9Cr-1Mo steel

    NASA Astrophysics Data System (ADS)

    Kim, Sung Ho; Yoon, Ji-Hyun; Ryu, Woo Seog; Lee, Chan Bock; Hong, Jun Hwa

    2009-04-01

The effects of irradiation on the fracture toughness of modified 9Cr-1Mo steel in the transition region were investigated. Half-size precracked Charpy specimens were irradiated up to 1.2 × 10²¹ n/cm² (E > 0.1 MeV) at 340 °C and 400 °C in the Korean research reactor. The irradiation-induced transition temperature shift for modified 9Cr-1Mo was evaluated by using the Master Curve methodology. The T0 temperature for the unirradiated specimens was measured as -67.7 °C and -72.4 °C from tests with standard PCVN (precracked Charpy V-notch) and half-size PCVN specimens, respectively. The T0 shifts of specimens after irradiation at 340 °C and 400 °C were 70.7 °C and 66.1 °C, respectively. The Weibull slopes for the fracture toughness data obtained from the unirradiated and irradiated modified 9Cr-1Mo steels were determined to confirm the applicability of the Master Curve methodology to modified 9Cr-1Mo steel.
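
The Master Curve methodology positions a universal median toughness curve by the reference temperature T0, so an irradiation-induced shift in T0 translates directly into a toughness loss at a given temperature. A sketch using the standard ASTM E1921 median-curve form together with the T0 and shift reported in this abstract (the curve form is the general standard, not reproduced from this paper):

```python
import math

# Master Curve median fracture toughness (ASTM E1921 form, 1T size):
#   K_Jc(med) = 30 + 70 * exp(0.019 * (T - T0))   [MPa*sqrt(m), T in deg C]
def kjc_median(T, T0):
    return 30.0 + 70.0 * math.exp(0.019 * (T - T0))

T0_unirr = -67.7            # deg C, standard PCVN specimens (from the abstract)
T0_irr = T0_unirr + 70.7    # after irradiation at 340 deg C (reported shift)

# Median toughness at 0 deg C before and after irradiation
k_before = kjc_median(0.0, T0_unirr)
k_after = kjc_median(0.0, T0_irr)
```

The shift leaves the curve's shape unchanged and slides it along the temperature axis, which is why a single T0 per condition summarizes the whole transition region.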

  11. "MARK I" MEASUREMENT METHODOLOGY FOR POLLUTION PREVENTION PROGRESS OCCURRING AS A RESULT OF PRODUCT DECISIONS

    EPA Science Inventory

    A methodology for assessing progress in pollution prevention resulting from product redesign, reformulation or replacement is described. The method compares the pollution generated by the original product with that from the modified or replacement product, taking into account, if...

  12. Development of test methodology for dynamic mechanical analysis instrumentation

    NASA Technical Reports Server (NTRS)

    Allen, V. R.

    1982-01-01

Dynamic mechanical analysis instrumentation was used to develop specific test methodology for determining engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperatures in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for the fiberglass-supported resin. The attempted correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system was unsuccessful because the low-temperature dispersion mode (-80 °C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.

  13. Force on Force Modeling with Formal Task Structures and Dynamic Geometry

    DTIC Science & Technology

    2017-03-24

    task framework, derived using the MMF methodology to structure a complex mission. It further demonstrated the integration of effects from a range of...application methodology was intended to support a combined developmental testing (DT) and operational testing (OT) strategy for selected systems under test... methodology to develop new or modify existing Models and Simulations (M&S) to: • Apply data from multiple, distributed sources (including test

  14. Detection of Genetically Modified Sugarcane by Using Terahertz Spectroscopy and Chemometrics

    NASA Astrophysics Data System (ADS)

    Liu, J.; Xie, H.; Zha, B.; Ding, W.; Luo, J.; Hu, C.

    2018-03-01

A methodology is proposed to distinguish genetically modified sugarcane from non-genetically modified sugarcane by using terahertz spectroscopy and chemometrics techniques, including linear discriminant analysis (LDA), support vector machine-discriminant analysis (SVM-DA), and partial least squares-discriminant analysis (PLS-DA). The classification rates of the above-mentioned methods are compared, and different types of preprocessing are considered. According to the experimental results, the best option is PLS-DA, with an identification rate of 98%. The results indicate that THz spectroscopy and chemometrics techniques are a powerful tool for identifying genetically modified and non-genetically modified sugarcane.
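
Of the three chemometric classifiers compared in this record, LDA is the simplest to show compactly. A minimal two-class Fisher LDA in NumPy on synthetic data standing in for the terahertz spectra (dimensions, class means, and noise level are all invented):

```python
import numpy as np

# Two-class Fisher LDA: project onto w = Sw^-1 (m1 - m0) and threshold
# at the midpoint of the projected class means. Synthetic "spectra"
# stand in for the THz data, which we do not have.
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 0.3, size=(50, 5))   # class 0 (say, non-GM)
X1 = rng.normal(1.0, 0.3, size=(50, 5))   # class 1 (say, GM)

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)  # within-class scatter
w = np.linalg.solve(Sw, m1 - m0)                          # discriminant direction
c = 0.5 * (m0 + m1) @ w                                   # decision threshold

# Classification rate on the training data
accuracy = (np.sum(X0 @ w <= c) + np.sum(X1 @ w > c)) / 100.0
```

PLS-DA, the best performer in the study, replaces the explicit scatter-matrix inversion with latent-variable regression, which matters when spectra have far more wavelengths than samples.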

  15. Stochastic HKMDHE: A multi-objective contrast enhancement algorithm

    NASA Astrophysics Data System (ADS)

    Pratiher, Sawon; Mukhopadhyay, Sabyasachi; Maity, Srideep; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.

    2018-02-01

    This contribution proposes a novel extension of the existing `Hyper Kurtosis based Modified Duo-Histogram Equalization' (HKMDHE) algorithm, for multi-objective contrast enhancement of biomedical images. A novel modified objective function has been formulated by joint optimization of the individual histogram equalization objectives. The optimal adequacy of the proposed methodology with respect to image quality metrics such as brightness preserving abilities, peak signal-to-noise ratio (PSNR), Structural Similarity Index (SSIM) and universal image quality metric has been experimentally validated. The performance analysis of the proposed Stochastic HKMDHE with existing histogram equalization methodologies like Global Histogram Equalization (GHE) and Contrast Limited Adaptive Histogram Equalization (CLAHE) has been given for comparative evaluation.
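
Global Histogram Equalization (GHE), one of the baseline methods this record compares against, can be sketched in a few lines: gray levels are remapped through the normalized cumulative histogram. A minimal NumPy version on a synthetic low-contrast image:

```python
import numpy as np

# Global Histogram Equalization: remap gray levels through the
# normalized cumulative histogram so the output spans the full range.
def equalize(img):
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]              # cdf at the lowest occurring level
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255),
                  0, 255).astype(np.uint8)
    return lut[img]

# A dark, low-contrast synthetic image: values confined to 40..80
img = np.clip(np.random.default_rng(1).normal(60, 8, (64, 64)), 40, 80).astype(np.uint8)
out = equalize(img)                        # stretched to the full 0..255 range
```

Methods such as HKMDHE and CLAHE exist precisely because this global remapping can over-amplify noise and wash out local structure.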

  16. A Goal Seeking Strategy for Constructing Systems from Alternative Components

    NASA Technical Reports Server (NTRS)

    Valentine, Mark E.

    1999-01-01

This paper describes a methodology to efficiently construct feasible systems and then modify them to meet successive goals by selecting from alternative components, a problem recognized to be NP-complete. The methodology provides a means to catalog and model alternative components. The presented system modeling structure is robust enough to model a wide variety of systems and provides a means to compare and evaluate alternative systems. These models act as input to a methodology for selecting alternative components to construct feasible systems and modify them to meet design goals and objectives. The presented algorithm's ability to find a restricted solution, as defined by a unique set of requirements, is demonstrated against an exhaustive search of a sample of proposed shuttle modifications. The utility of the algorithm is demonstrated by comparing its results with those of three NASA shuttle evolution studies using their value systems and assumptions.
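
The exhaustive search used as the benchmark in this record can be sketched directly: enumerate one alternative per subsystem and keep the cheapest combination that meets a performance goal. Component names, scores, and costs below are invented; real catalogs make this enumeration explode combinatorially, which is what motivates a goal-seeking heuristic:

```python
from itertools import product

# Invented catalog: each subsystem offers alternatives as (id, perf, cost)
catalog = {
    "engine":   [("E1", 40, 900), ("E2", 60, 1400)],
    "tank":     [("T1", 25, 300), ("T2", 35, 500)],
    "avionics": [("A1", 15, 200), ("A2", 30, 450)],
}

def best_feasible(catalog, perf_goal):
    """Brute force: cheapest selection of one component per subsystem
    whose total performance meets the goal."""
    best = None
    for combo in product(*catalog.values()):
        perf = sum(c[1] for c in combo)
        cost = sum(c[2] for c in combo)
        if perf >= perf_goal and (best is None or cost < best[1]):
            best = (tuple(c[0] for c in combo), cost)
    return best

ids, cost = best_feasible(catalog, perf_goal=100)
```

The search space grows as the product of the alternative counts, so with dozens of subsystems only a heuristic like the paper's remains practical.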

  17. Establishing survey validity and reliability for American Indians through "think aloud" and test-retest methods.

    PubMed

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D

    2015-06-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population. © The Author(s) 2015.
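
Test-retest reliability of the kind reported in this record is commonly quantified with an intraclass correlation coefficient. A sketch of ICC(2,1) (two-way random effects, absolute agreement, single measures, the standard Shrout-Fleiss form) on simulated test-retest data; the data are invented, and this study does not state which reliability statistic it used:

```python
import numpy as np

# ICC(2,1): two-way random effects, absolute agreement, single measures.
def icc_2_1(scores):
    """scores: (n_subjects, k_sessions) array."""
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)
    col_means = scores.mean(axis=0)
    ss_rows = k * np.sum((row_means - grand) ** 2)   # between-subject
    ss_cols = n * np.sum((col_means - grand) ** 2)   # between-session
    ss_err = np.sum((scores - grand) ** 2) - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

rng = np.random.default_rng(2)
trait = rng.normal(50, 10, size=40)                       # stable underlying trait
scores = np.column_stack([trait + rng.normal(0, 2, 40),   # session 1
                          trait + rng.normal(0, 2, 40)])  # session 2
icc = icc_2_1(scores)                                     # high: small retest noise
```

With large subject variance and small session noise the ICC approaches 1; values above roughly 0.75 are conventionally read as good reliability.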

  18. NMR-Metabolic Methodology in the Study of GM Foods

    USDA-ARS?s Scientific Manuscript database

    The 1H NMR methodology used in the study of genetically modified (GM) foodstuff is discussed. The study of transgenic lettuce (Lactuca sativa cv "Luxor") over-expressing the KNAT1 gene from Arabidopsis is presented as a novel study-case. The 1H NMR metabolic profiling was carried out. Twenty-two wat...

  19. Technological Leverage in Higher Education: An Evolving Pedagogy

    ERIC Educational Resources Information Center

    Pillai, K. Rajasekharan; Prakash, Ashish Viswanath

    2017-01-01

    Purpose: The purpose of the study is to analyse the perception of students toward a computer-based exam on a custom-made digital device and their willingness to adopt the same for high-stake summative assessment. Design/methodology/approach: This study followed an analytical methodology using survey design. A modified version of students'…

  20. A modified approach to 2-(N-aryl)-1,3-oxazoles: application to the synthesis of the IMPDH inhibitor BMS-337197 and analogues.

    PubMed

    Dhar, T G Murali; Guo, Junqing; Shen, Zhongqi; Pitts, William J; Gu, Henry H; Chen, Bang-Chi; Zhao, Rulin; Bednarz, Mark S; Iwanowicz, Edwin J

    2002-06-13

    [structure: see text] A modified approach to the synthesis of 2-(N-aryl)-1,3-oxazoles, employing an optimized iminophosphorane/heterocumulene-mediated methodology, and its application to the synthesis of BMS-337197, a potent inhibitor of IMPDH, are described.

  1. A Modified Dialogic Reading Intervention for Preschool Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Fleury, Veronica P.; Schwartz, Ilene S.

    2017-01-01

    We examined the effect of a modified dialogic reading intervention on levels of verbal participation and vocabulary growth in nine preschool children with autism spectrum disorder (ASD) using single-case design methodology. Baseline book reading resulted in consistently low levels of verbal participation followed by an immediate increase in verbal…

  2. A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment

    ERIC Educational Resources Information Center

    Finch, Holmes; Monahan, Patrick

    2008-01-01

    This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…

  3. Modified Video Course Methodology for Distance Learning.

    ERIC Educational Resources Information Center

    Springer, Stephen B.

    In recent years, colleges have made extensive efforts to provide distance learning opportunities for adult students. At Southwest Texas State University, a required course in the Occupational Education program has been delivered in a modified video format. The video was made of an actual class being taught in a production studio. The main…

  4. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles.

    PubMed

    Eom, Hwisoo; Lee, Sang Hun

    2015-06-12

A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both models are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model.

  5. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles

    PubMed Central

    Eom, Hwisoo; Lee, Sang Hun

    2015-01-01

A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both models are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model. PMID:26076406

  6. Cystic fibrosis modifier genes.

    PubMed Central

    Davies, Jane; Alton, Eric; Griesenbach, Uta

    2005-01-01

    Since the recognition that CFTR genotype was not a good predictor of pulmonary disease severity in CF, several candidate modifier genes have been identified. It is unlikely that a single modifier gene will be found, but more probable that several haplotypes in combination may contribute, which in itself presents a major methodological challenge. The aims of such studies are to increase our understanding of disease pathogenesis, to aid prognosis and ultimately to lead to the development of novel treatments. PMID:16025767

  7. Optimization of monomethoxy polyethyleneglycol-modified oxalate decarboxylase by response surface methodology.

    PubMed

    Long, Han; Cai, XingHua; Yang, Hui; He, JunBin; Wu, Jia; Lin, RiHui

    2017-09-01

In order to improve the stability of oxalate decarboxylase (Oxdc), response surface methodology (RSM), based on a four-factor, three-level Box-Behnken central composite design, was used to optimize the reaction conditions for modifying Oxdc with monomethoxy polyethyleneglycol (mPEG5000). Four independent variables were investigated: the ratio of mPEG-aldehyde to Oxdc, reaction time, temperature, and reaction pH. The structure of the modified Oxdc was identified by sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) and Fourier transform infrared (FTIR) spectroscopy, and the stability of the modified Oxdc was also investigated. The optimal conditions were as follows: a mole ratio of mPEG-aldehyde to Oxdc of 1:47.6, a time of 13.1 h, a temperature of 29.9 °C, and a reaction pH of 5.3. Under optimal conditions, the experimental modification rate (MR = 73.69%) and recovery rate (RR = 67.58%) matched well with the predicted values (MR = 75.11% and RR = 69.17%). SDS-PAGE and FTIR analysis showed that mPEG was covalently bound to the Oxdc. Compared with native Oxdc, the modified Oxdc (mPEG-Oxdc) showed higher thermal stability and better tolerance to trypsin and to different pH treatments. This work provides a theoretical reference for further enzyme modification and condition optimization.
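
Response surface methodology of the kind used in this record fits a second-order polynomial to designed-experiment data and reads the optimum off the stationary point of the fitted surface. A two-factor sketch in NumPy with an invented, noise-free response (the actual study used four factors and a Box-Behnken design):

```python
import numpy as np

# Second-order response surface in coded units:
#   y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
def fit_quadratic(X, y):
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def stationary_point(beta):
    """Solve grad(y) = 0 for the fitted surface."""
    b0, b1, b2, b11, b22, b12 = beta
    B = np.array([[2 * b11, b12], [b12, 2 * b22]])
    return np.linalg.solve(B, [-b1, -b2])

# Toy response with a known maximum at (0.5, -0.25) in coded units
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(30, 2))
y = 70 - 4 * (X[:, 0] - 0.5) ** 2 - 6 * (X[:, 1] + 0.25) ** 2
beta = fit_quadratic(X, y)
opt = stationary_point(beta)    # recovers (0.5, -0.25)
```

In a real RSM study the coded optimum would then be decoded back into physical units (ratio, time, temperature, pH) to give conditions like those reported above.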

  8. Turkish version of the modified Constant-Murley score and standardized test protocol: reliability and validity.

    PubMed

    Çelik, Derya

    2016-01-01

    The Constant-Murley score (CMS) is widely used to evaluate disabilities associated with shoulder injuries, but it has been criticized for relying on imprecise terminology and a lack of standardized methodology. A modified guideline, therefore, was published in 2008 with several recommendations. This new version has not yet been translated or culturally adapted for Turkish-speaking populations. The purpose of this study was to translate and cross-culturally adapt the modified CMS and its test protocol, as well as define and measure its reliability and validity. The modified CMS was translated into Turkish, consistent with published methodological guidelines. The measurement properties of the Turkish version of the modified CMS were tested in 30 patients (12 males, 18 females; mean age: 59.5±13.5 years) with a variety of shoulder pathologies. Intraclass correlation coefficients (ICC) were used to estimate test-retest reliability. Construct validity was analyzed with the Turkish version of the American Shoulder and Elbow Surgeons (ASES) Standardized Shoulder Assessment Form and Short-Form Health Survey (SF-12). No difficulties were found in the translation process. The Turkish version of the modified CMS showed excellent test-retest reliability (ICC=0.86). The correlation coefficients between the Turkish version of the modified CMS and the ASES, SF-12-physical component score, and SF-12 mental component scores were found to be 0.48, 0.35, and 0.05, respectively. No floor or ceiling effects were found. The translation and cultural adaptation of the modified CMS and its standardized test protocol into Turkish were successful. The Turkish version of the modified CMS has sufficient reliability and validity to measure a variety of shoulder disorders for Turkish-speaking individuals.

  9. Detection and traceability of genetically modified organisms in the food production chain.

    PubMed

    Miraglia, M; Berdal, K G; Brera, C; Corbisier, P; Holst-Jensen, A; Kok, E J; Marvin, H J P; Schimmel, H; Rentsch, J; van Rie, J P P F; Zagon, J

    2004-07-01

Both labelling and traceability of genetically modified organisms are current issues that are considered in trade and regulation. Currently, labelling of genetically modified foods containing detectable transgenic material is required by EU legislation. A proposed package of legislation would extend this labelling to foods without any traces of transgenics. This new legislation would also impose labelling and a traceability system based on documentation throughout the food and feed manufacture system. The regulatory issues of risk analysis and labelling are currently harmonised by Codex Alimentarius. The implementation and maintenance of the regulations necessitate sampling protocols and analytical methodologies that allow for accurate determination of the content of genetically modified organisms within a food and feed sample. Current methodologies for the analysis of genetically modified organisms focus on one of two targets: the transgenic DNA inserted into, or the novel protein(s) expressed by, a genetically modified product. For most DNA-based detection methods, the polymerase chain reaction is employed. Items that need consideration in the use of DNA-based detection methods include specificity, sensitivity, matrix effects, internal reference DNA, availability of external reference materials, hemizygosity versus homozygosity, extrachromosomal DNA, and international harmonisation. For most protein-based methods, enzyme-linked immunosorbent assays with antibodies binding the novel protein are employed. Consideration should be given to the selection of the antigen bound by the antibody, accuracy, validation, and matrix effects. Currently, validation of detection methods for analysis of genetically modified organisms is taking place. In addition, new methodologies are being developed, including the use of microarrays, mass spectrometry, and surface plasmon resonance. 
Challenges for GMO detection include the detection of transgenic material in materials with varying chromosome numbers. The existing and proposed regulatory EU requirements for traceability of genetically modified products fit within a broader tendency towards traceability of foods in general and, commercially, towards products that can be distinguished from each other. Traceability systems document the history of a product and may serve the purpose of both marketing and health protection. In this framework, segregation and identity preservation systems allow for the separation of genetically modified and non-modified products from "farm to fork". Implementation of these systems comes with specific technical requirements for each particular step of the food processing chain. In addition, the feasibility of traceability systems depends on a number of factors, including unique identifiers for each genetically modified product, detection methods, permissible levels of contamination, and financial costs. In conclusion, progress has been achieved in the field of sampling, detection, and traceability of genetically modified products, while some issues remain to be solved. For success, much will depend on the threshold level for adventitious contamination set by legislation. Copyright 2004 Elsevier Ltd.

  10. Modified SEAGULL

    NASA Technical Reports Server (NTRS)

    Salas, M. D.; Kuehn, M. S.

    1994-01-01

The original version of the program was incorporated into program SRGULL (LEW-15093) for use on the National Aero-Space Plane project, its duty being to model the forebody, inlet, and nozzle portions of the vehicle. However, real-gas chemistry effects in hypersonic flow fields limited the accuracy of that version, because it assumed perfect-gas properties. As a result, SEAGULL was modified according to real-gas equilibrium-chemistry methodology. This program analyzes two-dimensional, hypersonic flows of real gases. The modified version of SEAGULL maintains as much of the original program as possible, and retains the ability to execute the original perfect-gas version.

  11. The Statistical point of view of Quality: the Lean Six Sigma methodology

    PubMed Central

    Viti, Andrea; Terzi, Alberto

    2015-01-01

Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures; therefore, its use in the health-care arena has focused mainly on areas of business operations, throughput, and case management, and on efficiency outcomes. After reviewing the methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method in the reduction of complications during and after lobectomies. Using Lean Six Sigma methodology, multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could result in a measurement, from a statistical point of view, of surgical quality. PMID:25973253

  12. The Statistical point of view of Quality: the Lean Six Sigma methodology.

    PubMed

    Bertolaccini, Luca; Viti, Andrea; Terzi, Alberto

    2015-04-01

Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures; therefore, its use in the health-care arena has focused mainly on areas of business operations, throughput, and case management, and on efficiency outcomes. After reviewing the methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method in the reduction of complications during and after lobectomies. Using Lean Six Sigma methodology, multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could result in a measurement, from a statistical point of view, of surgical quality.
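
The "sigma" vocabulary behind these records maps a defect rate to a process capability level, conventionally with a 1.5-sigma long-term shift, so 3.4 defects per million opportunities corresponds to the eponymous six sigma. A stdlib-only sketch of that conversion:

```python
from statistics import NormalDist

# Convert defects per million opportunities (DPMO) to a sigma level,
# using the conventional 1.5-sigma long-term shift.
def sigma_level(dpmo):
    return NormalDist().inv_cdf(1.0 - dpmo / 1_000_000) + 1.5

level = sigma_level(3.4)          # the canonical "six sigma" process
baseline = sigma_level(66_807)    # a typical ~3-sigma starting point
```

In a surgical-quality context, "defects" would be complications per procedure opportunity, which is how a statistical capability number can be attached to a clinical process.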

  13. Analysis of crack initiation and growth in the high level vibration test at Tadotsu

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kassir, M.K.; Park, Y.J.; Hofmayer, C.H.

    1993-08-01

The High Level Vibration Test data are used to assess the accuracy and usefulness of current engineering methodologies for predicting crack initiation and growth in a cast stainless steel pipe elbow under complex, large amplitude loading. The data were obtained by testing at room temperature a large scale modified model of one loop of a PWR primary coolant system at the Tadotsu Engineering Laboratory in Japan. Fatigue crack initiation time is reasonably predicted by applying a modified local strain approach (Coffin-Mason-Goodman equation) in conjunction with Miner's rule of cumulative damage. Three fracture mechanics methodologies are applied to investigate the crack growth behavior observed in the hot leg of the model. These are: the ΔK methodology (Paris law), ΔJ concepts and a recently developed limit load stress-range criterion. The report includes a discussion on the pros and cons of the analysis involved in each of the methods, the role played by the key parameters influencing the formulation and a comparison of the results with the actual crack growth behavior observed in the vibration test program. Some conclusions and recommendations for improvement of the methodologies are also provided.
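
The ΔK methodology named in this record predicts growth per cycle from the stress-intensity range, da/dN = C(ΔK)^m, which can be integrated numerically over crack length. A sketch with illustrative constants; C, m, the geometry factor, and the stress range below are invented, not values from the report:

```python
import math

# Paris-law crack growth, da/dN = C * (dK)^m, integrated step by step
# for a through-crack with dK = Y * dS * sqrt(pi * a).
def cycles_to_grow(a0, af, C, m, dstress, Y=1.0, da=1e-5):
    """Cycles to grow a crack from a0 to af (lengths in m, stress in MPa)."""
    a, N = a0, 0.0
    while a < af:
        dK = Y * dstress * math.sqrt(math.pi * a)   # MPa*sqrt(m)
        N += da / (C * dK ** m)                     # cycles spent on step da
        a += da
    return N

# Grow a 2 mm crack to 10 mm under a 100 MPa stress range
N = cycles_to_grow(a0=0.002, af=0.010, C=1e-11, m=3.0, dstress=100.0)
```

Because dK grows with the square root of crack length, most of the life is consumed while the crack is still short, which the step-by-step sum makes explicit.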

  14. HRR Upgrade to mass loss calorimeter and modified Schlyter test for FR Wood

    Treesearch

    Mark A. Dietenberger; Charles R. Boardman

    2013-01-01

    Enhanced Heat Release Rate (HRR) methodology has been extended to the Mass Loss Calorimeter (MLC) and the Modified Schlyter flame spread test to evaluate fire retardant effectiveness used on wood based materials. Modifications to MLC include installation of thermopile on the chimney walls to correct systematic errors to the sensible HRR calculations to account for...

  15. Early Detection for Dengue Using Local Indicator of Spatial Association (LISA) Analysis.

    PubMed

    Parra-Amaya, Mayra Elizabeth; Puerta-Yepes, María Eugenia; Lizarralde-Bejarano, Diana Paola; Arboleda-Sánchez, Sair

    2016-03-29

Dengue is a viral disease caused by a flavivirus that is transmitted by mosquitoes of the genus Aedes. There is currently no specific treatment or commercial vaccine for its control and prevention; therefore, mosquito population control is the only alternative for preventing the occurrence of dengue. For this reason, entomological surveillance is recommended by the World Health Organization (WHO) to measure dengue risk in endemic areas; however, several works have shown that the current methodology (aedic indices) is not sufficient for predicting dengue. In this work, we modified indices proposed for epidemic periods. The raw value of the epidemiological wave could be useful for detecting risk in epidemic periods; however, risk can only be detected if analyses incorporate the maximum epidemiological wave. Risk classification was performed according to Local Indicators of Spatial Association (LISA) methodology. The modified indices were analyzed using several hypothetical scenarios to evaluate their sensitivity. We found that modified indices could detect spatial and differential risks in epidemic and endemic years, which makes them a useful tool for the early detection of a dengue outbreak. In conclusion, the modified indices could predict risk at the spatio-temporal level in endemic years and could be incorporated in surveillance activities in endemic places.
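
The statistic underlying LISA risk classification is local Moran's I: each location's standardized value times the weighted average of its neighbors' values, so "high-high" pairs flag hot spots. A toy NumPy example on four districts along a transect with row-standardized chain weights (the case counts and weights are invented, not the study's data):

```python
import numpy as np

# Local Moran's I: I_i = z_i * sum_j w_ij * z_j, with z standardized.
# Four districts in a line; each district's neighbors are its immediate
# neighbors on the transect, weights row-standardized.
x = np.array([90.0, 80.0, 10.0, 5.0])      # case counts: two hot, two cold
W = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.0, 0.0, 1.0, 0.0]])

z = (x - x.mean()) / x.std()
I = z * (W @ z)                            # local Moran's I per district
hot = (I > 0) & (z > 0)                    # "high-high": candidate hot spots
```

All four I values are positive here (like values cluster with like), but only the two high-count districts qualify as high-high hot spots; significance would normally be assessed by permutation.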

  16. Development of a Valid and Reliable Knee Articular Cartilage Condition-Specific Study Methodological Quality Score.

    PubMed

    Harris, Joshua D; Erickson, Brandon J; Cvetanovich, Gregory L; Abrams, Geoffrey D; McCormick, Frank M; Gupta, Anil K; Verma, Nikhil N; Bach, Bernard R; Cole, Brian J

    2014-02-01

    Condition-specific questionnaires are important components in the evaluation of outcomes of surgical interventions. No condition-specific study methodological quality questionnaire exists for evaluating outcomes of articular cartilage surgery in the knee. To develop a reliable and valid knee articular cartilage-specific study methodological quality questionnaire. Cross-sectional study. A stepwise, a priori-designed framework was created for the development of a novel questionnaire. Items relevant to the topic were identified and extracted from a recent systematic review of 194 investigations of knee articular cartilage surgery. In addition, relevant items from existing generic study methodological quality questionnaires were identified. Items for a preliminary questionnaire were generated. Redundant and irrelevant items were eliminated, and acceptable items were modified. The instrument was pretested and the items were weighted. The instrument, the MARK score (Methodological quality of ARticular cartilage studies of the Knee), was tested for validity (criterion validity) and reliability (inter- and intraobserver). A 19-item, 3-domain MARK score was developed. The 100-point scale score demonstrated face validity (focus group of 8 orthopaedic surgeons) and criterion validity (strong correlation with the Cochrane Quality Assessment score and the Modified Coleman Methodology Score). Interobserver reliability for the overall score was good (intraclass correlation coefficient [ICC], 0.842), and for all individual items of the MARK score, acceptable to perfect (ICC, 0.70-1.000). Intraobserver reliability assessed over a 3-week interval was strong for 2 reviewers (ICC ≥0.90). The MARK score is a valid and reliable knee articular cartilage condition-specific study methodological quality instrument. This condition-specific questionnaire may be used to evaluate the quality of studies reporting outcomes of articular cartilage surgery in the knee.

  17. Investigation of Science Inquiry Items for Use on an Alternate Assessment Based on Modified Achievement Standards Using Cognitive Lab Methodology

    ERIC Educational Resources Information Center

    Dickenson, Tammiee S.; Gilmore, Joanna A.; Price, Karen J.; Bennett, Heather L.

    2013-01-01

    This study evaluated the benefits of item enhancements applied to science-inquiry items for incorporation into an alternate assessment based on modified achievement standards for high school students. Six items were included in the cognitive lab sessions involving both students with and without disabilities. The enhancements (e.g., use of visuals,…

  18. Evaluation of a new approach to compute intervertebral disc height measurements from lateral radiographic views of the spine.

    PubMed

    Allaire, Brett T; DePaolis Kaluza, M Clara; Bruno, Alexander G; Samelson, Elizabeth J; Kiel, Douglas P; Anderson, Dennis E; Bouxsein, Mary L

    2017-01-01

    Current standard methods to quantify disc height, namely distortion compensated Roentgen analysis (DCRA), have been mostly utilized in the lumbar and cervical spine and have strict exclusion criteria. Specifically, discs adjacent to a vertebral fracture are excluded from measurement, thus limiting the use of DCRA in studies that include older populations with a high prevalence of vertebral fractures. Thus, we developed and tested a modified DCRA algorithm that does not depend on vertebral shape. Participants included 1186 men and women from the Framingham Heart Study Offspring and Third Generation Multidetector CT Study. Lateral CT scout images were used to place 6 morphometry points around each vertebra at 13 vertebral levels in each participant. Disc heights were calculated utilizing these morphometry points using DCRA methodology and our modified version of DCRA, which requires information from fewer morphometry points than the standard DCRA. Modified DCRA and standard DCRA measures of disc height are highly correlated, with concordance correlation coefficients above 0.999. Both measures demonstrate good inter- and intra-operator reproducibility. 13.9 % of available disc heights were not evaluable or excluded using the standard DCRA algorithm, while only 3.3 % of disc heights were not evaluable using our modified DCRA algorithm. Using our modified DCRA algorithm, it is not necessary to exclude vertebrae with fracture or other deformity from disc height measurements as in the standard DCRA. Modified DCRA also yields identical measurements to the standard DCRA. Thus, the use of modified DCRA for quantitative assessment of disc height will lead to less missing data without any loss of accuracy, making it a preferred alternative to the current standard methodology.
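As an illustration only (the published DCRA algorithm involves distortion compensation that is not reproduced here), a disc height can be proxied from paired endplate landmark points as below; all coordinates are hypothetical:

```python
import numpy as np

def disc_height(lower_endplate_upper_vb, upper_endplate_lower_vb):
    """Crude disc-height estimate from paired endplate landmarks.

    Each argument is a (k, 2) array of (x, y) morphometry points along one
    endplate (e.g. anterior, middle, posterior).  Illustrative proxy only,
    not the published DCRA distortion-compensation algorithm.
    """
    a = np.asarray(lower_endplate_upper_vb, float)
    b = np.asarray(upper_endplate_lower_vb, float)
    gaps = np.linalg.norm(a - b, axis=1)   # point-wise endplate separation
    depth = np.linalg.norm(a[-1] - a[0])   # anterior-posterior vertebral depth
    return gaps.mean() / depth             # dimensionless, size-normalized

# Hypothetical landmarks (mm): inferior endplate of the upper vertebra vs.
# superior endplate of the lower vertebra.
upper = [(0.0, 10.0), (20.0, 9.5), (40.0, 10.0)]
lower = [(0.0, 4.0), (20.0, 3.5), (40.0, 4.0)]
h = disc_height(upper, lower)   # mean 6 mm gap / 40 mm depth = 0.15
```

The abstract's point is that the modified algorithm needs fewer of the six morphometry points per vertebra, so discs next to fractured (deformed) vertebrae remain measurable.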

  19. A Diagnostic Model for Dementia in Clinical Practice-Case Methodology Assisting Dementia Diagnosis.

    PubMed

    Londos, Elisabet

    2015-04-02

    Dementia diagnosis is important for many different reasons: firstly, to separate dementia, or major neurocognitive disorder, from MCI (mild cognitive impairment), or mild neurocognitive disorder; secondly, to define the specific underlying brain disorder to aid treatment, prognosis and decisions regarding care needs and assistance. The diagnostic method of dementias is a puzzle of different data pieces to be fitted together in the best possible way to reach a clinical diagnosis. Using a modified case methodology concept, risk factors affecting cognitive reserve and symptoms constituting the basis of the brain damage hypothesis can be visualized, balanced and reflected against test results as well as structural and biochemical markers. The model's origin is the case method initially described at Harvard Business School, here modified to serve dementia diagnostics.

  20. A Diagnostic Model for Dementia in Clinical Practice—Case Methodology Assisting Dementia Diagnosis

    PubMed Central

    Londos, Elisabet

    2015-01-01

    Dementia diagnosis is important for many different reasons: firstly, to separate dementia, or major neurocognitive disorder, from MCI (mild cognitive impairment), or mild neurocognitive disorder; secondly, to define the specific underlying brain disorder to aid treatment, prognosis and decisions regarding care needs and assistance. The diagnostic method of dementias is a puzzle of different data pieces to be fitted together in the best possible way to reach a clinical diagnosis. Using a modified case methodology concept, risk factors affecting cognitive reserve and symptoms constituting the basis of the brain damage hypothesis can be visualized, balanced and reflected against test results as well as structural and biochemical markers. The model's origin is the case method initially described at Harvard Business School, here modified to serve dementia diagnostics. PMID:26854146

  1. Genetic Modifiers and Oligogenic Inheritance

    PubMed Central

    Kousi, Maria; Katsanis, Nicholas

    2015-01-01

    Despite remarkable progress in the identification of mutations that drive genetic disorders, progress in understanding the effect of genetic background on the penetrance and expressivity of causal alleles has been modest, in part because of the methodological challenges in identifying genetic modifiers. Nonetheless, the progressive discovery of modifier alleles has improved both our interpretative ability and our analytical tools to dissect such phenomena. In this review, we analyze the genetic properties and behaviors of modifiers as derived from studies in patient populations and model organisms and we highlight conceptual and technological tools used to overcome some of the challenges inherent in modifier mapping and cloning. Finally, we discuss how the identification of these modifiers has facilitated the elucidation of biological pathways and holds the potential to improve the clinical predictive value of primary causal mutations and to develop novel drug targets. PMID:26033081

  2. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
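The two loss-function families named in the abstract can be sketched roughly as below. The constants, targets and process deviations are hypothetical, and the exact revised-Taguchi and modified inverted-normal forms used by the authors may differ:

```python
import math

def inverted_normal_loss(y, target, max_loss, sigma):
    """Inverted-normal-style loss: zero at target, saturating at max_loss
    for large deviations (suited to losses that are bounded by, e.g., the
    cost of total equipment loss)."""
    return max_loss * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * sigma ** 2)))

def quadratic_loss(y, target, k):
    """Taguchi-style quadratic loss for deviations near the target."""
    return k * (y - target) ** 2

# Hypothetical scenario: a pressure deviation priced with the bounded
# inverted-normal loss, a temperature deviation with the quadratic loss;
# the total economic consequence integrates (here: sums) the two.
loss_pressure = inverted_normal_loss(y=11.0, target=10.0, max_loss=5e5, sigma=2.0)
loss_temp = quadratic_loss(y=305.0, target=300.0, k=100.0)
total_loss = loss_pressure + loss_temp
```

The bounded form matters for risk assessment: however far the process deviates, the priced loss cannot exceed the stated maximum consequence.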

  3. Modeling of biosorption of Cu(II) by alkali-modified spent tea leaves using response surface methodology (RSM) and artificial neural network (ANN)

    NASA Astrophysics Data System (ADS)

    Ghosh, Arpita; Das, Papita; Sinha, Keka

    2015-06-01

    In the present work, spent tea leaves were modified with Ca(OH)2 and used as a new, non-conventional and low-cost biosorbent for the removal of Cu(II) from aqueous solution. Response surface methodology (RSM) and an artificial neural network (ANN) were used to develop predictive models for simulation and optimization of the biosorption process. The influence of process parameters (pH, biosorbent dose and reaction time) on the biosorption efficiency was investigated through a two-level three-factor (2³) full factorial central composite design with the help of Design Expert. The same design was also used to obtain a training set for the ANN. Finally, both modeling methodologies were statistically compared by the root mean square error and the absolute average deviation based on the validation data set. The results suggest that RSM has better prediction performance than ANN. The biosorption followed the Langmuir adsorption isotherm and pseudo-second-order kinetics. The optimum removal efficiency of the adsorbent was found to be 96.12%.
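The RMSE/AAD comparison of the two models can be illustrated with a small sketch; the validation values below are invented for demonstration:

```python
import numpy as np

def rmse(observed, predicted):
    """Root mean square error between observed and predicted values."""
    observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
    return float(np.sqrt(np.mean((observed - predicted) ** 2)))

def aad_percent(observed, predicted):
    """Absolute average deviation as a percentage of the observed values."""
    observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
    return float(np.mean(np.abs((observed - predicted) / observed)) * 100.0)

# Hypothetical validation set: measured Cu(II) removal (%) vs. two models.
measured  = [92.1, 88.4, 95.0, 90.2]
rsm_pred  = [91.5, 89.0, 94.2, 90.8]
ann_pred  = [89.0, 85.1, 97.9, 93.5]
# The model with the lower RMSE and AAD on held-out data is preferred,
# which is the basis of the abstract's RSM-vs-ANN conclusion.
```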

  4. Early Detection for Dengue Using Local Indicator of Spatial Association (LISA) Analysis

    PubMed Central

    Parra-Amaya, Mayra Elizabeth; Puerta-Yepes, María Eugenia; Lizarralde-Bejarano, Diana Paola; Arboleda-Sánchez, Sair

    2016-01-01

    Dengue is a viral disease caused by a flavivirus that is transmitted by mosquitoes of the genus Aedes. There is currently no specific treatment or commercial vaccine for its control and prevention; therefore, mosquito population control is the only alternative for preventing the occurrence of dengue. For this reason, entomological surveillance is recommended by the World Health Organization (WHO) to measure dengue risk in endemic areas; however, several studies have shown that the current methodology (aedic indices) is not sufficient for predicting dengue. In this work, we modified indices proposed for epidemic periods. The raw value of the epidemiological wave could be useful for detecting risk in epidemic periods; however, risk can only be detected if analyses incorporate the maximum epidemiological wave. Risk classification was performed according to the Local Indicators of Spatial Association (LISA) methodology. The modified indices were analyzed using several hypothetical scenarios to evaluate their sensitivity. We found that the modified indices could detect spatial and differential risks in epidemic and endemic years, which makes them a useful tool for the early detection of a dengue outbreak. In conclusion, the modified indices could predict risk at the spatio-temporal level in endemic years and could be incorporated into surveillance activities in endemic places. PMID:28933396

  5. A modified eco-efficiency framework and methodology for advancing the state of practice of sustainability analysis as applied to green infrastructure.

    PubMed

    Ghimire, Santosh R; Johnston, John M

    2017-09-01

    We propose a modified eco-efficiency (EE) framework and novel sustainability analysis methodology for green infrastructure (GI) practices used in water resource management. Green infrastructure practices such as rainwater harvesting (RWH), rain gardens, porous pavements, and green roofs are emerging as viable strategies for climate change adaptation. The modified framework includes 4 economic, 11 environmental, and 3 social indicators. Using 6 indicators from the framework, at least 1 from each dimension of sustainability, we demonstrate the methodology to analyze RWH designs. We use life cycle assessment and life cycle cost assessment to calculate the sustainability indicators of 20 design configurations as Decision Management Objectives (DMOs). Five DMOs emerged as relatively more sustainable along the EE analysis Tradeoff Line, and we used Data Envelopment Analysis (DEA), a widely applied statistical approach, to quantify the modified EE measures as DMO sustainability scores. We also addressed the subjectivity and sensitivity analysis requirements of sustainability analysis, and we evaluated the performance of 10 weighting schemes that included classical DEA, equal weights, National Institute of Standards and Technology's stakeholder panel, Eco-Indicator 99, Sustainable Society Foundation's Sustainable Society Index, and 5 derived schemes. We improved upon classical DEA by applying the weighting schemes to identify sustainability scores that ranged from 0.18 to 1.0, avoiding the nonuniqueness problem and revealing the least to most sustainable DMOs. Our methodology provides a more comprehensive view of water resource management and is generally applicable to GI and industrial, environmental, and engineered systems to explore the sustainability space of alternative design configurations. Integr Environ Assess Manag 2017;13:821-831. Published 2017. This article is a US Government work and is in the public domain in the USA. 
Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of the Society of Environmental Toxicology & Chemistry (SETAC).

  6. Climate change and forest trees in the Pacific Northwest: guide to vulnerability assessment methodology

    Treesearch

    W. Devine; C. Aubry; J. Miller; K. Potter; A. Bower

    2012-01-01

    This guide provides a step-by-step description of the methodology used to apply the Forest Tree Genetic Risk Assessment System (ForGRAS; Potter and Crane 2010) to the tree species of the Pacific Northwest in a recent climate change vulnerability assessment (Devine et al. 2012). We describe our modified version of the ForGRAS model, and we review the model’s basic...

  7. Towards a Delamination Fatigue Methodology for Composite Materials

    NASA Technical Reports Server (NTRS)

    OBrien, Thomas K.

    2007-01-01

    A methodology that accounts for both delamination onset and growth in composite structural components is proposed for improved fatigue life prediction to reduce life cycle costs and improve accept/reject criteria for manufacturing flaws. The benefits of using a Delamination Onset Threshold (DOT) approach in combination with a Modified Damage Tolerance (MDT) approach are highlighted. The use of this combined approach to establish accept/reject criteria, requiring less conservative initial manufacturing flaw sizes, is illustrated.

  8. How to apply clinical cases and medical literature in the framework of a modified "failure mode and effects analysis" as a clinical reasoning tool--an illustration using the human biliary system.

    PubMed

    Wong, Kam Cheong

    2016-04-06

    Clinicians use various clinical reasoning tools such as Ishikawa diagram to enhance their clinical experience and reasoning skills. Failure mode and effects analysis, which is an engineering methodology in origin, can be modified and applied to provide inputs into an Ishikawa diagram. The human biliary system is used to illustrate a modified failure mode and effects analysis. The anatomical and physiological processes of the biliary system are reviewed. Failure is defined as an abnormality caused by infective, inflammatory, obstructive, malignancy, autoimmune and other pathological processes. The potential failures, their effect(s), main clinical features, and investigation that can help a clinician to diagnose at each anatomical part and physiological process are reviewed and documented in a modified failure mode and effects analysis table. Relevant medical and surgical cases are retrieved from the medical literature and weaved into the table. A total of 80 clinical cases which are relevant to the modified failure mode and effects analysis for the human biliary system have been reviewed and weaved into a designated table. The table is the backbone and framework for further expansion. Reviewing and updating the table is an iterative and continual process. The relevant clinical features in the modified failure mode and effects analysis are then extracted and included in the relevant Ishikawa diagram. This article illustrates an application of engineering methodology in medicine, and it sows the seeds of potential cross-pollination between engineering and medicine. Establishing a modified failure mode and effects analysis can be a teamwork project or self-directed learning process, or a mix of both. Modified failure mode and effects analysis can be deployed to obtain inputs for an Ishikawa diagram which in turn can be used to enhance clinical experiences and clinical reasoning skills for clinicians, medical educators, and students.

  9. Optimization of Nanocomposite Modified Asphalt Mixtures Fatigue Life using Response Surface Methodology

    NASA Astrophysics Data System (ADS)

    Bala, N.; Napiah, M.; Kamaruddin, I.; Danlami, N.

    2018-04-01

    In this study, the modelling and optimization of polyethylene, polypropylene and nanosilica contents for nanocomposite-modified asphalt mixtures were examined to obtain the optimum quantities for longer fatigue life. Response Surface Methodology (RSM) was applied for the optimization based on a Box-Behnken design (BBD). The interaction effects of the independent variables, polymers and nanosilica, on fatigue life were evaluated. The results indicate that the individual effects of the polymer and nanosilica contents are both important; however, the nanosilica content has a more significant effect on fatigue life resistance. Also, the mean error obtained from the optimization results is less than 5% for all responses, indicating that the predicted values are in agreement with the experimental results. Furthermore, it was concluded that for designing asphalt mixtures with high-performance properties, optimization using RSM is a very effective approach.

  10. Fabrication of gallium hexacyanoferrate modified carbon ionic liquid paste electrode for sensitive determination of hydrogen peroxide and glucose.

    PubMed

    Haghighi, Behzad; Khosravi, Mehdi; Barati, Ali

    2014-07-01

    Gallium hexacyanoferrate (GaHCFe) and graphite powder were homogeneously dispersed into n-dodecylpyridinium hexafluorophosphate and paraffin to fabricate a GaHCFe modified carbon ionic liquid paste electrode (CILPE). A mixture experimental design was employed to optimize the fabrication of the GaHCFe modified CILPE (GaHCFe-CILPE). A pair of well-defined redox peaks due to the one-electron redox reaction of GaHCFe was observed for the fabricated electrode. The fabricated GaHCFe-CILPE exhibited good electrocatalytic activity towards the reduction and oxidation of H2O2. The observed sensitivities for the electrocatalytic oxidation and reduction of H2O2 at operating potentials of +0.8 and -0.2 V were about 13.8 and 18.3 mA M(-1), respectively. The detection limit (S/N=3) for H2O2 was about 1 μM. Additionally, glucose oxidase (GOx) was immobilized on the GaHCFe-CILPE using two methodologies, entrapment in a Nafion matrix and cross-linking with glutaraldehyde and bovine serum albumin, in order to fabricate a glucose biosensor. The linear dynamic range, sensitivity and detection limit for glucose obtained by the biosensor fabricated using the cross-linking methodology were 0.1-6 mM, 0.87 mA M(-1) and 30 μM, respectively, and were better than those obtained (0.2-6 mM, 0.12 mA M(-1) and 50 μM) for the biosensor fabricated using the entrapment methodology. Copyright © 2014 Elsevier B.V. All rights reserved.
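The S/N = 3 detection-limit convention mentioned in the abstract is commonly computed as three times the blank noise divided by the calibration slope; a sketch with hypothetical noise figures of the same order as the reported H2O2 result:

```python
def detection_limit(noise_sd, sensitivity, snr=3.0):
    """Detection limit under the S/N = snr convention.

    noise_sd    : standard deviation of the blank signal (A)
    sensitivity : calibration slope (A per M)
    Returns the concentration (M) whose signal is snr times the noise.
    """
    return snr * noise_sd / sensitivity

# Assumed nA-level baseline noise with the reported ~18.3 mA/M sensitivity
# yields a micromolar detection limit, consistent with the abstract's ~1 μM.
lod = detection_limit(noise_sd=6.1e-9, sensitivity=18.3e-3)   # = 1.0e-6 M
```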

  11. Wild worm embryogenesis harbors ubiquitous polygenic modifier variation.

    PubMed

    Paaby, Annalise B; White, Amelia G; Riccardi, David D; Gunsalus, Kristin C; Piano, Fabio; Rockman, Matthew V

    2015-08-22

    Embryogenesis is an essential and stereotypic process that nevertheless evolves among species. Its essentiality may favor the accumulation of cryptic genetic variation (CGV) that has no effect in the wild-type but that enhances or suppresses the effects of rare disruptions to gene function. Here, we adapted a classical modifier screen to interrogate the alleles segregating in natural populations of Caenorhabditis elegans: we induced gene knockdowns and used quantitative genetic methodology to examine how segregating variants modify the penetrance of embryonic lethality. Each perturbation revealed CGV, indicating that wild-type genomes harbor myriad genetic modifiers that may have little effect individually but which in aggregate can dramatically influence penetrance. Phenotypes were mediated by many modifiers, indicating high polygenicity, but the alleles tend to act very specifically, indicating low pleiotropy. Our findings demonstrate the extent of conditional functionality in complex trait architecture.

  12. General implementation of arbitrary nonlinear quadrature phase gates

    NASA Astrophysics Data System (ADS)

    Marek, Petr; Filip, Radim; Ogawa, Hisashi; Sakaguchi, Atsushi; Takeda, Shuntaro; Yoshikawa, Jun-ichi; Furusawa, Akira

    2018-02-01

    We propose a general methodology for a deterministic single-mode quantum interaction that nonlinearly modifies a single quadrature variable of a continuous-variable system. The methodology is based on linear coupling of the system to ancillary systems that are subsequently measured by quadrature detectors. The nonlinear interaction is obtained by using the data from the quadrature detection for dynamical manipulation of the coupling parameters. This measurement-induced methodology enables direct realization of arbitrary nonlinear quadrature interactions without the need to construct them from the lowest-order gates. Such nonlinear interactions are crucial for more practical and efficient manipulation of continuous quadrature variables as well as of qubits encoded in continuous-variable systems.

  13. Adaptation of EVIAVE methodology for monitoring and follow-up when evaluating the environmental impact of landfills

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arrieta, Gabriela, E-mail: tonina1903@hotmail.com; Requena, Ignacio, E-mail: requena@decsai.ugr.es; Toro, Javier, E-mail: jjtoroca@unal.edu.co

    Treatment and final disposal of Municipal Solid Waste can play a significant role in the generation of negative environmental impacts. As a prevention strategy, such activities are subjected to the process of Environmental Impact Assessment (EIA). Still, the follow-up of Environmental Management Plans or mitigation measures is limited, partly due to a lack of methodological approaches. In search of possibilities, the University of Granada (Spain) developed a diagnostic methodology named EVIAVE, which allows one to quantify, by means of indexes, the environmental impact of landfills in view of their location and the conditions of exploitation. EVIAVE is applicable within the legal framework of the European Union and can be adapted to the environmental and legal conditions of other countries. This study entails its adaptation in Colombia, for the follow-up and control of the EIA process for landfills. Modifications involved the inclusion of the environmental elements flora and fauna, and the evaluation of the environmental descriptors in agreement with the concept of vulnerability. The application of the modified EVIAVE in Colombian landfills allowed us to identify the elements affected by the operating conditions and maintenance. It may be concluded that this methodology is viable and effective for the follow-up and environmental control of EIA processes for landfills, and for analyzing the associated risks, as it takes into account related environmental threats and vulnerabilities. - Highlights: • A modified methodology is used to monitor and follow up environmental impacts in landfills. • The improved methodology includes the vulnerability of flora and fauna to evaluate the environmental impact of landfills. • The methodology serves to identify and evaluate the sources of risk generated in the construction and siting of landfills. • Environmental vulnerability indicators improve the effectiveness of the control and follow-up phases of landfill management. • The follow-up of environmental management plans may help diminish the implementation gap in Environmental Impact Assessment.

  14. Distal biceps brachii tendon repair: a systematic review of patient outcome determination using modified Coleman methodology score criteria.

    PubMed

    Nyland, John; Causey, Brandon; Wera, Jeff; Krupp, Ryan; Tate, David; Gupta, Amit

    2017-07-01

    This systematic literature review evaluated the methodological research design quality of studies that evaluated patient outcomes following distal biceps brachii tendon repair and developed evidence-based recommendations for future patient clinical outcomes research. Following the preferred reporting items for systematic reviews and meta-analyses (PRISMA) criteria, and using the search terms "biceps brachii", "tendon", "repair" and "outcome assessment", the CINAHL, Academic Search Premier and MEDLINE databases were searched from January 1960 to October 2015. The modified Coleman methodology score (MCMS) served as the primary outcome measure. Descriptive statistical analysis was performed for composite and component MCMS and for the frequency of use of patient outcome assessment methodologies. A total of 93 studies were evaluated. The overall MCMS was low (57.1 ± 14). Only 12 (12.9%) had prospective cohort or randomized controlled trial designs. There was a moderate relationship between publication year and MCMS (r = 0.53, P < 0.0001). Although 61 studies (65.6%) had adequate surgical descriptions, only 3 (3.2%) had well-described rehabilitation. Of 2253 subjects, only 39 (1.7%) were women. Studies published after 2008 had higher MCMS scores than studies published earlier (61.3 ± 10 versus 52.9 ± 16, P = 0.003). Although overall methodological scores have improved on average since 2008, generally low MCMS scores, retrospective designs, lack of eccentric elbow flexor or supinator strength testing, and poorly described surgical and rehabilitation protocols remain commonplace. These findings decrease clinical study validity and generalizability. Level of evidence: III.

  15. A methodology to modify land uses in a transit oriented development scenario.

    PubMed

    Sahu, Akshay

    2018-05-01

    Developing nations are adopting transit oriented development (TOD) strategies to decongest their transportation systems. These strategies are often adopted after the preparation of land use plans. The goal of this study was to build a methodology to modify these land uses using soft computing, which can help to generate alternate land use plans relevant to TOD. The methodology incorporates TOD characteristics and objectives. Global TOD parameters (density, diversity, and distance to transit) were studied. Expert opinions gave weights and ranges for the parameters in an Indian TOD scenario. Rules to allocate land uses were developed, and objective functions were defined. Four objectives were used: first, to maximize employment density, residential density and the percentage of mixed land use; second, to shape density and diversity with respect to distance; third, to minimize the degree of land use change; and fourth, to increase the compactness of the land use allocation. The methodology was applied to two sectors of Naya Raipur, the new planned administrative capital of the state of Chhattisgarh, India. The city has implemented TOD in the form of a Bus Rapid Transit System (BRTS) over an existing land use. One thousand random plans were generated through the methodology. The top 30 plans were selected as the parent population for modification through a genetic algorithm (GA). Alternate plans were generated at the end of the GA cycle. The best alternate plan was compared with successful BRTS and TOD land uses for its merits and demerits. It was also compared with the initial land use plan for empirical validation. Copyright © 2017 Elsevier Ltd. All rights reserved.
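A genetic-algorithm loop of the kind described (a parent population of plans evolved into alternates) might be sketched as follows. The land-use categories and the single toy objective are placeholders for the paper's four weighted objectives:

```python
import random

LAND_USES = ["residential", "commercial", "mixed", "open"]

def fitness(plan, target_mixed=0.3):
    """Toy objective: reward plans whose share of mixed land use is near a
    target, standing in for the paper's density/diversity objectives."""
    share = plan.count("mixed") / len(plan)
    return -abs(share - target_mixed)

def crossover(a, b):
    """Single-point crossover of two land-use plans."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(plan, rate=0.05):
    """Randomly reassign a small fraction of parcels."""
    return [random.choice(LAND_USES) if random.random() < rate else u for u in plan]

def evolve(parents, generations=50, pop_size=30):
    pop = list(parents)
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 3]                  # elitist selection
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

random.seed(1)
parents = [[random.choice(LAND_USES) for _ in range(40)] for _ in range(30)]
best = evolve(parents)   # best alternate plan found by the GA cycle
```

Elitism (carrying the top third forward unchanged) guarantees the best plan never gets worse across generations, which is a common design choice in land-use GAs.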

  16. Development of a competency framework for optometrists with a specialist interest in glaucoma.

    PubMed

    Myint, J; Edgar, D F; Kotecha, A; Crabb, D P; Lawrenson, J G

    2010-09-01

    To develop a competency framework, using a modified Delphi methodology, for optometrists with a specialist interest in glaucoma, which would provide a basis for training and accreditation. A modified iterative Delphi technique was employed with a 16-member panel consisting almost exclusively of sub-specialist optometrists and ophthalmologists. The first round involved scoring the relevance of a draft series of competencies on a 9-point Likert scale, with a free-text option to modify any competency or suggest additional competencies. The revised framework was subjected to a second round of scoring and free-text comment. The Delphi process was followed by a face-to-face structured workshop to debate and agree the final framework. The version of the framework agreed at the workshop was sent out for a 4-month period of external stakeholder validation. There was a 100% response to round 1 and a 94% response to round 2. All panel members attended the workshop. The final version of the competency framework was validated by a subsequent stakeholder consultation and contained 19 competencies for the diagnosis of glaucoma and 7 further competencies for monitoring and treatment. Application of a consensus methodology consisting of a modified Delphi technique allowed the development of a competency framework for glaucoma specialisation by optometrists. This will help to shape the development of a speciality curriculum and could potentially be adapted for other healthcare professionals.
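A plausible way to summarize one Delphi round of 9-point Likert scores is sketched below; the 7-9 "agree" band, the 75% agreement rule and the ratings are assumptions for illustration, not the panel's actual consensus criteria:

```python
from statistics import median

def delphi_round_summary(scores, agree_band=(7, 9)):
    """Summarize one Delphi round for a single competency statement.

    scores: 9-point Likert ratings from the panel.  Assumed retention rule:
    median inside the agree band and at least 75% of ratings within it.
    """
    lo, hi = agree_band
    in_band = sum(lo <= s <= hi for s in scores) / len(scores)
    med = median(scores)
    return {"median": med,
            "agreement": in_band,
            "retain": lo <= med <= hi and in_band >= 0.75}

# Hypothetical ratings from a 16-member panel for one draft competency:
panel = [8, 9, 7, 8, 9, 7, 8, 6, 9, 8, 7, 9, 8, 7, 8, 9]
summary = delphi_round_summary(panel)   # high median, strong agreement
```

Competencies failing the rule would be reworded from the free-text comments and rescored in the next round.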

  17. Modifiable worker risk factors contributing to workplace absence: a stakeholder-centred best-evidence synthesis of systematic reviews.

    PubMed

    Wagner, Shannon; White, Marc; Schultz, Izabela; Murray, Eleanor; Bradley, Susan M; Hsu, Vernita; McGuire, Lisa; Schulz, Werner

    2014-01-01

    A challenge facing stakeholders is the identification and translation of relevant high-quality research to inform policy and practice. This study engaged academic and community stakeholders in conducting a best-evidence synthesis to identify modifiable worker risk and protective factors across health conditions impacting work-related absence. To identify modifiable worker disability risk and protective factors across common health conditions impacting work-related absence. We searched Medline, Embase, CINAHL, The Cochrane Library, PsycINFO, BusinessSourceComplete, and ABI/Inform from 2000 to 2011. Quantitative, qualitative, or mixed-methods systematic reviews of work-focused populations were considered for inclusion. Two or more reviewers independently reviewed articles for inclusion and methodological screening. The search strategy, expert input and grey literature identified 2,467 unique records. One hundred and forty-two full-text articles underwent comprehensive review. Twenty-four systematic reviews met the eligibility criteria. Modifiable worker factors found to have consistent evidence across two or more health conditions included emotional distress, negative enduring psychology/personality factors, negative health and disability perception, decreased physical activity, lack of family support, poor general health, increased functional disability, increased pain, increased fatigue and lack of motivation to return to work. Systematic reviews are limited by the availability of high-quality studies, a lack of consistency in methodological screening and reporting, and the variability of the outcome measures used.

  18. Synthesis of low-cost adsorbent from rice bran for the removal of reactive dye based on the response surface methodology

    NASA Astrophysics Data System (ADS)

    Hong, Gui-Bing; Wang, Yi-Kai

    2017-11-01

    Rice bran is a major by-product of the rice milling industry and is abundant in Taiwan. This study proposed a simple method for modifying rice bran to make it a low-cost adsorbent to remove reactive blue 4 (RB4) from aqueous solutions. The effects of independent variables such as dye concentration (100-500 ppm), adsorbent dosage (20-120 mg) and temperature (30-60 °C) on the dye adsorption capacity of the modified rice bran adsorbent were investigated by using the response surface methodology (RSM). The results showed that the maximum dye adsorption capacity of the modified rice bran adsorbent was 151.3 mg g-1 at a dye concentration of 500 ppm, adsorbent dosage of 65.36 mg, and temperature of 60 °C. The adsorption kinetics data followed the pseudo-second-order kinetic model, and the isotherm data fit the Langmuir isotherm model well. The maximum monolayer adsorption capacity was 178.57-185.19 mg g-1, which was comparable to that of other agricultural waste adsorbents used to remove RB4 from aqueous solutions in the literature. The thermodynamics analysis results indicated that the adsorption of RB4 onto the modified rice bran adsorbent is an endothermic, spontaneous monolayer adsorption that occurs through a physical process.
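    The Langmuir fit mentioned above is typically done on the linearised form Ce/qe = 1/(qm·KL) + Ce/qm. The sketch below recovers qm and KL by least squares from synthetic equilibrium data generated with qm = 185 mg/g and KL = 0.05 L/mg; these values and the Ce points are illustrative, not data from the paper.

```python
# Synthetic equilibrium data (Ce in mg/L, qe in mg/g), generated from
# an assumed Langmuir isotherm; NOT measured values from the study.
qm_true, KL_true = 185.0, 0.05
Ce = [10, 25, 50, 100, 200, 400]
qe = [qm_true * KL_true * c / (1 + KL_true * c) for c in Ce]

# Linearised Langmuir: Ce/qe = 1/(qm*KL) + Ce/qm  -> a straight line in Ce
x, y = Ce, [c / q for c, q in zip(Ce, qe)]
n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(v * v for v in x)
sxy = sum(a * b for a, b in zip(x, y))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

qm_fit = 1 / slope            # maximum monolayer capacity, mg/g
KL_fit = slope / intercept    # Langmuir constant, L/mg
print(round(qm_fit, 1), round(KL_fit, 3))
```

    With real data the same regression yields the qm range (178.57-185.19 mg/g) quoted in the abstract; a nonlinear fit is often preferred in practice because linearisation distorts the error structure.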

  19. Preparation of modified semi-coke by microwave heating and adsorption kinetics of methylene blue.

    PubMed

    Wang, Xin; Peng, Jin-Hui; Duan, Xin-Hui; Srinivasakannan, Chandrasekar

    2013-01-01

    Preparation of modified semi-coke from virgin semi-coke has been achieved by microwave heating, using phosphoric acid as the modifying agent. Process optimization using a central composite design (CCD) of the response surface methodology (RSM) technique for the preparation of modified semi-coke is presented in this paper. The optimum conditions for producing modified semi-coke were: concentration of phosphoric acid 2.04, heating time 20 minutes and temperature 587 degrees C, with an optimum iodine number of 862 mg/g and yield of 47.48%. The textural characteristics of modified semi-coke were analyzed using scanning electron microscopy (SEM) and nitrogen adsorption isotherms. The BET surface area of modified semi-coke was estimated to be 989.60 m2/g, with a pore volume of 0.74 cm3/g and a pore diameter of 3.009 nm, with micro-pore volume contributing 62.44%. The Methylene Blue monolayer adsorption capacity was found to be mg/g at K. The adsorption capacity of the modified semi-coke highlights its suitability for liquid-phase adsorption applications, with potential usage in waste water treatment.
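    A central composite design of the kind used above is built from factorial corner runs, axial (star) runs, and replicated centre runs. The sketch below generates a generic face-centred CCD in coded units for the three factors (acid concentration, heating time, temperature); it is the textbook construction, not the actual run list from the study.

```python
from itertools import product

# Face-centred central composite design (CCD) in coded units (-1, 0, +1)
# for three factors: acid concentration, heating time, temperature.
n_factors = 3
factorial = list(product((-1, 1), repeat=n_factors))     # 2^3 corner runs
axial = [tuple(a if i == j else 0 for i in range(n_factors))
         for j in range(n_factors) for a in (-1, 1)]     # 2k star runs
center = [(0,) * n_factors] * 3                          # replicated centre runs
design = factorial + axial + center
print(len(design))  # 8 + 6 + 3 = 17 runs
```

    Each coded run is then mapped to physical levels, the iodine number is measured at each run, and a quadratic response surface is fitted to locate the optimum reported in the abstract.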

  20. Predicting the Accuracy of Unguided Artillery Projectiles

    DTIC Science & Technology

    2016-09-01

    [Indexed excerpts] ...metrics using error models... Chapter I provides an introduction to artillery and briefly describes the types of artillery fire... Table-of-contents fragments: Methodology; Outline of Thesis; Modified Point Mass Trajectory Model (MPMTM).

  1. Methodologies for estimating advisory curve speeds on Oregon highways.

    DOT National Transportation Integrated Search

    2008-01-01

    This report reviews an Oregon research effort to evaluate the identification and marking of advisory speeds on Oregon highways. In particular, this research effort focused on the implications of modified advisory speed thresholds and identificati...

  2. The effect of music on cognitive performance: insight from neurobiological and animal studies.

    PubMed

    Rickard, Nikki S; Toukhsati, Samia R; Field, Simone E

    2005-12-01

    The past 50 years have seen numerous claims that music exposure enhances human cognitive performance. Critical evaluation of studies across a variety of contexts, however, reveals important methodological weaknesses. The current article argues that an interdisciplinary approach is required to advance this research. A case is made for the use of appropriate animal models to avoid many confounds associated with human music research. Although such research has validity limitations for humans, reductionist methodology enables a more controlled exploration of music's elementary effects. This article also explores candidate mechanisms for this putative effect. A review of neurobiological evidence from human and comparative animal studies confirms that musical stimuli modify autonomic and neurochemical arousal indices, and may also modify synaptic plasticity. It is proposed that understanding how music affects animals provides a valuable conjunct to human research and may be vital in uncovering how music might be used to enhance cognitive performance.

  3. Determination of anionic surface active agents using silica coated magnetite nanoparticles modified with cationic surfactant aggregates.

    PubMed

    Pena-Pereira, Francisco; Duarte, Regina M B O; Trindade, Tito; Duarte, Armando C

    2013-07-19

    The development of a novel methodology for extraction and preconcentration of the most commonly used anionic surface active agents (SAAs), linear alkylbenzene sulfonates (LAS), is presented herein. The present method, based on the use of silica-magnetite nanoparticles modified with cationic surfactant aggregates, was developed for determination of C10-C13 LAS homologues. The proposed methodology allowed quantitative recoveries of C10-C13 LAS homologues by using a reduced amount of magnetic nanoparticles. Limits of detection were in the range 0.8-1.9 μg/L for C10-C13 LAS homologues, while the repeatability, expressed as relative standard deviation (RSD), ranged from 2.0 to 3.9% (N=6). Finally, the proposed method was successfully applied to the analysis of a variety of natural water samples. Copyright © 2013 Elsevier B.V. All rights reserved.
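    The figures of merit quoted above can be computed as in this short sketch. The replicate values are invented, and the 3σ/slope detection-limit convention is a common analytical-chemistry assumption; the paper does not state which convention it used.

```python
from statistics import mean, stdev

def rsd_percent(values):
    """Relative standard deviation of replicate measurements, in %."""
    return 100 * stdev(values) / mean(values)

def lod(blank_sd, slope):
    """Limit of detection from the common 3*sigma/slope convention
    (an assumption here, not necessarily the paper's definition)."""
    return 3 * blank_sd / slope

replicates = [101.2, 99.8, 100.5, 98.9, 101.7, 100.1]  # hypothetical N=6 recoveries
print(round(rsd_percent(replicates), 2))
```

    An RSD in the 2.0-3.9% range over N=6 replicates, as reported, indicates good repeatability for trace-level (μg/L) determinations.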

  4. An efficient and cost-effective method for preparing transmission electron microscopy samples from powders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wen, Haiming; Lin, Yaojun; Seidman, David N.

    The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder–epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well-established and robust grinding–dimpling–ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders only requires two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully for preparing TEM samples with large thin areas and high quality for many different mechanically milled metallic powders.

  5. Extending injury prevention methodology to chemical terrorism preparedness: the Haddon Matrix and sarin.

    PubMed

    Varney, Shawn; Hirshon, Jon Mark; Dischinger, Patricia; Mackenzie, Colin

    2006-01-01

    The Haddon Matrix offers a classic epidemiological model for studying injury prevention. This methodology places the public health concepts of agent, host, and environment within the three sequential phases of an injury-producing incident: pre-event, event, and post-event. This study uses this methodology to illustrate how it could be applied in systematically preparing for a mass casualty disaster such as an unconventional sarin attack in a major urban setting. Nineteen city, state, federal, and military agencies responded to the Haddon Matrix chemical terrorism preparedness exercise and offered feedback in the data review session. Four injury prevention strategies (education, engineering, enforcement, and economics) were applied to the individual factors and event phases of the Haddon Matrix. The majority of factors identified in all phases were modifiable, primarily through educational interventions focused on individual healthcare providers and first responders. The Haddon Matrix provides a viable means of studying an unconventional problem, allowing for the identification of modifiable factors to decrease the type and severity of injuries following a mass casualty disaster such as a sarin release. This strategy could be successfully incorporated into disaster planning for other weapons attacks that could potentially cause mass casualties.
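    The Haddon Matrix itself is just a phases-by-factors grid, which is easy to represent in code. The sketch below shows the structure only; the example entries are illustrative placeholders, not the factors identified by the nineteen participating agencies.

```python
# Haddon Matrix as a nested mapping: phase -> factor -> list of entries.
# The appended items are hypothetical examples for a sarin scenario.
PHASES = ("pre-event", "event", "post-event")
FACTORS = ("host", "agent", "environment")

matrix = {phase: {factor: [] for factor in FACTORS} for phase in PHASES}
matrix["pre-event"]["host"].append("responder training in nerve-agent recognition")
matrix["event"]["agent"].append("rapid antidote (atropine/pralidoxime) availability")
matrix["post-event"]["environment"].append("decontamination corridor layout")

# Flatten to review every identified factor with its cell coordinates.
modifiable = [(p, f, item) for p in PHASES for f in FACTORS
              for item in matrix[p][f]]
```

    Filling every cell, then tagging each entry with one of the four prevention strategies (education, engineering, enforcement, economics), reproduces the analysis workflow the abstract describes.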

  6. An efficient and cost-effective method for preparing transmission electron microscopy samples from powders

    DOE PAGES

    Wen, Haiming; Lin, Yaojun; Seidman, David N.; ...

    2015-09-09

    The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder–epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well-established and robust grinding–dimpling–ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders only requires two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully for preparing TEM samples with large thin areas and high quality for many different mechanically milled metallic powders.

  7. Safety of disease-modifying drugs for multiple sclerosis in pregnancy: current challenges and future considerations for effective pharmacovigilance.

    PubMed

    Lu, Ellen; Wang, Bing Wei; Guimond, Colleen; Synnes, Anne; Sadovnick, A Dessa; Dahlgren, Leanne; Traboulsee, Anthony; Tremlett, Helen

    2013-03-01

    When contemplating a pregnancy, women treated for multiple sclerosis (MS) with a disease-modifying drug must decide whether to discontinue their medication before conception or risk exposing their unborn child to potential drug toxicity. Few studies exist as reference for patients and physicians, and of those available, the majority are less than ideal due to real-world constraints, ethical issues and methodological shortcomings. The authors provide a brief summary of existing animal and human data with current recommendations regarding the safety of IFN-β, glatiramer acetate, natalizumab, mitoxantrone, fingolimod and teriflunomide during pregnancy and lactation in women with MS. We also assess the quality, strengths and limitations of the existing studies, including challenges with study design. The investigation of outcomes such as spontaneous abortion and congenital anomalies is highlighted, with potential methodological improvements for future studies on drug safety in pregnancy suggested. The authors explore the pharmacokinetics and pharmacodynamics of the MS disease-modifying drugs for their possible mechanistic role in fetal harm and discuss the potential role of clinical trials. Future pharmacovigilance studies should continue to pursue multicenter collaboration with an emphasis on appropriate study design.

  8. Conventional Weapons Effects on Reinforced Soil Walls.

    DTIC Science & Technology

    1995-03-01

    [Indexed excerpts] ...parametric study of the influence of specific design variables on wall panel response. D. METHODOLOGY: A single-degree-of-freedom model was modified to... ...design methodologies for reinforced soil subjected to blast loads; the response of these systems to such loading must be established... ...stored for many years and then on short notice be shipped to a location for use. The design for this shelter would be done as needed, although some non-site...

  9. Testing for genetically modified organisms (GMOs): Past, present and future perspectives.

    PubMed

    Holst-Jensen, Arne

    2009-01-01

    This paper presents an overview of GMO testing methodologies and how these have evolved and may evolve in the next decade. Challenges and limitations for the application of the test methods as well as to the interpretation of results produced with the methods are highlighted and discussed, bearing in mind the various interests and competences of the involved stakeholders. To better understand the suitability and limitations of detection methodologies the evolution of transformation processes for creation of GMOs is briefly reviewed.

  10. Wild worm embryogenesis harbors ubiquitous polygenic modifier variation

    PubMed Central

    Paaby, Annalise B; White, Amelia G; Riccardi, David D; Gunsalus, Kristin C; Piano, Fabio; Rockman, Matthew V

    2015-01-01

    Embryogenesis is an essential and stereotypic process that nevertheless evolves among species. Its essentiality may favor the accumulation of cryptic genetic variation (CGV) that has no effect in the wild-type but that enhances or suppresses the effects of rare disruptions to gene function. Here, we adapted a classical modifier screen to interrogate the alleles segregating in natural populations of Caenorhabditis elegans: we induced gene knockdowns and used quantitative genetic methodology to examine how segregating variants modify the penetrance of embryonic lethality. Each perturbation revealed CGV, indicating that wild-type genomes harbor myriad genetic modifiers that may have little effect individually but which in aggregate can dramatically influence penetrance. Phenotypes were mediated by many modifiers, indicating high polygenicity, but the alleles tend to act very specifically, indicating low pleiotropy. Our findings demonstrate the extent of conditional functionality in complex trait architecture. DOI: http://dx.doi.org/10.7554/eLife.09178.001 PMID:26297805

  11. Durable warmth retention finishing of down using titanium dioxide optimized by RSM

    NASA Astrophysics Data System (ADS)

    Li, Huihao; Qi, Lu; Li, Jun

    2017-03-01

    A new product, referred to herein as modified down, was prepared by grafting down fiber with titanium dioxide. Grafting modification brings new functionalities to down. The effects of titanium dioxide concentration, KH550 concentration, and baking temperature on warmth retention were studied using response surface methodology (RSM) to obtain the optimal experimental formula and models. The optimal preparation conditions for modified down were 19.35% titanium dioxide, 15.81% KH550, 10 min baking time, and 115 °C temperature. The warmth retention of the modified down was 79.98%. The structure and properties of the modified down were characterized and analyzed using a flat plate warmth retaining tester, FT-IR, and TG. The CLO value increased by 27.28% and the thermal resistance increased by 27.34%. The ultimate residual quantity of the modified down fibers was 30.05%.

  12. Predicting the Reliability of Ceramics Under Transient Loads and Temperatures With CARES/Life

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Jadaan, Osama M.; Palfi, Tamas; Baker, Eric H.

    2003-01-01

    A methodology is shown for predicting the time-dependent reliability of ceramic components against catastrophic rupture when subjected to transient thermomechanical loads (including cyclic loads). The methodology takes into account the changes in material response that can occur with temperature or time (i.e., changing fatigue and Weibull parameters with temperature or time). This capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. The code has been modified to have the ability to interface with commercially available finite element analysis (FEA) codes executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
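    The core reliability calculation described above can be shown in miniature: evaluate a Weibull failure probability for each step of a transient load history, with Weibull parameters allowed to change per step, and combine steps by multiplying survival probabilities. CARES/Life itself integrates over component volume from FEA results and includes slow-crack-growth (fatigue) parameters; the two-parameter elemental form and all numbers below are illustrative only.

```python
import math

def weibull_pf(stress_mpa, m, sigma0_mpa):
    """Failure probability under a single stress state for a
    two-parameter Weibull strength distribution (volume integration
    and fatigue terms, present in CARES/Life, are omitted)."""
    return 1.0 - math.exp(-((stress_mpa / sigma0_mpa) ** m))

# Transient history: (stress MPa, Weibull modulus m, scale sigma0 MPa),
# with parameters changing between steps as temperature changes.
history = [(300.0, 20.0, 500.0), (350.0, 18.0, 480.0), (320.0, 19.0, 490.0)]

reliability = 1.0
for stress, m, s0 in history:
    reliability *= 1.0 - weibull_pf(stress, m, s0)  # survive every step
pf_total = 1.0 - reliability
```

    Allowing m and sigma0 to vary per step is the "changing fatigue and Weibull parameters with temperature or time" capability the abstract highlights.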

  13. Validation of On-Orbit Methodology for the Assessment of Cardiac Function and Changes in the Circulating Volume Using Ultrasound and "Braslet-M" Occlusion Cuffs

    NASA Technical Reports Server (NTRS)

    Bogomolov, V. V.; Duncan, J. M.; Alferova, I. V.; Dulchavsky, S. A.; Ebert, D.; Hamilton, D. R.; Matveev, V. P.; Sargsyan, A. E.

    2008-01-01

    Recent advances in remotely guided imaging techniques on ISS allow the acquisition of high quality ultrasound data using crewmember operators with no medical background and minimal training. However, ongoing efforts are required to develop and validate methodology for complex imaging protocols to ensure their repeatability, efficiency, and suitability for use aboard the ISS. This Station Developmental Test Objective (SDTO) tests a cardiovascular evaluation methodology that takes advantage of the ISS Ultrasound capability, the Braslet-M device, and modified respiratory maneuvers (Valsalva and Mueller), to broaden the spectrum of anatomical and functional information on human cardiovascular system during long-duration space missions. The proposed methodology optimizes and combines new and previously demonstrated methods, and is expected to benefit medically indicated assessments, operational research protocols, and data collections for science. Braslet-M is a current Russian operational countermeasure that compresses the upper thigh to impede the venous return from lower extremities. The goal of the SDTO is to establish and validate a repeatable ultrasound-based methodology for the assessment of a number of cardiovascular criteria in microgravity. Braslet-M device is used as a means to acutely alter volume distribution while focused ultrasound measurements are performed. Modified respiratory maneuvers are done upon volume manipulations to record commensurate changes in anatomical and functional parameters. The overall cardiovascular effects of the Braslet-M device are not completely understood, and although not a primary objective of this SDTO, this effort will provide pilot data regarding the suitability of Braslet-M for its intended purpose, effects, and the indications for its use.

  14. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
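    The conjoint use of an analytical failure model with parameter uncertainties can be sketched as a small Monte Carlo loop: draw uncertain model parameters and loads, run the deterministic model, and count failures. The Basquin-type fatigue-life model and every distribution below are placeholders chosen for illustration; they are not the PFA models or data.

```python
import math
import random

random.seed(0)

# Illustrative deterministic failure model: cycles to fatigue failure
# as a power law in stress (a stand-in for the PFA's engineering models).
def cycles_to_failure(stress_mpa, A, b):
    return A * stress_mpa ** (-b)

service_cycles = 1e5   # required life
N = 20000              # Monte Carlo trials
failures = 0
for _ in range(N):
    A = random.lognormvariate(math.log(1e12), 0.3)  # model-parameter uncertainty
    b = random.gauss(3.0, 0.1)                      # exponent uncertainty
    stress = random.gauss(200.0, 10.0)              # load uncertainty (MPa)
    if cycles_to_failure(stress, A, b) < service_cycles:
        failures += 1

pf = failures / N  # estimated failure probability for this failure mode
```

    In the actual PFA methodology this prior failure-probability distribution would then be updated with test and flight experience; that Bayesian-style step is omitted here.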

  15. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  16. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  17. Reef Fish Survey Techniques: Assessing the Potential for Standardizing Methodologies.

    PubMed

    Caldwell, Zachary R; Zgliczynski, Brian J; Williams, Gareth J; Sandin, Stuart A

    2016-01-01

    Dramatic changes in populations of fishes living on coral reefs have been documented globally and, in response, the research community has initiated efforts to assess and monitor reef fish assemblages. A variety of visual census techniques are employed; however, results are often incomparable due to differential methodological performance. Although comparability of data may promote improved assessment of fish populations, and thus management of often critically important nearshore fisheries, to date no standardized and agreed-upon survey method has emerged. This study describes the use of methods across the research community and identifies potential drivers of method selection. An online survey was distributed to researchers from academic, governmental, and non-governmental organizations internationally. Although many methods were identified, 89% of survey-based projects employed one of three methods: belt transect, stationary point count, and some variation of the timed swim method. The selection of survey method was independent of the research design (i.e., assessment goal) and region of study, but was related to the researcher's home institution. While some researchers expressed willingness to modify their current survey protocols to more standardized protocols (76%), their willingness decreased when methodologies were tied to long-term datasets spanning five or more years. Willingness to modify current methodologies was also less common among academic researchers than resource managers. By understanding both the current application of methods and the reported motivations for method selection, we hope to focus discussions towards increasing the comparability of quantitative reef fish survey data.

  18. Expect the Best.

    ERIC Educational Resources Information Center

    Omotani, Barbara J.; Omotani, Les

    1996-01-01

    School leaders can create an environment that supports highly effective beliefs, attitudes, and behaviors in teachers. Effective teachers believe every student has abundant, innate potential. Instead of watering down standards and expectations, they modify three key variables (time, grouping, and methodology) to help specific students achieve…

  19. A Methodology for Modeling Nuclear Power Plant Passive Component Aging in Probabilistic Risk Assessment under the Impact of Operating Conditions, Surveillance and Maintenance Activities

    NASA Astrophysics Data System (ADS)

    Guler Yigitoglu, Askin

    In the context of long operation of nuclear power plants (NPPs) (i.e., 60-80 years, and beyond), investigation of the aging of passive systems, structures and components (SSCs) is important to assess safety margins and to decide on reactor life extension as indicated within the U.S. Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) Program. In the traditional probabilistic risk assessment (PRA) methodology, evaluating the potential significance of aging of passive SSCs on plant risk is challenging. Although passive SSC failure rates can be added as initiating event frequencies or basic event failure rates in the traditional event-tree/fault-tree methodology, these failure rates are generally based on generic plant failure data which means that the true state of a specific plant is not reflected in a realistic manner on aging effects. Dynamic PRA methodologies have gained attention recently due to their capability to account for the plant state and thus address the difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models (and also in the modeling of digital instrumentation and control systems). Physics-based models can capture the impact of complex aging processes (e.g., fatigue, stress corrosion cracking, flow-accelerated corrosion, etc.) on SSCs and can be utilized to estimate passive SSC failure rates using realistic NPP data from reactor simulation, as well as considering effects of surveillance and maintenance activities. The objectives of this dissertation are twofold: The development of a methodology for the incorporation of aging modeling of passive SSC into a reactor simulation environment to provide a framework for evaluation of their risk contribution in both the dynamic and traditional PRA; and the demonstration of the methodology through its application to pressurizer surge line pipe weld and steam generator tubes in commercial nuclear power plants. 
In the proposed methodology, a multi-state physics based model is selected to represent the aging process. The model is modified via a sojourn time approach to reflect the operational and maintenance history dependence of the transition rates. Thermal-hydraulic parameters of the model are calculated via the reactor simulation environment, and uncertainties associated with both parameters and the models are assessed via a two-loop Monte Carlo approach (Latin hypercube sampling) to propagate input probability distributions through the physical model. The effort documented in this thesis towards this overall objective consists of: (i) defining a process for selecting critical passive components and related aging mechanisms; (ii) aging model selection; (iii) calculating the probability that aging would cause the component to fail; (iv) uncertainty/sensitivity analyses; (v) procedure development for modifying an existing PRA to accommodate consideration of passive component failures; and (vi) including the calculated failure probability in the modified PRA. The proposed methodology is applied to pressurizer surge line pipe weld aging and steam generator tube degradation in pressurized water reactors.
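The Latin hypercube sampling step used in the two-loop Monte Carlo approach can be sketched as a stratified draw on the unit cube: each input dimension's range is split into as many strata as samples, and each stratum is hit exactly once. Mapping each coordinate through the inverse CDF of the corresponding input distribution is omitted; this is the generic construction, not the dissertation's implementation.

```python
import random

random.seed(1)

def latin_hypercube(n_samples, n_dims):
    """Latin hypercube sample on the unit cube: per dimension, one
    uniform draw from each of n_samples equal strata, then shuffled
    so strata are paired randomly across dimensions."""
    columns = []
    for _ in range(n_dims):
        strata = [(i + random.random()) / n_samples for i in range(n_samples)]
        random.shuffle(strata)
        columns.append(strata)
    return list(zip(*columns))  # n_samples points, each of dimension n_dims

pts = latin_hypercube(10, 3)
```

    Compared with plain Monte Carlo, this stratification covers each marginal input distribution evenly, which is why it is favored for propagating uncertainty through expensive physics models.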

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jornet, N; Carrasco de Fez, P; Jordi, O

    Purpose: To evaluate the accuracy of total scatter factor (Sc,p) determination for small fields using a commercial plastic scintillator detector (PSD). The manufacturer's spectral discrimination method to subtract Cerenkov light from the signal is discussed. Methods: Sc,p for field sizes ranging from 0.5 to 10 cm were measured using the Exradin PSD (Standard Imaging) connected to a two-channel electrometer measuring the signals in two different spectral regions to subtract the Cerenkov signal from the PSD signal. A PinPoint ionisation chamber 31006 (PTW) and a non-shielded semiconductor detector EFD (Scanditronix) were used for comparison. Measurements were performed for a 6 MV X-ray beam. The Sc,p were measured at 10 cm depth in water for SSD = 100 cm and normalized to a 10×10 cm² field size at the isocenter. All detectors were placed with their symmetry axis parallel to the beam axis. We followed the manufacturer's recommended calibration methodology to subtract the Cerenkov contribution to the signal, as well as a modified method using smaller field sizes. The Sc,p calculated by using both calibration methodologies were compared. Results: Sc,p measured with the semiconductor and PinPoint detectors agreed, within 1.5%, for field sizes between 10×10 and 1×1 cm². Sc,p measured with the PSD using the manufacturer's calibration methodology were systematically 4% higher than those measured with the semiconductor detector for field sizes smaller than 5×5 cm². By using a modified calibration methodology for small fields and keeping the manufacturer's calibration methodology for fields larger than 5×5 cm², Sc,p matched the semiconductor results within 2% for field sizes larger than 1.5 cm. Conclusion: The calibration methodology proposed by the manufacturer is not appropriate for dose measurements in small fields. The calibration parameters are not independent of the incident radiation spectrum for this PSD.
This work was partially financed by grant 2012 of Barcelona board of the AECC.« less
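
The two-channel spectral ("chromatic") subtraction described above can be sketched as follows. This is a minimal illustration of the idea, not the Exradin calibration procedure; the gain and Cerenkov light ratio (CLR) values are invented.

```python
# Sketch of two-channel Cerenkov removal for a plastic scintillator.
# Model assumed here: dose = gain * (ch1 - CLR * ch2), linear in the
# unknowns (gain, gain*CLR), so two calibration irradiations of known
# dose with different Cerenkov fractions determine both constants.

def calibrate(ch1_a, ch2_a, dose_a, ch1_b, ch2_b, dose_b):
    """Solve gain and CLR from two calibration measurements of known dose."""
    det = ch2_a * ch1_b - ch1_a * ch2_b
    gain = (ch2_a * dose_b - dose_a * ch2_b) / det
    h = (ch1_a * dose_b - dose_a * ch1_b) / det  # h = gain * CLR
    return gain, h / gain

def corrected_dose(ch1, ch2, gain, clr):
    """Dose with the Cerenkov contribution subtracted from channel 1."""
    return gain * (ch1 - clr * ch2)

# Synthetic calibration readings generated with true gain = 2.0, CLR = 0.5
gain, clr = calibrate(55.0, 10.0, 100.0, 70.0, 40.0, 100.0)
```

The recovered constants can then correct any subsequent two-channel reading; the key requirement is that the two calibration geometries differ in their Cerenkov fraction, otherwise the linear system is singular.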

  1. Deterministic Multiaxial Creep and Creep Rupture Enhancements for CARES/Creep Integrated Design Code

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.

    1998-01-01

    High temperature and long duration applications of monolithic ceramics can place their failure mode in the creep rupture regime. A previous model advanced by the authors described a methodology by which the creep rupture life of a loaded component can be predicted. That model was based on the life fraction damage accumulation rule in association with the modified Monkman-Grant creep rupture criterion. However, that model did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage state parameter is basically an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep rupture formulations of the CDM approach are presented in this paper. Parameter estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant stress states and anisothermal variable stress conditions. This creep life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), which is a postprocessor program to commercially available finite element analysis (FEA) packages. 
Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, are used to demonstrate the viability of this methodology and the CARES/Creep program.

  2. Deterministic and Probabilistic Creep and Creep Rupture Enhancement to CARES/Creep: Multiaxial Creep Life Prediction of Ceramic Structures Using Continuum Damage Mechanics and the Finite Element Method

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.; Powers, Lynn M.; Gyekenyesi, John P.

    1998-01-01

    High temperature and long duration applications of monolithic ceramics can place their failure mode in the creep rupture regime. A previous model advanced by the authors described a methodology by which the creep rupture life of a loaded component can be predicted. That model was based on the life fraction damage accumulation rule in association with the modified Monkman-Grant creep rupture criterion. However, that model did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage state parameter is basically an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep rupture formulations of the CDM approach are presented in this paper. Parameter estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant stress states and anisothermal variable stress conditions. This creep life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), which is a postprocessor program to commercially available finite element analysis (FEA) packages. 
Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, are used to demonstrate the viability of this methodology and the CARES/Creep program.
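
The Kachanov-Rabotnov idea of a scalar damage state evolving with stress can be illustrated with the simplest constant-stress form of the damage law, dω/dt = B·σ^χ/(1−ω)^φ, for which rupture time has a closed form. The material constants below are made-up illustration values, not fitted ceramic creep parameters.

```python
# Minimal Kachanov-Rabotnov damage evolution under constant uniaxial
# stress sigma. Failure is taken as the damage variable w approaching 1.

def rupture_time_analytic(sigma, B, chi, phi):
    """Closed-form rupture time for dw/dt = B*sigma**chi / (1-w)**phi."""
    return 1.0 / ((phi + 1.0) * B * sigma ** chi)

def rupture_time_numeric(sigma, B, chi, phi, dt=1e-3, w_fail=0.95):
    """Forward-Euler integration of damage w from 0 toward failure."""
    w, t = 0.0, 0.0
    rate0 = B * sigma ** chi
    while w < w_fail:
        w += dt * rate0 / (1.0 - w) ** phi  # damage grows faster as w -> 1
        t += dt
    return t

sigma, B, chi, phi = 10.0, 1e-6, 4.0, 2.0
t_exact = rupture_time_analytic(sigma, B, chi, phi)  # 1/(3*0.01) ~= 33.33
t_euler = rupture_time_numeric(sigma, B, chi, phi)   # close to closed form
```

Stopping at ω = 0.95 rather than exactly 1 keeps the explicit Euler step stable; the remaining life beyond that point is a negligible fraction of the total because the damage rate diverges near rupture.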

  3. Methodological adaptations for investigating the perceptions of language-impaired adolescents regarding the relative importance of selected communication skills.

    PubMed

    Reed, Vicki A; Brammall, Helen

    2006-01-01

    This article describes the systematic and detailed processes undertaken to modify a research methodology for use with language-impaired adolescents. The original methodology had been used previously with normally achieving adolescents and speech pathologists to obtain their opinions about the relative importance of selected communication skills for adolescents' positive peer relationships. Modifications attempted to address language-impaired adolescents' characteristic metalinguistic, literacy, cognitive, and information processing weaknesses. Revising the original wording of the communication skills, reducing the reading level of the skills from grade 10 to 4.6, using a Q-sort approach to ranking the importance of the skills, and revising the instructions and administration procedures led to what pilot testing results indicated was a valid methodology for use with language-impaired adolescents. Results of a preliminary study using the revised methodology suggested that language-impaired adolescents may perceive the relative importance of some communication skills differently from their normally achieving peers.

  4. [Genetically modified food and allergies - an update].

    PubMed

    Niemann, Birgit; Pöting, Annette; Braeuning, Albert; Lampen, Alfonso

    2016-07-01

    Approval by the European Commission is mandatory for placing genetically modified plants as food or feed on the market in member states of the European Union (EU). The approval is preceded by a safety assessment based on the guidance of the European Food Safety Authority EFSA. The assessment of allergenicity of genetically modified plants and their newly expressed proteins is an integral part of this assessment process. Guidance documents for the assessment of allergenicity are currently under revision. For this purpose, an expert workshop was conducted in Brussels on June 17, 2015. There, methodological improvements for the assessment of coeliac disease-causing properties of proteins, as well as the use of complex models for in vitro digestion of proteins were discussed. Using such techniques a refinement of the current, proven system of allergenicity assessment of genetically modified plants can be achieved.

  5. Eigensolutions, Shannon entropy and information energy for modified Tietz-Hua potential

    NASA Astrophysics Data System (ADS)

    Onate, C. A.; Onyeaju, M. C.; Ituen, E. E.; Ikot, A. N.; Ebomwonyi, O.; Okoro, J. O.; Dopamu, K. O.

    2018-04-01

    The Tietz-Hua potential is modified by the inclusion of the term D_e((C_h - 1)/(1 - C_h e^(-b_h(r - r_e)))) b e^(-b_h(r - r_e)), since a potential of this type gives a very good description of the vibrational energy levels of diatomic molecules. The energy eigenvalues and the corresponding eigenfunctions are obtained explicitly using the parametric Nikiforov-Uvarov methodology. By setting the potential parameter b = 0, the modified Tietz-Hua potential reduces to the Tietz-Hua potential. To show further applications of this work, we have computed the Shannon entropy and information energy under the modified Tietz-Hua potential. This extends the work of Falaye et al., who computed only the Fisher information under the Tietz-Hua potential.
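
The two information measures named above are simple functionals of the position-space density ρ = |ψ|²: the Shannon entropy S = −∫ ρ ln ρ dr and the information energy E = ∫ ρ² dr. The sketch below evaluates both numerically for a Gaussian ground-state density (a stand-in with known exact values, not the modified Tietz-Hua eigenfunctions, which are not derived here).

```python
# Numerical Shannon entropy and information energy of a probability
# density rho(x), checked against the exact Gaussian results.
import math

def trapezoid(ys, dx):
    return dx * (sum(ys) - 0.5 * (ys[0] + ys[-1]))

def shannon_entropy(rho, xs, dx):
    vals = [-r * math.log(r) if r > 0 else 0.0 for r in (rho(x) for x in xs)]
    return trapezoid(vals, dx)

def information_energy(rho, xs, dx):
    return trapezoid([rho(x) ** 2 for x in xs], dx)

sigma = 1.0
rho = lambda x: math.exp(-x * x / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
n, lo, hi = 4001, -10.0, 10.0
dx = (hi - lo) / (n - 1)
xs = [lo + i * dx for i in range(n)]
S = shannon_entropy(rho, xs, dx)     # exact: 0.5*ln(2*pi*e*sigma^2)
E = information_energy(rho, xs, dx)  # exact: 1/(2*sigma*sqrt(pi))
```

For any other density (e.g. numerically tabulated Tietz-Hua eigenfunctions) the same quadrature applies unchanged; only `rho` and the integration window need to be swapped.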

  6. New scoring methodology improves the sensitivity of the Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) in clinical trials.

    PubMed

    Verma, Nishant; Beretvas, S Natasha; Pascual, Belen; Masdeu, Joseph C; Markey, Mia K

    2015-11-12

    As currently used, the Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) has low sensitivity for measuring Alzheimer's disease progression in clinical trials. A major reason behind the low sensitivity is its sub-optimal scoring methodology, which can be improved to obtain better sensitivity. Using item response theory, we developed a new scoring methodology (ADAS-CogIRT) for the ADAS-Cog, which addresses several major limitations of the current scoring methodology. The sensitivity of the ADAS-CogIRT methodology was evaluated using clinical trial simulations as well as a negative clinical trial that had shown evidence of a treatment effect. The ADAS-Cog was found to measure impairment in three cognitive domains: memory, language, and praxis. The ADAS-CogIRT methodology required significantly fewer patients and shorter trial durations than the current scoring methodology when both were evaluated in simulated clinical trials. When validated on data from a real clinical trial, the ADAS-CogIRT methodology had higher sensitivity than the current scoring methodology in detecting the treatment effect. The proposed scoring methodology significantly improves the sensitivity of the ADAS-Cog in measuring progression of cognitive impairment in clinical trials focused on the mild-to-moderate Alzheimer's disease stage. This boosts the efficiency of clinical trials, which would require fewer patients and shorter durations when investigating disease-modifying treatments.
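
The core IRT idea behind such scoring can be shown with the simplest case, a two-parameter logistic (2PL) model. The real ADAS-CogIRT is a multidimensional graded-response model; the item parameters below are invented for illustration.

```python
# Minimal 2PL item response theory sketch: item response probability and
# a grid-search maximum-likelihood estimate of latent ability theta.
import math

def p_correct(theta, a, b):
    """P(correct) for ability theta, discrimination a, difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def estimate_theta(responses, items, grid=None):
    """Grid-search ML ability estimate from binary responses."""
    if grid is None:
        grid = [i / 10.0 for i in range(-30, 31)]  # -3.0 .. 3.0
    def loglik(theta):
        ll = 0.0
        for r, (a, b) in zip(responses, items):
            p = p_correct(theta, a, b)
            ll += math.log(p) if r == 1 else math.log(1.0 - p)
        return ll
    return max(grid, key=loglik)

items = [(1.2, -1.0), (0.8, 0.0), (1.5, 1.0)]  # (a, b) per item, illustrative
theta_hi = estimate_theta([1, 1, 1], items)    # all correct -> top of grid
theta_lo = estimate_theta([0, 0, 0], items)    # all wrong -> bottom of grid
```

Unlike a raw sum score, the IRT estimate weights items by their discrimination and difficulty, which is the mechanism behind the sensitivity gain the abstract reports.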

  7. MODIFYING THE AVIAN REPRODUCTION TEST GUIDELINES FOR DETERMINING DOSE-RESPONSE RELATIONSHIPS

    EPA Science Inventory

    As a subgroup of the OECD Expert Group on Assessment of Endocrine Disrupting Effects in Birds, we reviewed unresolved methodological issues important for the development of a two-generation toxicity test, discussed advantages and disadvantages of alternative approaches, and propo...

  8. Intra-Articular Cellular Therapy for Osteoarthritis and Focal Cartilage Defects of the Knee: A Systematic Review of the Literature and Study Quality Analysis.

    PubMed

    Chahla, Jorge; Piuzzi, Nicolas S; Mitchell, Justin J; Dean, Chase S; Pascual-Garrido, Cecilia; LaPrade, Robert F; Muschler, George F

    2016-09-21

    Intra-articular cellular therapy injections constitute an appealing strategy that may modify the intra-articular milieu or regenerate cartilage in the settings of osteoarthritis and focal cartilage defects. However, little consensus exists regarding the indications for cellular therapies, optimal cell sources, methods of preparation and delivery, or means by which outcomes should be reported. We present a systematic review of the current literature regarding the safety and efficacy of cellular therapy delivered by intra-articular injection in the knee that provided a Level of Evidence of III or higher. A total of 420 papers were screened. Methodological quality was assessed using a modified Coleman methodology score. Only 6 studies (4 Level II and 2 Level III) met the criteria to be included in this review; 3 studies were on treatment of osteoarthritis and 3 were on treatment of focal cartilage defects. These included 4 randomized controlled studies without blinding, 1 prospective cohort study, and 1 retrospective therapeutic case-control study. The studies varied widely with respect to cell sources, cell characterization, adjuvant therapies, and assessment of outcomes. Outcome was reported in a total of 300 knees (124 in the osteoarthritis studies and 176 in the cartilage defect studies). Mean follow-up was 21.0 months (range, 12 to 36 months). All studies reported improved outcomes with intra-articular cellular therapy and no major adverse events. The mean modified Coleman methodology score was 59.1 ± 16 (range, 32 to 82). The studies of intra-articular cellular therapy injections for osteoarthritis and focal cartilage defects in the human knee suggested positive results with respect to clinical improvement and safety. However, the improvement was modest and a placebo effect cannot be disregarded. The overall quality of the literature was poor, and the methodological quality was fair, even among Level-II and III studies. 
Effective clinical assessment and optimization of injection therapies will demand greater attention to study methodology, including blinding; standardized quantitative methods for cell harvesting, processing, characterization, and delivery; and standardized reporting of clinical and structural outcomes. Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence. Copyright © 2016 by The Journal of Bone and Joint Surgery, Incorporated.

  9. THERP and HEART integrated methodology for human error assessment

    NASA Astrophysics Data System (ADS)

    Castiglia, Francesco; Giardina, Mariarosa; Tomarchio, Elio

    2015-11-01

    A THERP and HEART integrated methodology is proposed to investigate accident scenarios that involve operator errors during high-dose-rate (HDR) treatments. The new approach has been modified on the basis of the fuzzy set concept with the aim of prioritizing an exhaustive list of erroneous tasks that can lead to patient radiological overexposures. The results allow for the identification of human errors, which is necessary to achieve a better understanding of health hazards in the radiotherapy treatment process so that it can be properly monitored and appropriately managed.
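
The standard HEART quantification that underlies such an analysis multiplies a generic task unreliability by a factor for each error-producing condition (EPC): HEP = nominal × Π[(EPC_i − 1) × APOA_i + 1]. The fuzzy-set prioritization of the paper is not reproduced here, and the numbers are illustrative only.

```python
# Sketch of the standard HEART human error probability (HEP) calculation.

def heart_hep(nominal_hep, epcs):
    """nominal_hep: generic task unreliability.
    epcs: list of (EPC multiplier, assessed proportion of affect) pairs."""
    hep = nominal_hep
    for multiplier, apoa in epcs:
        hep *= (multiplier - 1.0) * apoa + 1.0
    return min(hep, 1.0)  # a probability cannot exceed 1

# e.g. a routine task (nominal 0.003) degraded by two EPCs
hep = heart_hep(0.003, [(11.0, 0.5), (3.0, 0.2)])
```

Each EPC factor interpolates between 1 (condition absent) and the full multiplier (condition fully present), so the assessed proportion of affect is where analyst judgment, and in the paper's approach fuzzy membership, enters.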

  10. 77 FR 55737 - Small Business Size Standards: Finance and Insurance and Management of Companies and Enterprises

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ...The U.S. Small Business Administration (SBA) proposes to increase small business size standards for 37 industries in North American Industry Classification System (NAICS) Sector 52, Finance and Insurance, and for two industries in NAICS Sector 55, Management of Companies and Enterprises. In addition, SBA proposes to change the measure of size from average assets to average receipts for NAICS 522293, International Trade Financing. As part of its ongoing comprehensive size standards review, SBA evaluated all receipts based and assets based size standards in NAICS Sectors 52 and 55 to determine whether they should be retained or revised. This proposed rule is one of a series of proposed rules that will review size standards of industries grouped by NAICS Sector. SBA issued a White Paper entitled ``Size Standards Methodology'' and published a notice in the October 21, 2009 issue of the Federal Register to advise the public that the document is available on its Web site at www.sba.gov/size for public review and comments. The ``Size Standards Methodology'' White Paper explains how SBA establishes, reviews, and modifies its receipts based and employee based small business size standards. In this proposed rule, SBA has applied its methodology that pertains to establishing, reviewing, and modifying a receipts based size standard.

  11. Conducting participatory photography with children with disabilities: a literature review.

    PubMed

    Eisen, Isabel; Cunningham, Barbara Jane; Campbell, Wenonah

    2018-03-28

    This review summarized studies that used participatory photography with children with disabilities, including those with communication impairments, and described modifications made to the methodology to facilitate their participation in qualitative research. In the fall of 2016, we searched Psycinfo (OVID), ERIC, CINAHL and Web of Science to identify studies that used participatory photography with children with disabilities. The search was repeated in January 2018 to retrieve any new publications. The first author extracted data that described the characteristics of each study and the modifications used. Of the 258 articles identified, 19 met inclusion criteria. Participants ranged from 4-21 years old and had a variety of disabilities. Study topics included education, leisure activities and adulthood. Researchers modified participatory photography to enhance accessibility by: modifying cameras; providing individual training; teaching consent through role play; allowing children to direct adults to take photographs; including additional forms of media; using diaries and questionnaires; providing individual interviews with simplified questions; using multiple forms of communication; and modifying how photographs are shared. Participatory photography can be an effective method for studying the lived experiences of children with disabilities, particularly those with communication impairments. Methodological modifications can enhance the accessibility of this approach for this population. Implications for Rehabilitation Participatory photography may be an effective qualitative research method for learning about the perspectives and experiences of children with disabilities on a wide array of topics. There are many specific modifications that researchers can use to support the inclusion of children with disabilities in participatory photography research. 
The findings of studies that use participatory photography methodology may provide rehabilitation professionals with important insights into the lives of children with disabilities.

  12. Methodologic ramifications of paying attention to sex and gender differences in clinical research.

    PubMed

    Prins, Martin H; Smits, Kim M; Smits, Luc J

    2007-01-01

    Methodologic standards for studies on sex and gender differences should be developed to improve reporting of studies and facilitate their inclusion in systematic reviews. The essence of these studies lies within the concept of effect modification. This article reviews important methodologic issues in the design and reporting of pharmacogenetic studies. Differences in effect based on sex or gender should preferably be expressed in absolute terms (risk differences) to facilitate clinical decisions on treatment. Information on the distribution of potential effect modifiers or prognostic factors should be available to prevent a biased comparison of differences in effect between genotypes. Other considerations included the possibility of selective nonavailability of biomaterial and the choice of a statistical model to study effect modification. To ensure high study quality, additional methodologic issues should be taken into account when designing and reporting studies on sex and gender differences.
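
The recommendation above to express sex- or gender-specific effects in absolute terms can be made concrete: compute the risk difference in each stratum, then contrast the strata. The counts below are invented for illustration.

```python
# Risk differences by stratum and the effect-modification contrast.

def risk(events, n):
    return events / n

def risk_difference(events_t, n_t, events_c, n_c):
    """Absolute effect: risk in treated minus risk in controls."""
    return risk(events_t, n_t) - risk(events_c, n_c)

# hypothetical trial counts: (events, n) for treated vs control
rd_men = risk_difference(10, 100, 20, 100)    # -0.10
rd_women = risk_difference(15, 100, 18, 100)  # -0.03
effect_modification = rd_men - rd_women       # -0.07 on the absolute scale
```

A nonzero contrast on the absolute scale is what matters for treatment decisions even when relative measures (risk ratios) look similar across strata.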

  13. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  14. Experience with environmental issues in GM crop production and the likely future scenarios.

    PubMed

    Gaugitsch, Helmut

    2002-02-28

    In the Cartagena Protocol on Biosafety, standards for risk assessment of genetically modified organisms (GMOs) have been set. The criteria and information basis for the risk assessment of GMOs have been modified by the EU Directive 2001/18/EC. Various approaches to further improve the criteria for environmental risk assessment of GMOs are described in this study. Reports on the ecological impacts of the cultivation of certain non-transgenic crop plants with novel or improved traits, used as analogy models for transgenic plants, showed that the effects of agricultural practice can be at least as important as the effects of gene transfer and invasiveness, although the latter currently play a major role in risk assessment of transgenic crops. Based on these results, the applicability of the methodology of 'Life Cycle Analysis (LCA)' to genetically modified plants, in comparison with conventionally bred and organically grown crop plants, was evaluated. The methodology was regarded as applicable, with some necessary future improvements. In current projects, the toxicology and allergenicity of GM crops are analysed, and suggestions for standardization are developed. Based on results and recommendations from these efforts, there remain the challenges of how to operationalize the precautionary principle and how to take into account ecologically sensitive ecosystems, including centres of origin and centres of genetic diversity.

  15. A DUST-SETTLING CHAMBER FOR SAMPLING-INSTRUMENT COMPARISON STUDIES

    EPA Science Inventory

    Introduction: Few methods exist that can evenly and reproducibly deposit dusts onto surfaces for surface-sampling methodological studies. A dust-deposition chamber was designed for that purpose.

    Methods: A 1-m3 Rochester-type chamber was modified to produce high airborne d...

  16. Millennial Students' Mental Models of Information Retrieval

    ERIC Educational Resources Information Center

    Holman, Lucy

    2009-01-01

    This qualitative study examines first-year college students' online search habits in order to identify patterns in millennials' mental models of information retrieval. The study employed a combination of modified contextual inquiry and concept mapping methodologies to elicit students' mental models. The researcher confirmed previously observed…

  17. EVALUATION OF ALTERNATIVE GAUSSIAN PLUME DISPERSION MODELING TECHNIQUES IN ESTIMATING SHORT-TERM SULFUR DIOXIDE CONCENTRATIONS

    EPA Science Inventory

    A routinely applied atmospheric dispersion model was modified to evaluate alternative modeling techniques which allowed for more detailed source data, onsite meteorological data, and several dispersion methodologies. These were evaluated with hourly SO2 concentrations measured at...

  18. 76 FR 65504 - Proposed Agency Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-21

    ..., including the validity of the methodology and assumptions used; (c) ways to enhance the quality, utility... Reliability Standard, FAC- 008-3--Facility Ratings, developed by the North American Electric Reliability... Reliability Standard FAC- 008-3 is pending before the Commission. The proposed Reliability Standard modifies...

  19. A Tensor Product Formulation of Strassen's Matrix Multiplication Algorithm with Memory Reduction

    DOE PAGES

    Kumar, B.; Huang, C. -H.; Sadayappan, P.; ...

    1995-01-01

    In this article, we present a program generation strategy for Strassen's matrix multiplication algorithm using a programming methodology based on tensor product formulas. In this methodology, block recursive programs such as the fast Fourier transform and Strassen's matrix multiplication algorithm are expressed as algebraic formulas involving tensor products and other matrix operations. Such formulas can be systematically translated to high-performance parallel/vector codes for various architectures. In this article, we present a nonrecursive implementation of Strassen's algorithm for shared memory vector processors such as the Cray Y-MP. A previous implementation of Strassen's algorithm synthesized from tensor product formulas required working storage of size O(7^n) for multiplying 2^n × 2^n matrices. We present a modified formulation in which the working storage requirement is reduced to O(4^n). The modified formulation exhibits sufficient parallelism for efficient implementation on a shared memory multiprocessor. Performance results on a Cray Y-MP8/64 are presented.
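
For reference, the seven-multiplication recursion that all these formulations encode can be written directly. The sketch below is the plain recursive form for 2^n × 2^n matrices, for clarity; it does not reproduce the paper's nonrecursive tensor-product formulation or its storage optimization.

```python
# Recursive Strassen multiplication (7 sub-products instead of 8).

def _add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def _sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def strassen(A, B):
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    q = lambda M, i, j: [row[j*h:(j+1)*h] for row in M[i*h:(i+1)*h]]
    A11, A12, A21, A22 = q(A,0,0), q(A,0,1), q(A,1,0), q(A,1,1)
    B11, B12, B21, B22 = q(B,0,0), q(B,0,1), q(B,1,0), q(B,1,1)
    M1 = strassen(_add(A11, A22), _add(B11, B22))
    M2 = strassen(_add(A21, A22), B11)
    M3 = strassen(A11, _sub(B12, B22))
    M4 = strassen(A22, _sub(B21, B11))
    M5 = strassen(_add(A11, A12), B22)
    M6 = strassen(_sub(A21, A11), _add(B11, B12))
    M7 = strassen(_sub(A12, A22), _add(B21, B22))
    C11 = _add(_sub(_add(M1, M4), M5), M7)
    C12 = _add(M3, M5)
    C21 = _add(M2, M4)
    C22 = _add(_add(_sub(M1, M2), M3), M6)
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot

def naive(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[i * 4 + j + 1 for j in range(4)] for i in range(4)]
B = [[(i + j) % 5 for j in range(4)] for i in range(4)]
C = strassen(A, B)
```

The recursion is what makes the naive synthesized code need O(7^n) temporaries (one set per product per level); the paper's contribution is reorganizing exactly this computation so the working set stays O(4^n).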

  20. A Novel Method to Generate and Expand Clinical-Grade, Genetically Modified, Tumor-Infiltrating Lymphocytes

    PubMed Central

    Forget, Marie-Andrée; Tavera, René J.; Haymaker, Cara; Ramachandran, Renjith; Malu, Shuti; Zhang, Minying; Wardell, Seth; Fulbright, Orenthial J.; Toth, Chistopher Leroy; Gonzalez, Audrey M.; Thorsen, Shawne T.; Flores, Esteban; Wahl, Arely; Peng, Weiyi; Amaria, Rodabe N.; Hwu, Patrick; Bernatchez, Chantale

    2017-01-01

    Following the clinical success achieved with the first generation of adoptive cell therapy (ACT) utilizing in vitro expanded tumor-infiltrating lymphocytes (TILs), the second and third generations of TIL ACT are evolving toward the use of genetically modified TIL. TIL therapy generally involves the transfer of a high number of TIL, ranging from 10^9 to 10^11 cells. One of the technical difficulties in genetically modifying TIL, using a retroviral vector, is the ability to achieve large expansion of transduced TIL, while keeping the technique suitable for a Good Manufacturing Practices (GMP) environment. Consequently, we developed and optimized a novel method for the efficient production of large numbers of GMP-grade, gene-modified TIL for the treatment of patients with ACT. The chemokine receptor CXCR2 was used as the gene of interest for methodology development. The optimized procedure is currently used in the production of gene-modified TIL for two clinical trials for the treatment of metastatic melanoma at MD Anderson Cancer Center. PMID:28824634
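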

  1. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
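
The PFA step of modifying an analytically derived failure-probability distribution to reflect test or flight experience can be illustrated with a conjugate Beta-Binomial update. This is a simplified stand-in for the actual PFA statistical procedures, with invented prior parameters.

```python
# Bayesian update of a failure probability: a Beta prior (encoding an
# analytical estimate and its uncertainty) revised by observed test outcomes.

def beta_update(alpha, beta, failures, successes):
    """Posterior Beta parameters after observing test outcomes."""
    return alpha + failures, beta + successes

def beta_mean(alpha, beta):
    """Expected failure probability under a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Prior encoding an analytical estimate of roughly 1% failure probability
a0, b0 = 1.0, 99.0
# 50 tests with no failures shift the distribution toward lower risk
a1, b1 = beta_update(a0, b0, failures=0, successes=50)
```

The same mechanism works in the other direction: observed failures raise the posterior mean, so the assessed risk tracks both the model and the accumulating experience.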

  2. Artificial neural networks in evaluation and optimization of modified release solid dosage forms.

    PubMed

    Ibrić, Svetlana; Djuriš, Jelena; Parojčić, Jelena; Djurić, Zorica

    2012-10-18

    Implementation of the Quality by Design (QbD) approach in pharmaceutical development has compelled researchers in the pharmaceutical industry to employ Design of Experiments (DoE) as a statistical tool in product development. Among all DoE techniques, response surface methodology (RSM) is the one most frequently used. Progress in computer science has had an impact on pharmaceutical development as well. Simultaneously with the implementation of statistical methods, machine learning tools have taken an important place in drug formulation. Twenty years ago, the first papers describing the application of artificial neural networks in optimization of modified release products appeared. Since then, a lot of work has been done towards the implementation of new techniques, especially Artificial Neural Networks (ANN), in modeling of production, drug release and drug stability of modified release solid dosage forms. The aim of this paper is to review artificial neural networks in evaluation and optimization of modified release solid dosage forms.

  3. Artificial Neural Networks in Evaluation and Optimization of Modified Release Solid Dosage Forms

    PubMed Central

    Ibrić, Svetlana; Djuriš, Jelena; Parojčić, Jelena; Djurić, Zorica

    2012-01-01

    Implementation of the Quality by Design (QbD) approach in pharmaceutical development has compelled researchers in the pharmaceutical industry to employ Design of Experiments (DoE) as a statistical tool in product development. Among all DoE techniques, response surface methodology (RSM) is the one most frequently used. Progress in computer science has had an impact on pharmaceutical development as well. Simultaneously with the implementation of statistical methods, machine learning tools have taken an important place in drug formulation. Twenty years ago, the first papers describing the application of artificial neural networks in optimization of modified release products appeared. Since then, a lot of work has been done towards the implementation of new techniques, especially Artificial Neural Networks (ANN), in modeling of production, drug release and drug stability of modified release solid dosage forms. The aim of this paper is to review artificial neural networks in evaluation and optimization of modified release solid dosage forms. PMID:24300369
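
The ANN idea these reviews describe can be sketched at toy scale: a one-hidden-layer network fitted by stochastic gradient descent to a synthetic "formulation variable vs. fraction of drug released" curve. This is an illustration only, not a reproduction of any model in the reviewed studies.

```python
# Tiny one-hidden-layer network trained by per-sample gradient descent.
import math, random

def train(data, hidden=5, lr=0.1, epochs=500, seed=7):
    rnd = random.Random(seed)
    w1 = [rnd.uniform(-0.5, 0.5) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rnd.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0
    losses = []
    for _ in range(epochs):
        total = 0.0
        for x, y in data:
            h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
            yhat = sum(w2[j] * h[j] for j in range(hidden)) + b2
            err = yhat - y
            total += err * err
            # updates proportional to the squared-error gradient
            for j in range(hidden):
                g_pre = err * w2[j] * (1.0 - h[j] ** 2)  # through tanh
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * g_pre * x
                b1[j] -= lr * g_pre
            b2 -= lr * err
        losses.append(total / len(data))
    return losses

# synthetic sigmoidal release profile (x: scaled formulation variable)
data = [(x / 10.0, 1.0 / (1.0 + math.exp(-4.0 * (x / 10.0 - 0.5))))
        for x in range(11)]
losses = train(data)
```

In the published work the inputs are real formulation and process variables and the targets are measured dissolution profiles; the training loop itself has the same shape.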

  4. Agricultural Entrepreneurship Orientation: Is Academic Training a Missing Link?

    ERIC Educational Resources Information Center

    Mohammadinezhad, Soodeh; Sharifzadeh, Maryam

    2017-01-01

    Purpose: The purpose of this paper is to investigate the importance of academic courses on agricultural entrepreneurship. Design/methodology/approach: Modified global entrepreneurship and development index (GEDI) was used to determine entrepreneurial dimensions among 19 graduated students of agricultural colleges resided in Iran. Fuzzy analytical…

  5. Choice-Based Segmentation as an Enrollment Management Tool

    ERIC Educational Resources Information Center

    Young, Mark R.

    2002-01-01

    This article presents an approach to enrollment management based on target marketing strategies developed from a choice-based segmentation methodology. Students are classified into "switchable" or "non-switchable" segments based on their probability of selecting specific majors. A modified multinomial logit choice model is used to identify…
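    The core of a multinomial logit choice model of this kind can be sketched in a few lines; the majors, utilities and the 0.6 "switchability" cutoff below are illustrative assumptions, not values from the article.

```python
import math

def mnl_probabilities(utilities):
    """Multinomial logit: P(i) = exp(V_i) / sum_j exp(V_j)."""
    # Subtract the max utility for numerical stability.
    m = max(utilities.values())
    exp_v = {alt: math.exp(v - m) for alt, v in utilities.items()}
    total = sum(exp_v.values())
    return {alt: e / total for alt, e in exp_v.items()}

# Hypothetical systematic utilities of three majors for one student.
utilities = {"biology": 1.2, "marketing": 0.8, "history": -0.5}
probs = mnl_probabilities(utilities)

# A student might be labelled "switchable" when no single major
# dominates, e.g. when the top choice probability is below a cutoff.
top = max(probs.values())
segment = "non-switchable" if top >= 0.6 else "switchable"
```

    In the enrollment-management setting the utilities would themselves be estimated from student attributes; the probability and segmentation step stays the same.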

  6. Modified graphene oxide sensors for ultra-sensitive detection of nitrate ions in water.

    PubMed

    Ren, Wen; Mura, Stefania; Irudayaraj, Joseph M K

    2015-10-01

    The nitrate ion is a very common contaminant in drinking water and has a significant impact on the environment, necessitating routine monitoring. Due to its chemical and physical properties, it is hard to detect nitrate directly with high sensitivity in a simple and inexpensive manner. Herein, using amino-group-modified graphene oxide (GO) as the sensing element, we demonstrate a direct and ultra-sensitive method to detect nitrate ions, with a lowest detected concentration of 5 nM in river water samples, much lower than reported methods based on absorption spectroscopy. Furthermore, unlike reported absorption-spectroscopy strategies in which the nitrate concentration is determined by monitoring an increase in the aggregation of gold nanoparticles (GNPs), our method evaluates the concentration of nitrate ions from the reduction in GNP aggregation, enabling monitoring in real samples. To improve sensitivity, several parameters were optimized, including the amount of modified GO required, the concentration of GNPs and the incubation time. The detection methodology was characterized by zeta potential, TEM and SEM. Our results indicate that enrichment of the modified GO with nitrate ions contributed to the excellent sensitivity, and the entire detection procedure could be completed within 75 min with only 20 μl of sample. This simple and rapid methodology was applied to monitor nitrate ions in real samples with excellent sensitivity and minimal pretreatment. The proposed approach paves the way to detecting anions in real samples and highlights the potential of GO-based detection strategies for water quality monitoring. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Symposium 'Methodology in Medical Education Research' organised by the Methodology in Medical Education Research Committee of the German Society of Medical Education May, 25th to 26th 2013 at Charité, Berlin.

    PubMed

    Schüttpelz-Brauns, Katrin; Kiessling, Claudia; Ahlers, Olaf; Hautz, Wolf E

    2015-01-01

    In 2013, the Methodology in Medical Education Research Committee ran a symposium on "Research in Medical Education" as part of its ongoing faculty development activities. The symposium aimed to introduce to participants educational research methods with a specific focus on research in medical education. Thirty-five participants were able to choose from workshops covering qualitative methods, quantitative methods and scientific writing throughout the one and a half days. The symposium's evaluation showed participant satisfaction with the format as well as suggestions for future improvement. Consequently, the committee will offer the symposium again in a modified form in proximity to the next annual Congress of the German Society of Medical Education.

  8. The international environment UNISPACE '82 and the ITU: A relationship between orbit-spectrum resource allocation and orbital debris

    NASA Technical Reports Server (NTRS)

    Olmstead, D.

    1985-01-01

    The 1985 Space WARC will examine and potentially modify the current geostationary orbit-spectrum resource allocation methodology. Discussions in this international political environment are likely to link the geostationary orbital debris issue with the politicized issue of orbit-spectrum allocation.

  9. An Integrated Scale for Measuring an Organizational Learning System

    ERIC Educational Resources Information Center

    Jyothibabu, C.; Farooq, Ayesha; Pradhan, Bibhuti Bhusan

    2010-01-01

    Purpose: The purpose of this paper is to develop an integrated measurement scale for an organizational learning system by capturing the learning enablers, learning results and performance outcome in an organization. Design/methodology/approach: A new measurement scale was developed by integrating and modifying two existing scales, identified…

  10. Strike Four! Do-Over Policies Institutionalize GPA Distortion

    ERIC Educational Resources Information Center

    Marx, Jonathan; Meeler, David

    2013-01-01

    Purpose: The aim of this paper is to illustrate how universities play an institutional role in inflating student grade point averages (GPA) by modifying academic polices such as course withdraw, repeats, and satisfactory/unsatisfactory grade options. Design/methodology/approach: Three research strategies are employed: an examination of eight…

  11. Usability and Instructional Design Heuristics for E-Learning Evaluation.

    ERIC Educational Resources Information Center

    Reeves, Thomas C.; Benson, Lisa; Elliott, Dean; Grant, Michael; Holschuh, Doug; Kim, Beaumie; Kim, Hyeonjin; Lauber, Erick; Loh, Sebastian

    Heuristic evaluation is a methodology for investigating the usability of software originally developed by Nielsen (1993, 2000). Nielsen's protocol was modified and refined for evaluating e-learning programs by participants in a doctoral seminar held at the University of Georgia in 2001. The modifications primarily involved expanding Nielsen's…

  12. [Optimization of ethylene production from ethanol dehydration using Zn-Mn-Co/HZSM-5 by response surface methodology].

    PubMed

    Wang, Wei; Cheng, Keke; Xue, Jianwei; Zhang, Jian'an

    2011-03-01

    The effects of reaction temperature, ethanol concentration and weight hourly space velocity (WHSV) on ethylene production from ethanol dehydration over a zinc-, manganese- and cobalt-modified HZSM-5 catalyst were investigated by response surface methodology (RSM). The results showed that reaction temperature had the most significant effect among the factors, and that the factors interacted. The optimum conditions were found to be 34.4% ethanol concentration, 261.3 degrees C reaction temperature and 1.18 h(-1) WHSV; under these conditions the ethylene yield reached 98.69%.
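    A fitted second-order RSM model is just a quadratic polynomial in the coded factors, and the optimum can be located by searching the design region. The coefficients below are made up for illustration and are not the paper's fitted model.

```python
# Hypothetical coded second-order model of the form used in RSM:
# y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj), factors in coded units.
def yield_model(x1, x2, x3):
    # Illustrative coefficients only (x1: temperature, x2: ethanol
    # concentration, x3: WHSV, all coded to [-1, 1]).
    return (95.0 + 1.5 * x1 + 0.8 * x2 - 0.6 * x3
            - 1.2 * x1 * x1 - 0.9 * x2 * x2 - 0.4 * x3 * x3
            + 0.5 * x1 * x2 - 0.3 * x1 * x3)

# Grid search over the coded design region [-1, 1]^3 for the optimum.
best = max(
    (yield_model(a / 10, b / 10, c / 10), a / 10, b / 10, c / 10)
    for a in range(-10, 11) for b in range(-10, 11) for c in range(-10, 11)
)
best_yield, x1, x2, x3 = best
```

    In practice the coefficients come from least-squares regression on a central composite or Box-Behnken design, and the coded optimum is converted back to natural units.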

  13. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS subsystems and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  14. Lateral stability analysis for X-29A drop model using system identification methodology

    NASA Technical Reports Server (NTRS)

    Raney, David L.; Batterson, James G.

    1989-01-01

    A 22-percent dynamically scaled replica of the X-29A forward-swept-wing airplane has been flown in radio-controlled drop tests at the NASA Langley Research Center. A system identification study of the recorded data was undertaken to examine the stability and control derivatives that influence the lateral behavior of this vehicle with particular emphasis on an observed wing rock phenomenon. All major lateral stability derivatives and the damping-in-roll derivative were identified for angles of attack from 5 to 80 degrees by using a data-partitioning methodology and a modified stepwise regression algorithm.

  15. WAMA: a method of optimizing reticle/die placement to increase litho cell productivity

    NASA Astrophysics Data System (ADS)

    Dor, Amos; Schwarz, Yoram

    2005-05-01

    This paper focuses on reticle/field placement methodology issues, the disadvantages of typical methods used in the industry, and the innovative way in which the WAMA software solution achieves optimized placement. Typical wafer placement methodologies used in the semiconductor industry consider a very limited number of parameters, such as placing the maximum number of dies on the wafer circle and manually modifying die placement to minimize edge yield degradation. This paper describes how WAMA software takes into account process characteristics, manufacturing constraints and business objectives to optimize placement for maximum stepper productivity and maximum good die (yield) on the wafer.

  16. Methodological quality of meta-analyses of single-case experimental studies.

    PubMed

    Jamshidi, Laleh; Heyvaert, Mieke; Declercq, Lies; Fernández-Castilla, Belén; Ferron, John M; Moeyaert, Mariola; Beretvas, S Natasha; Onghena, Patrick; Van den Noortgate, Wim

    2017-12-28

    Methodological rigor is a fundamental factor in the validity and credibility of the results of a meta-analysis. Following increasing interest in single-case experimental design (SCED) meta-analyses, the current study investigates the methodological quality of SCED meta-analyses. We assessed the methodological quality of 178 SCED meta-analyses published between 1985 and 2015 using the modified Revised Assessment of Multiple Systematic Reviews (R-AMSTAR) checklist. The main finding of the current review is that the methodological quality of SCED meta-analyses has increased over time but is still low according to the R-AMSTAR checklist. A remarkably high percentage of the studies (93.80% of the included SCED meta-analyses) did not even reach the midpoint score (22, on a scale of 0-44). The mean and median methodological quality scores were 15.57 and 16, respectively. Relatively high scores were observed for "providing the characteristics of the included studies" and "doing comprehensive literature search". The key areas of deficiency were "reporting an assessment of the likelihood of publication bias" and "using the methods appropriately to combine the findings of studies". Although the results of the current review reveal that the methodological quality of SCED meta-analyses has increased over time, more effort is needed to improve their methodological quality. Copyright © 2017 Elsevier Ltd. All rights reserved.
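    The summary statistics reported above (share of studies below the 22-point midpoint, mean and median of the 0-44 totals) are straightforward to compute; the scores below are synthetic stand-ins so the sketch is self-contained, not the review's per-study data.

```python
import statistics

# Synthetic R-AMSTAR totals on the 0-44 scale (illustration only).
scores = [10, 12, 14, 15, 16, 16, 17, 18, 19, 23]

midpoint = 22  # half of the 44-point maximum
below_midpoint = sum(s < midpoint for s in scores) / len(scores) * 100

mean_score = statistics.mean(scores)
median_score = statistics.median(scores)
```

    The same three numbers (percentage below midpoint, mean, median) are exactly what the abstract reports for its 178 meta-analyses.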

  17. Development of the Comprehensive Cervical Dystonia Rating Scale: Methodology

    PubMed Central

    Comella, Cynthia L.; Fox, Susan H.; Bhatia, Kailash P.; Perlmutter, Joel S.; Jinnah, Hyder A.; Zurowski, Mateusz; McDonald, William M.; Marsh, Laura; Rosen, Ami R.; Waliczek, Tracy; Wright, Laura J.; Galpern, Wendy R.; Stebbins, Glenn T.

    2016-01-01

    We present the methodology utilized for development and clinimetric testing of the Comprehensive Cervical Dystonia (CD) Rating scale, or CCDRS. The CCDRS includes a revision of the Toronto Western Spasmodic Torticollis Rating Scale (TWSTRS-2), a newly developed psychiatric screening tool (TWSTRS-PSYCH), and the previously validated Cervical Dystonia Impact Profile (CDIP-58). For the revision of the TWSTRS, the original TWSTRS was examined by a committee of dystonia experts at a dystonia rating scales workshop organized by the Dystonia Medical Research Foundation. During this workshop, deficiencies in the standard TWSTRS were identified and recommendations for revision of the severity and pain subscales were incorporated into the TWSTRS-2. Given that no scale currently evaluates the psychiatric features of cervical dystonia (CD), we used a modified Delphi methodology and a reiterative process of item selection to develop the TWSTRS-PSYCH. We also included the CDIP-58 to capture the impact of CD on quality of life. The three scales (TWSTRS-2, TWSTRS-PSYCH, and CDIP-58) were combined to construct the CCDRS. Clinimetric testing of reliability and validity of the CCDRS are described. The CCDRS was designed to be used in a modular fashion that can measure the full spectrum of CD. This scale will provide rigorous assessment for studies of natural history as well as novel symptom-based or disease-modifying therapies. PMID:27088112

  18. Development of the Comprehensive Cervical Dystonia Rating Scale: Methodology.

    PubMed

    Comella, Cynthia L; Fox, Susan H; Bhatia, Kailash P; Perlmutter, Joel S; Jinnah, Hyder A; Zurowski, Mateusz; McDonald, William M; Marsh, Laura; Rosen, Ami R; Waliczek, Tracy; Wright, Laura J; Galpern, Wendy R; Stebbins, Glenn T

    2015-06-01

    We present the methodology utilized for development and clinimetric testing of the Comprehensive Cervical Dystonia (CD) Rating scale, or CCDRS. The CCDRS includes a revision of the Toronto Western Spasmodic Torticollis Rating Scale (TWSTRS-2), a newly developed psychiatric screening tool (TWSTRS-PSYCH), and the previously validated Cervical Dystonia Impact Profile (CDIP-58). For the revision of the TWSTRS, the original TWSTRS was examined by a committee of dystonia experts at a dystonia rating scales workshop organized by the Dystonia Medical Research Foundation. During this workshop, deficiencies in the standard TWSTRS were identified and recommendations for revision of the severity and pain subscales were incorporated into the TWSTRS-2. Given that no scale currently evaluates the psychiatric features of cervical dystonia (CD), we used a modified Delphi methodology and a reiterative process of item selection to develop the TWSTRS-PSYCH. We also included the CDIP-58 to capture the impact of CD on quality of life. The three scales (TWSTRS-2, TWSTRS-PSYCH, and CDIP-58) were combined to construct the CCDRS. Clinimetric testing of reliability and validity of the CCDRS are described. The CCDRS was designed to be used in a modular fashion that can measure the full spectrum of CD. This scale will provide rigorous assessment for studies of natural history as well as novel symptom-based or disease-modifying therapies.

  19. Mercury methylation and reduction potentials in marine water: An improved methodology using 197Hg radiotracer.

    PubMed

    Koron, Neža; Bratkič, Arne; Ribeiro Guevara, Sergio; Vahčič, Mitja; Horvat, Milena

    2012-01-01

    A highly sensitive laboratory methodology for the simultaneous determination of methylation and reduction of spiked inorganic mercury (Hg(2+)) in marine water labelled with a high-specific-activity radiotracer ((197)Hg prepared from enriched (196)Hg stable isotope) was developed. A conventional extraction protocol for methylmercury (CH(3)Hg(+)) was modified to significantly reduce the partitioning of interfering labelled Hg(2+) into the final extract, thus allowing the detection of as little as 0.1% of the Hg(2+) spike transformed to labelled CH(3)Hg(+). The efficiency of the modified CH(3)Hg(+) extraction procedure was assessed with radiolabelled CH(3)Hg(+) spikes corresponding to methylmercury concentrations between 0.05 and 4 ng L(-1). The recoveries were 73.0±6.0% and 77.5±3.9% for marine and MilliQ water, respectively. The reduction potential was assessed by purging and trapping the radiolabelled elemental Hg in a permanganate solution. The method allows detection of the reduction of as little as 0.001% of labelled Hg(2+) spiked to natural waters. To our knowledge, the optimised methodology is among the most sensitive available for studying Hg methylation and reduction potential, allowing experiments to be done at spikes close to natural levels (1-10 ng L(-1)). Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. Systematic investigation of gastrointestinal diseases in China (SILC): validation of survey methodology

    PubMed Central

    2009-01-01

    Background Symptom-based surveys suggest that the prevalence of gastrointestinal diseases is lower in China than in Western countries. The aim of this study was to validate a methodology for the epidemiological investigation of gastrointestinal symptoms and endoscopic findings in China. Methods A randomized, stratified, multi-stage sampling methodology was used to select 18 000 adults aged 18-80 years from Shanghai, Beijing, Xi'an, Wuhan and Guangzhou. Participants from Shanghai were invited to provide blood samples and undergo upper gastrointestinal endoscopy. All participants completed Chinese versions of the Reflux Disease Questionnaire (RDQ) and the modified Rome II questionnaire; 20% were also invited to complete the 36-item Short Form Health Survey (SF-36) and Epworth Sleepiness Scale (ESS). The psychometric properties of the questionnaires were evaluated statistically. Results The study was completed by 16 091 individuals (response rate: 89.4%), with 3219 (89.4% of those invited) completing the SF-36 and ESS. All 3153 participants in Shanghai provided blood samples and 1030 (32.7%) underwent endoscopy. Cronbach's alpha coefficients were 0.89, 0.89, 0.80 and 0.91, respectively, for the RDQ, modified Rome II questionnaire, ESS and SF-36, supporting internal consistency. Factor analysis supported construct validity of all questionnaire dimensions except SF-36 psychosocial dimensions. Conclusion This population-based study has great potential to characterize the relationship between gastrointestinal symptoms and endoscopic findings in China. PMID:19925662

  1. Systematic investigation of gastrointestinal diseases in China (SILC): validation of survey methodology.

    PubMed

    Yan, Xiaoyan; Wang, Rui; Zhao, Yanfang; Ma, Xiuqiang; Fang, Jiqian; Yan, Hong; Kang, Xiaoping; Yin, Ping; Hao, Yuantao; Li, Qiang; Dent, John; Sung, Joseph; Zou, Duowu; Johansson, Saga; Halling, Katarina; Liu, Wenbin; He, Jia

    2009-11-19

    Symptom-based surveys suggest that the prevalence of gastrointestinal diseases is lower in China than in Western countries. The aim of this study was to validate a methodology for the epidemiological investigation of gastrointestinal symptoms and endoscopic findings in China. A randomized, stratified, multi-stage sampling methodology was used to select 18,000 adults aged 18-80 years from Shanghai, Beijing, Xi'an, Wuhan and Guangzhou. Participants from Shanghai were invited to provide blood samples and undergo upper gastrointestinal endoscopy. All participants completed Chinese versions of the Reflux Disease Questionnaire (RDQ) and the modified Rome II questionnaire; 20% were also invited to complete the 36-item Short Form Health Survey (SF-36) and Epworth Sleepiness Scale (ESS). The psychometric properties of the questionnaires were evaluated statistically. The study was completed by 16,091 individuals (response rate: 89.4%), with 3219 (89.4% of those invited) completing the SF-36 and ESS. All 3153 participants in Shanghai provided blood samples and 1030 (32.7%) underwent endoscopy. Cronbach's alpha coefficients were 0.89, 0.89, 0.80 and 0.91, respectively, for the RDQ, modified Rome II questionnaire, ESS and SF-36, supporting internal consistency. Factor analysis supported construct validity of all questionnaire dimensions except SF-36 psychosocial dimensions. This population-based study has great potential to characterize the relationship between gastrointestinal symptoms and endoscopic findings in China.
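    The internal-consistency statistic reported above, Cronbach's alpha, is simple enough to compute directly: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A minimal sketch with hypothetical questionnaire responses, not the SILC data:

```python
def cronbach_alpha(items):
    """items: one list of scores per questionnaire item, same respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        # Population variance, used consistently for items and totals.
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = sum(var(it) for it in items)
    totals = [sum(it[i] for it in items) for i in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))

# Hypothetical responses: 4 items answered by 5 respondents on a 1-5 scale.
items = [
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [2, 4, 3, 5, 3],
    [3, 4, 2, 5, 5],
]
alpha = cronbach_alpha(items)
```

    Values around 0.8-0.9, as reported for the RDQ, Rome II, ESS and SF-36 above, indicate that the items move together and the scale is internally consistent.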

  2. Study plan to identify long term national telecommunications need and priorities applying Delphi techniques (handbook). [technological forecasting - United States of America

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A handbook that explains the basic Delphi methodology and discusses modified Delphi techniques is presented. The selection of communications experts to participate in a study, the construction of questionnaires on potential communications developments, and the requisite technology are treated. No two modified Delphi studies were the same, which reflects the flexibility and adaptability of the technique. Each study must be specifically tailored to a particular case, and consists of seeking a consensus of opinion among experts about a particular subject and the attendant conditions that may prevail in the future.
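    A typical modified-Delphi round reduces to collecting panel ratings, feeding back a central tendency and spread, and stopping when dispersion is small. A minimal sketch, with hypothetical 1-9 importance ratings and an assumed IQR <= 2 stopping rule:

```python
import statistics

def round_summary(ratings):
    """Feedback statistics returned to panellists after each Delphi round."""
    med = statistics.median(ratings)
    q = statistics.quantiles(ratings, n=4)  # Q1, Q2, Q3
    return med, q[2] - q[0]  # median and interquartile range

# Hypothetical ratings for one forecast item across two rounds; opinions
# typically tighten once panellists see the group feedback.
round1 = [2, 4, 5, 5, 6, 7, 8, 9]
round2 = [5, 5, 6, 6, 6, 7, 7, 8]

med1, iqr1 = round_summary(round1)
med2, iqr2 = round_summary(round2)
consensus = iqr2 <= 2  # an assumed, commonly used stopping rule
```

    A full study would run this per questionnaire item and per round, feeding the medians and IQRs back to the panel between rounds.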

  3. Ultrasonic activated efficient synthesis of chromenes using amino-silane modified Fe3O4 nanoparticles: A versatile integration of high catalytic activity and facile recovery

    NASA Astrophysics Data System (ADS)

    Safari, Javad; Zarnegar, Zohre

    2014-08-01

    An efficient synthesis of 2-amino-4H-chromenes is achieved by a one-pot, three-component coupling reaction of an aldehyde, malononitrile, and resorcinol using an amino-silane-modified Fe3O4 nanoparticle (MNPs-NH2) heterogeneous nanocatalyst under ultrasonic conditions. The attractive advantages of the present process are mild reaction conditions, short reaction times, easy isolation of products, good yields and simple operational procedures. Combining the advantages of ultrasonic irradiation and magnetic nanoparticles provides an important methodology for carrying out catalytic transformations.

  4. Catchment area-based evaluation of the AMC-dependent SCS-CN-based rainfall-runoff models

    NASA Astrophysics Data System (ADS)

    Mishra, S. K.; Jain, M. K.; Pandey, R. P.; Singh, V. P.

    2005-09-01

    Using a large set of rainfall-runoff data from 234 watersheds in the USA, a catchment area-based evaluation of the modified version of the Mishra and Singh (2002a) model was performed. The model is based on the Soil Conservation Service Curve Number (SCS-CN) methodology and incorporates the antecedent moisture in computation of direct surface runoff. Comparison with the existing SCS-CN method showed that the modified version performed better than did the existing one on the data of all seven area-based groups of watersheds ranging from 0.01 to 310.3 km2.
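    The SCS-CN relation underlying the models compared above is compact enough to state directly: S = 25400/CN - 254 (mm), Ia = 0.2S, and Q = (P - Ia)^2 / (P - Ia + S) for P > Ia. A sketch, with CN values chosen only to illustrate the antecedent-moisture effect, not taken from the paper:

```python
def scs_cn_runoff(p_mm, cn, lam=0.2):
    """Direct surface runoff Q (mm) from rainfall P (mm) by the SCS-CN method.

    S  = 25400 / CN - 254   potential maximum retention (mm)
    Ia = lam * S            initial abstraction (lam = 0.2 classically)
    Q  = (P - Ia)^2 / (P - Ia + S)   for P > Ia, else 0
    """
    s = 25400.0 / cn - 254.0
    ia = lam * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# AMC-dependent variants such as the modified Mishra-Singh model adjust the
# effective retention for antecedent moisture; here we simply compare a dry
# (lower CN) and a wet (higher CN) catchment state for the same 50 mm storm.
q_dry = scs_cn_runoff(50.0, cn=60)
q_wet = scs_cn_runoff(50.0, cn=85)
```

    The wet-condition curve number produces far more runoff from the same storm, which is why incorporating antecedent moisture matters for the model comparison above.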

  5. A Descent Rate Control Approach to Developing an Autonomous Descent Vehicle

    NASA Astrophysics Data System (ADS)

    Fields, Travis D.

    Circular parachutes have been used for aerial payload/personnel deliveries for over 100 years. In the past two decades, significant work has been done to improve the landing accuracies of cargo deliveries for humanitarian and military applications. This dissertation discusses the approach developed in which a circular parachute is used in conjunction with an electro-mechanical reefing system to manipulate the landing location. Rather than attempt to steer the autonomous descent vehicle directly, control of the landing location is accomplished by modifying the amount of time spent in a particular wind layer. Descent rate control is performed by reversibly reefing the parachute canopy. The first stage of the research investigated the use of a single actuation during descent (with periodic updates), in conjunction with a curvilinear target. Simulation results using real-world wind data are presented, illustrating the utility of the methodology developed. Additionally, hardware development and flight-testing of the single actuation autonomous descent vehicle are presented. The next phase of the research focuses on expanding the single actuation descent rate control methodology to incorporate a multi-actuation path-planning system. By modifying the parachute size throughout the descent, the controllability of the system greatly increases. The trajectory planning methodology developed provides a robust approach to accurately manipulate the landing location of the vehicle. The primary benefits of this system are the inherent robustness to release location errors and the ability to overcome vehicle uncertainties (mass, parachute size, etc.). A separate application of the path-planning methodology is also presented. An in-flight path-prediction system was developed for use in high-altitude ballooning by utilizing the path-planning methodology developed for descent vehicles. 
    The developed onboard system improves landing location predictions in-flight using collected flight information during the ascent and descent. Simulation and real-world flight tests (using the developed low-cost hardware) demonstrate the significance of the improvements achievable when flying the developed system.

  6. Assessment of air quality in Haora River basin using fuzzy multiple-attribute decision making techniques.

    PubMed

    Singh, Ajit Pratap; Chakrabarti, Sumanta; Kumar, Sumit; Singh, Anjaney

    2017-08-01

    This paper deals with the assessment of air quality in the Haora River basin using two techniques. Initially, air quality indices were evaluated using a modified EPA method. The indices were also evaluated using a fuzzy comprehensive assessment (FCA) method, and the results of the two methods were compared. To illustrate the applicability of the proposed methodology, a case study is presented. Air samples were collected at 10 sampling sites located along the Haora River. Six important air pollutants, namely carbon monoxide, sulfur dioxide, nitrogen dioxide, suspended particulate matter (SPM), PM10, and lead, were monitored continuously, and air quality maps were generated on a GIS platform. Comparison of the methodologies clearly highlighted the superiority and robustness of the fuzzy comprehensive assessment method in determining the air quality indices under study. It effectively addressed the inherent uncertainties involved in the evaluation, modeling, and interpretation of sampling data, which are beyond the scope of the traditional weighted approaches employed otherwise. The FCA method is robust and provides a credible platform for air quality evaluation in the face of uncertainties that remain hidden in traditional approaches such as the modified EPA method. The insights gained through the present study should be of pivotal significance in guiding the development and implementation of effective environmental remedial action plans in the study area.
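    A fuzzy comprehensive assessment reduces to composing a pollutant-by-class membership matrix with a pollutant weight vector. The memberships, weights and class labels below are invented for illustration; the paper's actual membership functions and weights are not reproduced.

```python
# Weighted-average fuzzy composition B = W o R, where R holds each
# pollutant's membership degree in each air-quality class and W weights
# the pollutants by relative importance.
def fuzzy_assess(memberships, weights):
    n_classes = len(next(iter(memberships.values())))
    b = [0.0] * n_classes
    for pollutant, row in memberships.items():
        w = weights[pollutant]
        for j, mu in enumerate(row):
            b[j] += w * mu
    return b

# Hypothetical membership degrees in classes (good, moderate, poor).
memberships = {
    "SO2":  [0.7, 0.3, 0.0],
    "NO2":  [0.5, 0.4, 0.1],
    "SPM":  [0.1, 0.4, 0.5],
    "PM10": [0.2, 0.3, 0.5],
}
weights = {"SO2": 0.2, "NO2": 0.2, "SPM": 0.3, "PM10": 0.3}  # sums to 1

b = fuzzy_assess(memberships, weights)
grade = ("good", "moderate", "poor")[b.index(max(b))]
```

    The final grade comes from the maximum-membership principle; max-min composition is a common alternative to the weighted average used here.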

  7. Methodological framework for heart rate variability analysis during exercise: application to running and cycling stress testing.

    PubMed

    Hernando, David; Hernando, Alberto; Casajús, Jose A; Laguna, Pablo; Garatachea, Nuria; Bailón, Raquel

    2018-05-01

    Standard methodologies for heart rate variability (HRV) analysis and its physiological interpretation as a marker of autonomic nervous system condition have been published extensively for rest, but much less so for exercise. A methodological framework for HRV analysis during exercise is proposed which deals with the non-stationary nature of HRV during exercise, includes respiratory information, and identifies and corrects spectral components related to cardiolocomotor coupling (CC). It is applied to 23 male subjects who underwent different tests: maximal and submaximal, running and cycling; the ECG, respiratory frequency and oxygen consumption were recorded simultaneously. High-frequency (HF) power estimates obtained with the proposed methodology differ substantially from those obtained with the standard fixed band: for medium and high levels of exercise and for recovery, HF power is 20 to 40% higher. When cycling, HF power increases around 40% with respect to running, while CC power is around 20% stronger in running.

  8. Experience-based co-design in an adult psychological therapies service.

    PubMed

    Cooper, Kate; Gillmore, Chris; Hogg, Lorna

    2016-01-01

    Experience-based co-design (EBCD) is a methodology for service improvement and development, which puts service-user voices at the heart of improving health services. The aim of this paper was to implement the EBCD methodology in a mental health setting, and to investigate the challenges which arise during this process. In order to achieve this, a modified version of the EBCD methodology was undertaken, which involved listening to the experiences of the people who work in and use the mental health setting and sharing these experiences with the people who could effect change within the service, through collaborative work between service-users, staff and managers. EBCD was implemented within the mental health setting and was well received by service-users, staff and stakeholders. A number of modifications were necessary in this setting, for example high levels of support available to participants. It was concluded that EBCD is a suitable methodology for service improvement in mental health settings.

  9. Optimization of Reversed-Phase Peptide Liquid Chromatography Ultraviolet Mass Spectrometry Analyses Using an Automated Blending Methodology

    PubMed Central

    Chakraborty, Asish B.; Berger, Scott J.

    2005-01-01

    The balance between chromatographic performance and mass spectrometric response has been evaluated using an automated series of experiments where separations are produced by the real-time automated blending of water with organic and acidic modifiers. In this work, the concentration effects of two acidic modifiers (formic acid and trifluoroacetic acid) were studied on the separation selectivity, ultraviolet, and mass spectrometry detector response, using a complex peptide mixture. Peptide retention selectivity differences were apparent between the two modifiers, and under the conditions studied, trifluoroacetic acid produced slightly narrower (more concentrated) peaks, but significantly higher electrospray mass spectrometry suppression. Trifluoroacetic acid suppression of electrospray signal and influence on peptide retention and selectivity was dominant when mixtures of the two modifiers were analyzed. Our experimental results indicate that in analyses where the analyzed components are roughly equimolar (e.g., a peptide map of a recombinant protein), the selectivity of peptide separations can be optimized by choice and concentration of acidic modifier, without compromising the ability to obtain effective sequence coverage of a protein. In some cases, these selectivity differences were explored further, and a rational basis for differentiating acidic modifier effects from the underlying peptide sequences is described. PMID:16522853

  10. Direct observation of morphological evolution of a catalyst during carbon nanotube forest growth: new insights into growth and growth termination

    NASA Astrophysics Data System (ADS)

    Jeong, Seojeong; Lee, Jaegeun; Kim, Hwan-Chul; Hwang, Jun Yeon; Ku, Bon-Cheol; Zakharov, Dmitri N.; Maruyama, Benji; Stach, Eric A.; Kim, Seung Min

    2016-01-01

    In this study, we develop a new methodology for transmission electron microscopy (TEM) analysis that enables us to directly investigate the interface between carbon nanotube (CNT) arrays and the catalyst and support layers for CNT forest growth without any damage induced by a post-growth TEM sample preparation. Using this methodology, we perform in situ and ex situ TEM investigations on the evolution of the morphology of the catalyst particles and observe the catalyst particles to climb up through CNT arrays during CNT forest growth. We speculate that the lifted catalysts significantly affect the growth and growth termination of CNT forests along with Ostwald ripening and sub-surface diffusion. Thus, we propose a modified growth termination model which better explains various phenomena related to the growth and growth termination of CNT forests. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr05547d

  11. Parent-child mediated learning interactions as determinants of cognitive modifiability: recent research and future directions.

    PubMed

    Tzuriel, D

    1999-05-01

    The main objectives of this article are to describe the effects of mediated learning experience (MLE) strategies in mother-child interactions on the child's cognitive modifiability, the effects of distal factors (e.g., socioeconomic status, mother's intelligence, child's personality) on MLE interactions, and the effects of situational variables on MLE processes. Methodological aspects of measurement of MLE interactions and of cognitive modifiability, using a dynamic assessment approach, are discussed. Studies with infants showed that the quality of mother-infant MLE interactions predict later cognitive functioning and that MLE patterns and children's cognitive performance change as a result of intervention programs. Studies with preschool and school-aged children showed that MLE interactions predict cognitive modifiability and that distal factors predict MLE interactions but not the child's cognitive modifiability. The child's cognitive modifiability was predicted by MLE interactions in a structured but not in a free-play situation. Mediation for transcendence (e.g., teaching rules and generalizations) appeared to be the strongest predictor of children's cognitive modifiability. Discussion of future research includes the consideration of a holistic transactional approach, which refers to MLE processes, personality, and motivational-affective factors, the cultural context of mediation, perception of the whole family as a mediational unit, and the "mediational normative scripts."

  12. Modified Chaihu Shugan Powder for Functional Dyspepsia: Meta-Analysis for Randomized Controlled Trial

    PubMed Central

    Yang, Nan; Jiang, Xuehua; Hu, Zhiqiang; Wang, Ling; Song, Minxian

    2013-01-01

    Context. Modified Chaihu Shugan powder (MCSP) is a popular traditional Chinese herbal formula for functional dyspepsia, derived from Chaihu Shugan San, a formula recorded in a classic work of Chinese medicine. However, its role and effect in treating functional dyspepsia have not been well established. Objective. To assess the effect and safety of modified Chaihu Shugan powder for functional dyspepsia. Methods. We searched published and unpublished studies up to August 2012. Only RCTs comparing modified Chaihu Shugan powder, with or without prokinetic drugs, against prokinetic drugs in patients diagnosed with functional dyspepsia were included. Results. Twenty-two clinical trials involving 1998 participants were included. There was evidence that modified Chaihu Shugan powder (RR = 1.20, 95% CI 1.14 to 1.27) and modified Chaihu Shugan powder plus prokinetic drugs (RR = 1.18, 95% CI 1.11 to 1.25) were significantly better treatment options than prokinetic drugs alone in improving symptoms. No serious adverse events were described in the included trials. Conclusions. This meta-analysis showed that modified Chaihu Shugan powder alone or in combination with prokinetic drugs might be more effective than prokinetic drugs alone. However, all the included trials were of poor methodological quality and at high risk of bias. Further large-scale, high-quality trials are required. PMID:23762161
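The pooled estimates above are risk ratios with 95% confidence intervals. As a reminder of how a single trial contributes to such a pooling, here is a minimal sketch of the standard log-normal approximation for one trial's RR and CI, using hypothetical counts (not data from the review):

```python
import math

def risk_ratio_ci(a, n1, c, n2, z=1.96):
    """Risk ratio and 95% CI for events a/n1 (treatment) vs c/n2 (control),
    using the standard log-normal approximation for log(RR)."""
    rr = (a / n1) / (c / n2)
    # Standard error of log(RR)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical trial: 80/100 improved on the herbal formula vs 60/100 on
# prokinetics alone
rr, lo, hi = risk_ratio_ci(80, 100, 60, 100)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```

In an actual meta-analysis, the per-trial log(RR) values would then be combined with inverse-variance weights.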

  13. Planning Future Clinical Trials for Machado-Joseph Disease.

    PubMed

    Saute, Jonas Alex Morales; Jardim, Laura Bannach

    2018-01-01

    Spinocerebellar ataxia type 3/Machado-Joseph disease (SCA3/MJD) is an autosomal dominant degenerative disorder of multiple neurological systems caused by a CAG repeat expansion in the ATXN3 gene. Only a few treatments have been evaluated in randomized clinical trials (RCTs) in SCA3/MJD patients, with a lack of evidence for both disease-modifying and symptomatic therapies. The present chapter discusses in detail the major methodological issues in planning future RCTs for SCA3/MJD. There are several potential therapies for SCA3/MJD with encouraging preclinical results. Route of treatment, dosage titration and potential therapy biomarkers might differ among candidate drugs; however, the core study design and protocol will be largely the same. An RCT against a placebo group is the best study design for testing a disease-modifying therapy; the same cannot be stated for some symptomatic treatments. The main outcomes for future RCTs are clinical scales: the Scale for the Assessment and Rating of Ataxia (SARA) is currently the instrument of choice to prove efficacy of disease-modifying or symptomatic treatments against ataxia, the most important disease feature. Quantitative ataxia scales or their composite scores can be used as primary outcomes to provide preliminary evidence of efficacy in phase 2 RCTs, owing to their greater sensitivity to change. Details regarding eligibility criteria, randomization, sample size estimation, and the duration and type of analysis for both disease-modifying and symptomatic treatment trials are also discussed. Finally, a section anticipates the methodological issues of testing novel drugs when an effective treatment is already available. We conclude by emphasizing four points. First, RCTs are needed for a number of different aims in the care of SCA3/MJD. Second, because of the large sample sizes needed to warrant power, RCTs for disease-modifying therapies should be multicenter enterprises. Third, there is an urgent need for surrogate markers validated for several drug classes. Finally, engagement of at-risk or presymptomatic individuals in future trials will enable major advances in treatment research for SCA3/MJD.

  14. Symposium 'methodology in medical education research' organised by the Methodology in Medical Education Research Committee of the German Society of Medical Education May, 25th to 26th 2013 at Charité, Berlin

    PubMed Central

    Schüttpelz-Brauns, Katrin; Kiessling, Claudia; Ahlers, Olaf; Hautz, Wolf E.

    2015-01-01

    In 2013, the Methodology in Medical Education Research Committee ran a symposium on “Research in Medical Education” as part of its ongoing faculty development activities. The symposium aimed to introduce to participants educational research methods with a specific focus on research in medical education. Thirty-five participants were able to choose from workshops covering qualitative methods, quantitative methods and scientific writing throughout the one and a half days. The symposium’s evaluation showed participant satisfaction with the format as well as suggestions for future improvement. Consequently, the committee will offer the symposium again in a modified form in proximity to the next annual Congress of the German Society of Medical Education. PMID:25699106

  15. A Comprehensive Comparison of Current Operating Reserve Methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krad, Ibrahim; Ibanez, Eduardo; Gao, Wenzhong

    Electric power systems are currently experiencing a paradigm shift from a traditionally static system to one that is becoming increasingly dynamic and variable. Emerging technologies are forcing power system operators to adapt to their performance characteristics. These technologies, such as distributed generation and energy storage systems, have changed the traditional idea of a distribution system with power flowing in one direction into a distribution system with bidirectional flows. Variable generation, in the form of wind and solar generation, also increases the variability and uncertainty in the system. As such, power system operators are revisiting the ways in which they treat this evolving power system, namely by modifying their operating reserve methodologies. This paper presents an in-depth analysis of different operating reserve methodologies and investigates their impacts on power system reliability and economic efficiency.

  16. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  17. Nonlinear Performance Seeking Control using Fuzzy Model Reference Learning Control and the Method of Steepest Descent

    NASA Technical Reports Server (NTRS)

    Kopasakis, George

    1997-01-01

    Performance Seeking Control (PSC) attempts to find and control the process at the operating condition that will generate maximum performance. In this paper a nonlinear multivariable PSC methodology will be developed, utilizing the Fuzzy Model Reference Learning Control (FMRLC) and the method of Steepest Descent or Gradient (SDG). This PSC control methodology employs the SDG method to find the operating condition that will generate maximum performance. This operating condition is in turn passed to the FMRLC controller as a set point for the control of the process. The conventional SDG algorithm is modified in this paper in order for convergence to occur monotonically. For the FMRLC control, the conventional fuzzy model reference learning control methodology is utilized, with guidelines generated here for effective tuning of the FMRLC controller.
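The SDG step of this methodology searches for the operating condition that maximizes performance, with the paper's modification ensuring monotonic convergence. A toy sketch of such a monotone gradient-ascent search on a hypothetical quadratic performance surface (the FMRLC controller itself is not modeled here; function, gradient, and step sizes are invented):

```python
def steepest_ascent(perf, grad, x0, step=0.1, iters=200):
    """Gradient-ascent search for the operating point maximizing a
    performance function. The step is halved whenever a move would
    decrease performance, so the ascent is monotonic."""
    x = list(x0)
    best = perf(x)
    for _ in range(iters):
        g = grad(x)
        cand = [xi + step * gi for xi, gi in zip(x, g)]
        if perf(cand) > best:          # accept only improving moves
            x, best = cand, perf(cand)
        else:
            step *= 0.5                # shrink step to stay monotone
    return x, best

# Hypothetical performance surface with its maximum at (2, -1)
perf = lambda x: -(x[0] - 2)**2 - (x[1] + 1)**2
grad = lambda x: [-2 * (x[0] - 2), -2 * (x[1] + 1)]
x_opt, p_opt = steepest_ascent(perf, grad, [0.0, 0.0])
```

In the methodology described above, the point found by this search would be handed to the FMRLC controller as its set point.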

  18. Statistical Anomalies of Bitflips in SRAMs to Discriminate SBUs From MCUs

    NASA Astrophysics Data System (ADS)

    Clemente, Juan Antonio; Franco, Francisco J.; Villa, Francesca; Baylac, Maud; Rey, Solenne; Mecha, Hortensia; Agapito, Juan A.; Puchner, Helmut; Hubert, Guillaume; Velazco, Raoul

    2016-08-01

    Recently, the occurrence of multiple events in static tests has been investigated by checking the statistical distribution of the difference between the addresses of the words containing bitflips. That method has been successfully applied to Field Programmable Gate Arrays (FPGAs) and the original authors indicate that it is also valid for SRAMs. This paper presents a modified methodology that is based on checking the XORed addresses with bitflips, rather than on the difference. Irradiation tests on CMOS 130 & 90 nm SRAMs with 14-MeV neutrons have been performed to validate this methodology. Results in high-altitude environments are also presented and cross-checked with theoretical predictions. In addition, this methodology has also been used to detect modifications in the organization of said memories. Theoretical predictions have been validated with actual data provided by the manufacturer.
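The modified methodology checks the XOR of addresses containing bitflips rather than their difference. A toy illustration of the underlying idea, assuming that XOR values which repeat anomalously often mark multi-cell upsets in physically adjacent cells (the address list below is invented):

```python
from collections import Counter
from itertools import combinations

def xor_signature(addresses):
    """XOR every pair of word addresses containing bitflips and count how
    often each XOR value occurs. XOR values repeating far more than chance
    would predict suggest multi-cell upsets (physically adjacent words)
    rather than independent single-bit upsets."""
    return Counter(a ^ b for a, b in combinations(addresses, 2))

# Hypothetical static-test dump: three upset pairs differing only in bit 2,
# plus one isolated single-bit upset
addrs = [0x0100, 0x0104, 0x2200, 0x2204, 0x5A30, 0x5A34, 0x7FFF]
sig = xor_signature(addrs)
print(sig.most_common(1))  # the XOR value 0x4 occurs most often
```

A real analysis would compare these counts against the distribution expected for statistically independent bitflips, as the paper does.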

  19. A simple stochastic weather generator for ecological modeling

    Treesearch

    A.G. Birt; M.R. Valdez-Vivas; R.M. Feldman; C.W. Lafon; D. Cairns; R.N. Coulson; M. Tchakerian; W. Xi; Jim Guldin

    2010-01-01

    Stochastic weather generators are useful tools for exploring the relationship between organisms and their environment. This paper describes a simple weather generator that can be used in ecological modeling projects. We provide a detailed description of methodology, and links to full C++ source code (http://weathergen.sourceforge.net) required to implement or modify...
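Generators of this kind commonly couple a two-state precipitation Markov chain with temperatures drawn around a seasonal mean. A hypothetical sketch in that spirit (parameters are invented and not taken from the cited C++ code):

```python
import math, random

def generate_weather(days=365, p_wd=0.3, p_ww=0.6, seed=42):
    """Daily weather series: precipitation occurrence follows a first-order
    two-state Markov chain (p_wd = P(wet|dry), p_ww = P(wet|wet));
    temperature is a sinusoidal seasonal mean plus Gaussian noise,
    a few degrees cooler on wet days."""
    rng = random.Random(seed)
    wet, series = False, []
    for d in range(days):
        wet = rng.random() < (p_ww if wet else p_wd)
        mean_t = 15 + 10 * math.sin(2 * math.pi * (d - 80) / 365)
        temp = rng.gauss(mean_t - (2 if wet else 0), 3)
        series.append((wet, round(temp, 1)))
    return series

sim = generate_weather()
wet_frac = sum(w for w, _ in sim) / len(sim)
```

With these transition probabilities the long-run wet-day fraction is p_wd / (1 - p_ww + p_wd) ≈ 0.43, so a simulated year should hover near that value.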

  20. Repeatability precision of the falling number procedure under standard and modified methodologies

    USDA-ARS?s Scientific Manuscript database

    The falling number (FN) procedure is used worldwide to assess the integrity of the starch stored within wheat seed. As an indirect measurement of the activity level of alpha-amylase, FN relies on a dedicated viscometer that measures the amount of time needed for a metal stirring rod of precise geome...

  1. Using Behavior Change to Reduce Child Lead Exposure in Resource-Poor Settings: A Formative Study

    ERIC Educational Resources Information Center

    Feit, M. N.; Mathee, A.; Harpham, T.; Barnes, B. R.

    2014-01-01

    The objective of this formative research was to explore the acceptability and feasibility of changing housekeeping behaviors as a low-cost approach that may reduce childhood lead exposure in Johannesburg, South Africa. Using the Trials of Improved Practices (TIPs) methodology, modified housekeeping behaviors were negotiated with participants who…

  2. The Disabled Student Experience: Does the SERVQUAL Scale Measure Up?

    ERIC Educational Resources Information Center

    Vaughan, Elizabeth; Woodruffe-Burton, Helen

    2011-01-01

    Purpose: The purpose of this paper is to empirically test a new disabled service user-specific service quality model ARCHSECRET against a modified SERVQUAL model in the context of disabled students within higher education. Design/methodology/approach: The application of SERVQUAL in the voluntary sector had raised serious issues on its portability…

  3. Training Psychiatry Residents in Quality Improvement: An Integrated, Year-Long Curriculum

    ERIC Educational Resources Information Center

    Arbuckle, Melissa R.; Weinberg, Michael; Cabaniss, Deborah L.; Kistler, Susan C.; Isaacs, Abby J.; Sederer, Lloyd I.; Essock, Susan M.

    2013-01-01

    Objective: The authors describe a curriculum for psychiatry residents in Quality Improvement (QI) methodology. Methods: All PGY3 residents (N=12) participated in a QI curriculum that included a year-long group project. Knowledge and attitudes were assessed before and after the curriculum, using a modified Quality Improvement Knowledge Assessment…

  4. Inclusion of Radiation Environment Variability in Total Dose Hardness Assurance Methodology

    NASA Technical Reports Server (NTRS)

    Xapsos, M. A.; Stauffer, C.; Phan, A.; McClure, S. S.; Ladbury, R. L.; Pellish, J. A.; Campola, M. J.; LaBel, K. A.

    2015-01-01

    Variability of the space radiation environment is investigated with regard to parts categorization for total dose hardness assurance methods. It is shown that it can have a significant impact. A modified approach is developed that uses current environment models more consistently and replaces the design margin concept with one of failure probability.

  5. Notes on Inventive Methodologies and Affirmative Critiques of an Affective Edu-Future

    ERIC Educational Resources Information Center

    Staunaes, Dorthe

    2016-01-01

    What are the possible futures for educational research? The essay concerns two intertwined agendas. The first agenda is empirical and concerns how educational policy and leadership constitute, circulate, transform and modify feelings, moods and affects. Especially, motivation, engagement and the desire for learning are targets for policy and…

  6. Cross Slip of Dislocation Loops in GaN Under Shear

    DTIC Science & Technology

    2014-03-01

    methodology. 2.1 Discrete dislocation dynamics (DDD) simulations: In this work, we employ a modified version of the ParaDiS code [15, 16]. First a...plane. 4 Conclusions: The cross slip mechanisms of different dislocation loops have been studied via DDD simulations using the type <a> active

  7. Information Operations Primer

    DTIC Science & Technology

    2010-11-01

    altering drugs) but must be influenced indirectly through the physical and information dimensions. c. Information Operations modify the three dimensions...restoration of information systems by incorporating protection, detection, and reaction capabilities. (2) Physical Security is that part of security...wargamed using the traditional friendly action, expected enemy reaction, and friendly counteraction methodology. The wargaming process must also occur

  8. Perceptions of Bachelor-Degree Graduates Regarding General Education Program Quality

    ERIC Educational Resources Information Center

    Bittinger, Sara-Beth

    2017-01-01

    This study was directed by a modified Delphi-methodology design to gain perspective of the perceptions of alumni regarding the value and applicability of the general education program. The expert-panel participants were 14 alumni of Frostburg State University from various majors, representative of all three colleges, who graduated between 2006 and…

  9. "Great Classroom Teaching" and More: Awards for Outstanding Teaching Evaluated

    ERIC Educational Resources Information Center

    Jackson, Michael

    2006-01-01

    Purpose: In this paper teaching excellence awards are evaluated, with an eye to improving them. Design/methodology/approach: Literature is reviewed and an analytic framework developed in Canada is modified to apply to the University of Sydney's Vice Chancellor Outstanding Teaching Award. Data come from 60 respondents familiar with the Sydney award…

  10. Diagnostic Testing Package DX v 2.0 Technical Specification. Methodology Project.

    ERIC Educational Resources Information Center

    McArthur, David

    This paper contains the technical specifications, schematic diagrams, and program printout for a computer software package for the development and administration of diagnostic tests. The second version of the Diagnostic Testing Package DX consists of a PASCAL-based set of modules located in two main programs: (1) EDITTEST creates, modifies, and…

  11. Diagnostics Strategies with Electrochemical Affinity Biosensors Using Carbon Nanomaterials as Electrode Modifiers

    PubMed Central

    Campuzano, Susana; Yáñez-Sedeño, Paloma; Pingarrón, José M.

    2016-01-01

    Early diagnosis is often the key to successful patient treatment and survival. The identification of disease signaling biomarkers in biological fluids that reliably reflect normal and disease states in humans explains the burgeoning research field devoted to developing new methodologies able to determine target biomarkers in complex biological samples with the required sensitivity and selectivity, in a simple and rapid way. The unique advantages offered by electrochemical sensors together with the availability of high affinity and specific bioreceptors and their great capabilities in terms of sensitivity and stability imparted by nanostructuring the electrode surface with different carbon nanomaterials have led to the development of new electrochemical biosensing strategies that have flourished as interesting alternatives to conventional methodologies for clinical diagnostics. This paper briefly reviews the advantages of using carbon nanostructures and their hybrid nanocomposites as electrode modifiers to construct efficient electrochemical sensing platforms for diagnosis. The review provides an updated overview of some selected examples involving attractive amplification and biosensing approaches which have been applied to the determination of relevant genetic and protein diagnostics biomarkers. PMID:28035946

  12. Knowledge base methodology: Methodology for first Engineering Script Language (ESL) knowledge base

    NASA Technical Reports Server (NTRS)

    Peeris, Kumar; Izygon, Michel E.

    1992-01-01

    The primary goal of reusing software components is to develop software faster, more cheaply, and with higher quality. However, reuse is not automatic and does not just happen; it has to be carefully engineered. For example, a component needs to be easily understandable in order to be reused, and it also has to be malleable enough to fit into different applications. In fact, the software development process is deeply affected when reuse is applied. During component development, a serious effort has to be directed toward making components reusable. This implies defining reuse coding-style guidelines and applying them to any new component being created as well as to any old component being modified. These guidelines should point out favorable reuse features and may apply to naming conventions, module size and cohesion, internal documentation, etc. During application development, effort shifts from writing new code toward finding, and eventually modifying, existing pieces of code, then assembling them together. We see here that reuse is not free, and therefore has to be carefully managed.

  13. Optimisation of Copper Oxide Impregnation on Carbonised Oil Palm Empty Fruit Bunch for Nitric Oxide Removal using Response Surface Methodology

    NASA Astrophysics Data System (ADS)

    Ahmad, Norhidayah; Yong, Sing Hung; Ibrahim, Naimah; Ali, Umi Fazara Md; Ridwan, Fahmi Muhammad; Ahmad, Razi

    2018-03-01

    Oil palm empty fruit bunch (EFB) was successfully modified with phosphoric acid hydration followed by impregnation with copper oxide (CuO) to synthesize CuO-modified catalytic carbon (CuO/EFBC) for low-temperature removal of nitric oxide (NO) from gas streams. CuO impregnation was optimised through response surface methodology (RSM) using a Box-Behnken Design (BBD) in terms of metal loading (5-20%), sintering temperature (200-800 °C) and sintering time (2-6 hours). The model response for these variables was NO adsorption capacity, obtained from an up-flow column adsorption experiment with a 100 mL/min flow of 500 ppm NO/He at different operating conditions. The optimum operating variables suggested by the model were 20% metal loading, 200 °C sintering temperature and 6 hours sintering time. A good agreement (R² = 0.9625) was achieved between the experimental data and the model prediction. ANOVA indicated that the model terms metal loading and sintering temperature are significant (Prob. > F less than 0.05).
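RSM with a Box-Behnken design fits a second-order polynomial to the experimental runs and reports an R² between model and data. A generic least-squares sketch of such a quadratic surface fit on synthetic data (not the paper's measurements; design levels and response are invented):

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Fit y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj)
    by ordinary least squares, as in response surface methodology."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    A = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    pred = A @ beta
    r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
    return beta, r2

# Synthetic 3-factor design on coded levels -1, 0, +1 with a known response
rng = np.random.default_rng(0)
X = rng.choice([-1.0, 0.0, 1.0], size=(15, 3))
y = 5 + 2 * X[:, 0] - 3 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(15)
beta, r2 = fit_quadratic_surface(X, y)
```

Software packages additionally test each fitted term with ANOVA, which is where significance statements such as "Prob. > F less than 0.05" come from.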

  14. Using genetically modified tomato crop plants with purple leaves for absolute weed/crop classification.

    PubMed

    Lati, Ran N; Filin, Sagi; Aly, Radi; Lande, Tal; Levin, Ilan; Eizenberg, Hanan

    2014-07-01

    Weed/crop classification is considered the main problem in developing precise weed-management methodologies, because both crops and weeds share similar hues. Great effort has been invested in the development of classification models, most based on expensive sensors and complicated algorithms. However, satisfactory results are not consistently obtained due to imaging conditions in the field. We report on an innovative approach that combines advances in genetic engineering and robust image-processing methods to detect weeds and distinguish them from crop plants by manipulating the crop's leaf color. We demonstrate this on genetically modified tomato (germplasm AN-113) which expresses a purple leaf color. An autonomous weed/crop classification is performed using an invariant-hue transformation that is applied to images acquired by a standard consumer camera (visible wavelength) and handles variations in illumination intensities. The integration of these methodologies is simple and effective, and classification results were accurate and stable under a wide range of imaging conditions. Using this approach, we simplify the most complicated stage in image-based weed/crop classification models. © 2013 Society of Chemical Industry.
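The classification hinges on an illumination-invariant hue transformation: purple crop foliage and green weed foliage separate cleanly in hue even as intensity varies. A minimal sketch with illustrative (invented) hue thresholds, not the paper's actual transform:

```python
import colorsys

def classify_pixel(r, g, b):
    """Classify a 0-255 RGB pixel as 'crop' (purple foliage), 'weed'
    (green foliage) or 'soil' by hue alone. Hue is largely invariant to
    illumination intensity, which is what makes field imaging robust.
    Thresholds here are illustrative only."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    if s < 0.2:                 # desaturated: bare soil / residue
        return "soil"
    if 0.18 < h < 0.45:         # green hue band
        return "weed"
    if 0.66 < h < 0.92:         # purple/magenta hue band
        return "crop"
    return "soil"

print(classify_pixel(60, 160, 50))   # bright green leaf
print(classify_pixel(30, 80, 25))    # same leaf in shadow: hue unchanged
print(classify_pixel(120, 40, 140))  # purple tomato foliage
```

Note that the shadowed and sunlit green pixels share the same hue despite very different brightness, which is the property the approach exploits.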

  15. Nanoengineered Plasmonic Hybrid Systems for Bio-nanotechnology

    NASA Astrophysics Data System (ADS)

    Leong, Kirsty

    Plasmonic hybrid systems are fabricated using a combination of lithography and layer-by-layer directed self-assembly approaches to serve as highly sensitive nanosensing devices. This layer-by-layer directed self-assembly approach is utilized as a hybrid methodology to control the organization of quantum dots (QDs), nanoparticles, and biomolecules onto inorganic nanostructures with site-specific attachment and functionality. Here, surface plasmon-enhanced nanoarrays are fabricated where the photoluminescence of quantum dots and conjugated polymer nanoarrays are studied. This study was performed by tuning the localized surface plasmon resonance and the distance between the emitter and the metal surface using genetically engineered polypeptides as binding agents and biotin-streptavidin binding as linker molecules. In addition, these nanoarrays were also chemically modified to support the immobilization and label-free detection of DNA using surface enhanced Raman scattering. The surface of the nanoarrays was chemically modified using an acridine containing molecule which can act as an intercalating agent for DNA. The self-assembled monolayer (SAM) showed the ability to immobilize and intercalate DNA onto the surface. This SAM system using surface enhanced Raman scattering (SERS) serves as a highly sensitive methodology for the immobilization and label-free detection of DNA applicable into a wide range of bio-diagnostic platforms. Other micropatterned arrays were also fabricated using a combination of soft lithography and surface engineering. Selective single cell patterning and adhesion was achieved through chemical modifications and surface engineering of poly(dimethylsiloxane) surface. The surface of each microwell was functionally engineered with a SAM which contained an aldehyde terminated fused-ring aromatic thiolated molecule. Cells were found to be attracted and adherent to the chemically modified microwells. 
By combining soft lithography and surface engineering, a simple methodology produced single cell arrays on biocompatible substrates. Thus the design of plasmonic devices relies heavily on the nature of the plasmonic interactions between nanoparticles in the devices which can potentially be fabricated into lab-on-a-chip devices for multiplex sensing capabilities.

  16. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    NASA Technical Reports Server (NTRS)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  17. Detection and Site Localization of Phosphorylcholine-Modified Peptides by NanoLC-ESI-MS/MS Using Precursor Ion Scanning and Multiple Reaction Monitoring Experiments

    NASA Astrophysics Data System (ADS)

    Timm, Thomas; Lenz, Christof; Merkel, Dietrich; Sadiffo, Christian; Grabitzki, Julia; Klein, Jochen; Lochnit, Guenter

    2015-03-01

    Phosphorylcholine (PC)-modified biomolecules like lipopolysaccharides, glycosphingolipids, and (glyco)proteins are widespread, highly relevant antigens of parasites, since this small hapten shows potent immunomodulatory capacity, which allows the establishment of long-lasting infections of the host. Especially for PC-modified proteins, structural data is rare because of the zwitterionic nature of the PC substituent, resulting in low sensitivities and unusual but characteristic fragmentation patterns. We have developed a targeted mass spectrometric approach using hybrid triple quadrupole/linear ion trap (QTRAP) mass spectrometry coupled to nanoflow chromatography for the sensitive detection of PC-modified peptides from complex proteolytic digests, and the localization of the PC modification within the peptide backbone. In the first step, proteolytic digests are screened using precursor ion scanning for the marker ions of choline (m/z 104.1) and phosphorylcholine (m/z 184.1) to establish the presence of PC-modified peptides. Potential PC-modified precursors are then subjected to a second analysis using multiple reaction monitoring (MRM)-triggered product ion spectra for the identification and site localization of the modified peptides. The approach was first established using synthetic PC-modified peptides and PC-modified model digests. Following the optimization of key parameters, we then successfully applied the method to the detection of PC-peptides in the background of a proteolytic digest of a whole proteome. This methodological development will greatly facilitate the detection of PC-substituted biomolecules and their structural analysis.

  18. Methodology of shell structure reinforcement layout optimization

    NASA Astrophysics Data System (ADS)

    Szafrański, Tomasz; Małachowski, Jerzy; Damaziak, Krzysztof

    2018-01-01

    This paper presents an optimization process for a reinforced shell diffuser intended for a small wind turbine (rated power of 3 kW). The diffuser structure consists of multiple reinforcements and a metal skin. This kind of structure is suitable for optimization in terms of the selection of reinforcement density, stringer cross sections, sheet thickness, etc. The optimization approach aims to reduce the amount of work to be done between the optimization process and the final product design. The proposed optimization methodology is based on the application of a genetic algorithm to generate the optimal reinforcement layout. The obtained results are the basis for modifying the existing Small Wind Turbine (SWT) design.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Yong-Seon; Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Portsmouth, PO1 3FX; Zhao Gongbo

    We explore the complementarity of weak lensing and galaxy peculiar velocity measurements to better constrain modifications to General Relativity. We find no evidence for deviations from General Relativity on cosmological scales from a combination of peculiar velocity measurements (for Luminous Red Galaxies in the Sloan Digital Sky Survey) with weak lensing measurements (from the Canada-France-Hawaii Telescope Legacy Survey). We provide a Fisher error forecast for a Euclid-like space-based survey including both lensing and peculiar velocity measurements and show that the expected constraints on modified gravity will be at least an order of magnitude better than with present data, i.e., we will obtain approximately 5% errors on the modified gravity parametrization described here. We also present a model-independent method for constraining modified gravity parameters using tomographic peculiar velocity information, and apply this methodology to the present data set.
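A Fisher error forecast of the kind mentioned above propagates expected measurement errors into projected parameter constraints. A toy numpy sketch for a Gaussian-likelihood forecast (the derivatives and covariance below are synthetic and purely illustrative, not the survey's):

```python
import numpy as np

def fisher_forecast(derivs, cov):
    """Fisher matrix F_ab = (dmu/dp_a)^T C^-1 (dmu/dp_b) for a model mean
    mu(p) and data covariance C; the marginalized 1-sigma parameter errors
    are the square roots of the diagonal of F^-1."""
    C_inv = np.linalg.inv(cov)
    F = derivs @ C_inv @ derivs.T
    return np.sqrt(np.diag(np.linalg.inv(F)))

# Toy setup: 2 parameters, 5 observables with independent 5% errors
derivs = np.array([[1.0, 0.8, 0.5, 0.2, 0.1],    # d(obs)/d(param1)
                   [0.1, 0.3, 0.6, 0.9, 1.0]])   # d(obs)/d(param2)
cov = np.diag([0.05] * 5) ** 2
sigmas = fisher_forecast(derivs, cov)
```

Shrinking the assumed observational errors in `cov` directly shrinks the forecast parameter errors, which is how a future survey's improvement over present data is quantified.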

  20. Polythioether Particles Armored with Modifiable Graphene Oxide Nanosheets.

    PubMed

    Rodier, Bradley J; Mosher, Eric P; Burton, Spencer T; Matthews, Rachael; Pentzer, Emily

    2016-06-01

    Facile and scalable fabrication methods are attractive to prepare materials for diverse applications. Herein, a method is presented to prepare cross-linked polymeric nanoparticles with graphene oxide (GO) nanosheets covalently attached to the surface. Alkene-modified GO serves as a surfactant in a miniemulsion polymerization, and the alkene functionalities of GO exposed to the oil-phase are incorporated into the polymer particle through thiol-ene reactions, leaving the unreacted alkene functional groups of the other face of GO available for further functionalization. The surface of GO-armored polymer particles is then modified with a small molecule fluorophore or carboxylic acid functional groups that bind to Fe2O3 and TiO2 nanoparticles. This methodology provides a facile route to preparing complex hybrid composite materials. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. [The public perception of information about the potential risks of genetically modified crops in the food chain].

    PubMed

    Furnival, Ariadne Chloë; Pinheiro, Sônia Maria

    2008-01-01

    At a time when genetically modified (GM) crops are entering the Brazilian food chain, we present the findings of a study that makes use of a qualitative technique involving focus groups to look into the public's interpretation of the information available about this biotechnological innovation. This methodology produced results that revealed the interconnections drawn by the research subjects between this form of biotechnology, changes to the environment, and food production in general. The mistrust expressed about GM crops was particularly attributed by the participants to the non-availability of comprehensible information in the mass media or on product labels.

  2. S-Nitrosothiol measurements in biological systems

    PubMed Central

    Gow, Andrew; Doctor, Allan; Mannick, Joan; Gaston, Benjamin

    2007-01-01

    S-Nitrosothiol (SNO) cysteine modifications are regulated signaling reactions that dramatically affect, and are affected by, protein conformation. The lability of the S-NO bond can make SNO-modified proteins cumbersome to measure accurately. Here, we review methodologies for detecting SNO modifications in biology. There are three caveats. 1) Many assays for biological SNOs are used near the limit of detection: standard curves must be in the biologically relevant concentration range. 2) The assays that are most reliable are those that modify SNO protein or peptide chemistry the least. 3) Each result should be quantitatively validated using more than one assay. Improved assays are needed and are in development. PMID:17379583

  3. Effects of image processing on the detective quantum efficiency

    NASA Astrophysics Data System (ADS)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na

    2010-04-01

    Digital radiography has gained popularity in many areas of clinical practice. This transition brings interest in advancing the methodologies for image quality characterization. However, as the methodologies for such characterizations have not been standardized, the results of these studies cannot be directly compared. The primary objective of this study was to standardize methodologies for image quality characterization. The secondary objective was to evaluate how the modulation transfer function (MTF), noise power spectrum (NPS), and detective quantum efficiency (DQE) are affected by the image processing algorithm. Image performance parameters such as MTF, NPS, and DQE were evaluated using the International Electrotechnical Commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) images of a hand posterior-anterior (PA) view for measuring the signal-to-noise ratio (SNR), a slit image for measuring the MTF, and a white image for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) parameters were applied to each of the acquired images. The results showed that the modifications considerably influenced the evaluation of SNR, MTF, NPS, and DQE. Images modified by the post-processing had a higher DQE than the MUSICA=0 image. This suggests that MUSICA values, as a post-processing step, have an effect on the image when it is evaluated for image quality. In conclusion, the control parameters of image processing should be taken into account when characterizing image quality in a consistent way. The results of this study can serve as a baseline for evaluating imaging systems and their imaging characteristics by measuring MTF, NPS, and DQE.
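    The DQE evaluation described above is conventionally computed from the measured MTF and the normalized NPS via DQE(f) = MTF(f)^2 / (q · NNPS(f)), where q is the incident photon fluence. A minimal sketch with invented MTF, NNPS and fluence values:

    ```python
    # IEC 62220-1 style relation: DQE(f) = MTF(f)^2 / (q * NNPS(f)),
    # where q is the incident photon fluence (photons/mm^2) and NNPS is the
    # noise power spectrum normalized by the squared mean signal (mm^2).
    # All numbers below are invented for illustration.

    q = 30000.0  # assumed fluence for the RQA5 beam quality

    freqs = [0.5, 1.0, 1.5, 2.0]               # spatial frequency, cycles/mm
    mtf   = [0.90, 0.72, 0.55, 0.40]           # measured MTF at those frequencies
    nnps  = [1.0e-4, 1.05e-4, 1.1e-4, 1.2e-4]  # normalized NPS at those frequencies

    def dqe(mtf_f, nnps_f, fluence):
        return mtf_f ** 2 / (fluence * nnps_f)

    dqe_curve = [dqe(m, n, q) for m, n in zip(mtf, nnps)]
    ```

    A post-processing step such as MUSICA alters the measured MTF and NNPS together, which is why the resulting DQE depends on the processing parameters, as the study finds.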

  4. Curated Collection for Educators: Five Key Papers about the Flipped Classroom Methodology.

    PubMed

    King, Andrew; Boysen-Osborn, Megan; Cooney, Robert; Mitzman, Jennifer; Misra, Asit; Williams, Jennifer; Dulani, Tina; Gottlieb, Michael

    2017-10-25

    The flipped classroom (FC) pedagogy is becoming increasingly popular in medical education due to its appeal to the millennial learner and potential benefits in knowledge acquisition. Despite its popularity and effectiveness, the FC educational method is not without challenges. In this article, we identify and summarize several key papers relevant to medical educators interested in exploring the FC teaching methodology. The authors identified an extensive list of papers relevant to FC pedagogy via online discussions within the Academic Life in Emergency Medicine (ALiEM) Faculty Incubator. This list was augmented by an open call on Twitter (utilizing the #meded, #FOAMed, and #flippedclassroom hashtags), yielding a list of 33 papers. We then conducted a three-round modified Delphi process within the authorship group, which included both junior and senior clinician educators, to identify the most impactful papers for educators interested in FC pedagogy. The process ranked all of the candidate papers and identified the five most highly rated papers for inclusion. The authorship group reviewed and summarized these papers with specific consideration given to their value to junior faculty educators and faculty developers interested in the flipped classroom approach. The list of papers featured in this article serves as a key reading list for junior clinician educators and faculty developers interested in the flipped classroom technique. The associated commentaries contextualize the importance of these papers for medical educators aiming to optimize their understanding and implementation of the flipped classroom methodology in their teaching and through faculty development.
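    The final ranking step of such a modified Delphi process can be sketched as a mean-score aggregation; the paper labels and panel ratings below are invented for illustration:

    ```python
    from statistics import mean

    # After the last Delphi round, papers are ordered by mean panel rating
    # and the top five retained. All names and ratings are invented.

    ratings = {
        "paper_A": [7, 8, 9, 8],
        "paper_B": [5, 6, 5, 6],
        "paper_C": [9, 9, 8, 9],
        "paper_D": [6, 7, 6, 7],
        "paper_E": [8, 8, 8, 7],
        "paper_F": [4, 5, 4, 5],
        "paper_G": [7, 7, 8, 8],
    }

    ranked = sorted(ratings, key=lambda p: mean(ratings[p]), reverse=True)
    top_five = ranked[:5]
    ```

    A real modified Delphi adds the iterative part: after each round the panel sees the aggregate scores and re-rates before the final cut is made.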

  5. REVIEW OF DRAFT REVISED BLUE BOOK ON ESTIMATING ...

    EPA Pesticide Factsheets

    In 1994, EPA published a report, referred to as the “Blue Book,” which lays out EPA’s current methodology for quantitatively estimating radiogenic cancer risks. A follow-on report made minor adjustments to the previous estimates and presented a partial analysis of the uncertainties in the numerical estimates. In 2006, the National Research Council of the National Academy of Sciences released a report on the health risks from exposure to low levels of ionizing radiation. Cosponsored by the EPA and several other Federal agencies, Health Risks from Exposure to Low Levels of Ionizing Radiation BEIR VII Phase 2 (BEIR VII) primarily addresses cancer and genetic risks from low doses of low-LET radiation. In the draft White Paper: Modifying EPA Radiation Risk Models Based on BEIR VII (White Paper), ORIA proposed changes in EPA’s methodology for estimating radiogenic cancers, based on the contents of BEIR VII and some ancillary information. For the most part, it proposed to adopt the models and methodology recommended in BEIR VII; however, certain modifications and expansions are considered to be desirable or necessary for EPA’s purposes. EPA sought advice from the Agency’s Science Advisory Board on the application of BEIR VII and on issues relating to these modifications and expansions in the Advisory on EPA’s Draft White Paper: Modifying EPA Radiation Risk Models Based on BEIR VII (record # 83044). The SAB issued its Advisory on Jan. 31, 2008 (EPA-SAB-08-

  6. Closing the Loop: Using Assessment Results to Modify the Curriculum so That Student Quantitative Reasoning Skills Are Enhanced

    ERIC Educational Resources Information Center

    Johnson, Lynn

    2012-01-01

    Assurance of student learning through effective assessment has become increasingly important over the past decade as accrediting agencies now require documented efforts to measure and improve student performance. This paper presents the methodology used by the College of Business Administration at California State University, Stanislaus to assess…

  7. Determinants of Business Student Satisfaction and Retention in Higher Education: Applying Herzberg's Two-Factor Theory

    ERIC Educational Resources Information Center

    DeShields, Oscar W., Jr.; Kara, Ali; Kaynak, Erdener

    2005-01-01

    Purpose: This paper focuses on the determinants of student satisfaction and retention in a college or university that are assumed to impact students' college experience. Design/methodology/approach: Using empirical data and Herzberg's two-factor theory, a modified version of the questionnaire developed by Keaveney and Young was administered to…

  8. On Nature, Christianity and Deep Ecology--A Response to W. S. Helton and N. D. Helton

    ERIC Educational Resources Information Center

    Marangudakis, Manussos

    2008-01-01

    Establishing factually-based public support for the intrinsic value of nature, vis-a-vis a "domineering" or "stewardship" relation with the natural environment, necessitates the prior theoretical and methodological establishment of the above normative distinction. In this reply I argue that the Modified New Environmental Paradigm used by Helton…

  9. DEVELOPMENT OF DIAGNOSTIC ANALYTICAL AND MECHANICAL ABILITY TESTS THROUGH FACET DESIGN AND ANALYSIS.

    ERIC Educational Resources Information Center

    GUTTMAN, LOUIS,; SCHLESINGER, I.M.

    METHODOLOGY BASED ON FACET THEORY (MODIFIED SET THEORY) WAS USED IN TEST CONSTRUCTION AND ANALYSIS TO PROVIDE AN EFFICIENT TOOL OF EVALUATION FOR VOCATIONAL GUIDANCE AND VOCATIONAL SCHOOL USE. THE TYPE OF TEST DEVELOPMENT UNDERTAKEN WAS LIMITED TO THE USE OF NONVERBAL PICTORIAL ITEMS. ITEMS FOR TESTING ABILITY TO IDENTIFY ELEMENTS BELONGING TO AN…

  10. Assessment of Working Memory Capacity in Preschool Children Using the Missing Scan Task

    ERIC Educational Resources Information Center

    Roman, Adrienne S.; Pisoni, David B.; Kronenberger, William G.

    2014-01-01

    The purpose of this study was to investigate the feasibility and validity of a modified version of Buschke's missing scan methodology, the Missing Scan Task (MST), to assess working memory capacity (WMC) and cognitive control processes in preschool children 3-6 years in age. Forty typically developing monolingual English-speaking children between…

  11. How Do Management Students Perceive the Quality of Education in Public Institutions?

    ERIC Educational Resources Information Center

    Narang, Ritu

    2012-01-01

    Purpose: Keeping in mind the urgent need to deliver quality education in higher education institutes, the current paper seeks to measure the quality perception of management students in India. Design/methodology/approach: Based on an exploratory study a modified version of SERVQUAL was employed as the research instrument. Data were collected from…

  12. Digital Skills Acquisition: Future Trends among Older Adults

    ERIC Educational Resources Information Center

    Gilliam, Brian K.

    2011-01-01

    Purpose: The purpose of this study was to identify future trends and barriers that will either facilitate or impede the narrowing of the digital skills divide among older adults during the next 10 years. Methodology: To address the research questions, this study used a modified version of the Delphi process using a panel of experts who…

  13. Inclusion of Radiation Environment Variability in Total Dose Hardness Assurance Methodology

    NASA Technical Reports Server (NTRS)

    Xapsos, M. A.; Stauffer, C.; Phan, A.; McClure, S. S.; Ladbury, R. L.; Pellish, J. A.; Campola, M. J.; LaBel, K. A.

    2016-01-01

    Variability of the space radiation environment is investigated with regard to parts categorization for total dose hardness assurance methods. It is shown that it can have a significant impact. A modified approach is developed that uses current environment models more consistently and replaces the radiation design margin concept with one of failure probability during a mission.

  14. C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component

    NASA Astrophysics Data System (ADS)

    Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.

    2018-06-01

    The failure of tractor components and their replacement have become very common in India because of recycling, resale, and duplication. To overcome the problem of failure, we propose a design methodology for topological change co-simulated with software tools. In the proposed design methodology, the designer checks Paxial, Pcr, Pfailure, and τ by hand calculations, from which refined topological changes of the R.S. Arm are derived. We explain several techniques employed in the component for reduction and removal of rib material to change the center of gravity and centroid point, using SystemC for mixed-level simulation and faster topological changes. The design process in SystemC can be compiled and executed with the TURBO C7 software. The modified component is developed in Pro/E and analyzed in ANSYS. The topologically changed component, with a 120 × 4.75 × 32.5 mm slot at the center, showed greater effectiveness than the original component.

  15. The use of grounded theory in studies of nurses and midwives' coping processes: a systematic literature search.

    PubMed

    Cheer, Karen; MacLaren, David; Tsey, Komla

    2015-01-01

    Researchers are increasingly using grounded theory methodologies to study the professional experience of nurses and midwives. To review common grounded theory characteristics and research design quality as described in grounded theory studies of coping strategies used by nurses and midwives. A systematic database search for 2005-2015 identified and assessed grounded theory characteristics from 16 studies. Study quality was assessed using a modified Critical Appraisal Skills Programme tool. Grounded theory was considered a methodology or a set of methods, able to be used within different nursing and midwifery contexts. Specific research requirements determined the common grounded theory characteristics used in different studies. Most researchers did not clarify their epistemological and theoretical perspectives. To improve research design and trustworthiness of grounded theory studies in nursing and midwifery, researchers need to state their theoretical stance and clearly articulate their use of grounded theory methodology and characteristics in research reporting.

  16. Mathematical modeling of elementary trapping-reduction processes in positron annihilation lifetime spectroscopy: methodology of Ps-to-positron trapping conversion

    NASA Astrophysics Data System (ADS)

    Shpotyuk, Ya; Cebulski, J.; Ingram, A.; Shpotyuk, O.

    2017-12-01

    Methodological possibilities of positron annihilation lifetime (PAL) spectroscopy applied to nanostructurized substances treated within a three-term fitting procedure are reconsidered to parameterize their atomic-deficient structural arrangement. In contrast to conventional three-term fitting analysis of the detected PAL spectra based on admixed positron trapping and positronium (Ps) decaying, the nanostructurization due to guest nanoparticles embedded in a host matrix is considered as producing modified trapping, which involves conversion between these channels. The developed approach, referred to as the x3-x2-coupling decomposition algorithm, allows estimation of the free volumes of interfacial voids responsible for positron trapping and of bulk lifetimes in nanoparticle-embedded substances. This methodology is validated using experimental data of Chakraverty et al. [Phys. Rev. B 71 (2005) 024115] on a PAL study of composites formed by guest NiFe2O4 nanocrystals grown in a host SiO2 matrix.

  17. Telephone-quality pathological speech classification using empirical mode decomposition.

    PubMed

    Kaleem, M F; Ghoraani, B; Guergachi, A; Krishnan, S

    2011-01-01

    This paper presents a computationally simple and effective methodology based on empirical mode decomposition (EMD) for classification of telephone quality normal and pathological speech signals. EMD is used to decompose continuous normal and pathological speech signals into intrinsic mode functions, which are analyzed to extract physically meaningful and unique temporal and spectral features. Using continuous speech samples from a database of 51 normal and 161 pathological speakers, which has been modified to simulate telephone quality speech under different levels of noise, a linear classifier is used with the feature vector thus obtained to obtain a high classification accuracy, thereby demonstrating the effectiveness of the methodology. The classification accuracy reported in this paper (89.7% for signal-to-noise ratio 30 dB) is a significant improvement over previously reported results for the same task, and demonstrates the utility of our methodology for cost-effective remote voice pathology assessment over telephone channels.

  18. Corrections to the MODIS Aqua Calibration Derived From MODIS Aqua Ocean Color Products

    NASA Technical Reports Server (NTRS)

    Meister, Gerhard; Franz, Bryan Alden

    2013-01-01

    Ocean color products, such as chlorophyll-a concentration, can be derived from the top-of-atmosphere radiances measured by imaging sensors on earth-orbiting satellites. There are currently three National Aeronautics and Space Administration sensors in orbit capable of providing ocean color products. One of these sensors is the Moderate Resolution Imaging Spectroradiometer (MODIS) on the Aqua satellite, whose ocean color products are currently the most widely used of the three. A recent improvement to the MODIS calibration methodology has used land targets to improve the calibration accuracy. This study evaluates the new calibration methodology and describes further calibration improvements that are built upon the new methodology by including ocean measurements in the form of global temporally averaged water-leaving reflectance measurements. The calibration improvements presented here mainly modify the calibration at the scan edges, taking advantage of the good performance of the land target trending in the center of the scan.

  20. Optimization of sample preparation variables for wedelolactone from Eclipta alba using Box-Behnken experimental design followed by HPLC identification.

    PubMed

    Patil, A A; Sachin, B S; Shinde, D B; Wakte, P S

    2013-07-01

    Coumestan wedelolactone is an important phytocomponent from Eclipta alba (L.) Hassk. It possesses diverse pharmacological activities, which have prompted the development of various extraction techniques and strategies for its better utilization. The aim of the present study is to develop and optimize supercritical carbon dioxide assisted sample preparation and HPLC identification of wedelolactone from E. alba (L.) Hassk. The response surface methodology was employed to study the optimization of sample preparation using supercritical carbon dioxide for wedelolactone from E. alba (L.) Hassk. The optimized sample preparation involves the investigation of quantitative effects of sample preparation parameters viz. operating pressure, temperature, modifier concentration and time on yield of wedelolactone using Box-Behnken design. The wedelolactone content was determined using validated HPLC methodology. The experimental data were fitted to second-order polynomial equation using multiple regression analysis and analyzed using the appropriate statistical method. By solving the regression equation and analyzing 3D plots, the optimum extraction conditions were found to be: extraction pressure, 25 MPa; temperature, 56 °C; modifier concentration, 9.44% and extraction time, 60 min. Optimum extraction conditions demonstrated wedelolactone yield of 15.37 ± 0.63 mg/100 g E. alba (L.) Hassk, which was in good agreement with the predicted values. Temperature and modifier concentration showed significant effect on the wedelolactone yield. The supercritical carbon dioxide extraction showed higher selectivity than the conventional Soxhlet assisted extraction method. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
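    The response-surface step described above fits a second-order polynomial to the designed runs and locates its stationary point. A minimal sketch for two coded factors (the design points and yields are synthetic, not the paper's data):

    ```python
    import numpy as np

    # Two coded factors x1 (pressure), x2 (temperature); the actual study
    # optimized four factors. Yields are synthetic, generated from a known
    # quadratic with its maximum at (0.25, 0.5) in coded units.

    def true_yield(x1, x2):
        return 15.0 - 2.0 * (x1 - 0.25) ** 2 - 3.0 * (x2 - 0.5) ** 2

    X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],   # factorial points
                  [-1, 0], [1, 0], [0, -1], [0, 1],     # edge midpoints
                  [0, 0], [0, 0]], dtype=float)         # center replicates
    y = np.array([true_yield(x1, x2) for x1, x2 in X])

    # Second-order model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
    A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                         X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
    b, *_ = np.linalg.lstsq(A, y, rcond=None)

    # Stationary point of the fitted surface: solve the 2x2 gradient system
    H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
    opt = np.linalg.solve(H, -np.array([b[1], b[2]]))
    ```

    Because the synthetic response is exactly quadratic, the fit recovers the optimum at (0.25, 0.5); with real yields the same machinery produces the optimized pressure, temperature and modifier settings reported above.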

  1. Classification of Spanish white wines using their electrophoretic profiles obtained by capillary zone electrophoresis with amperometric detection.

    PubMed

    Arribas, Alberto Sánchez; Martínez-Fernández, Marta; Moreno, Mónica; Bermejo, Esperanza; Zapardiel, Antonio; Chicharro, Manuel

    2014-06-01

    A method was developed for the simultaneous detection of eight polyphenols (t-resveratrol, (+)-catechin, quercetin and p-coumaric, caffeic, sinapic, ferulic, and gallic acids) by CZE with electrochemical detection. Separation of these polyphenols was achieved within 25 min using a 200 mM borate buffer (pH 9.4) containing 10% methanol as separation electrolyte. Amperometric detection of polyphenols was carried out with a glassy carbon electrode (GCE) modified with a multiwalled carbon nanotubes (CNT) layer obtained from a dispersion of CNT in polyethylenimine. The excellent electrochemical properties of this modified electrode allowed the detection and quantification of the selected polyphenols in white wines without any pretreatment step, showing remarkable signal stability despite the presence of potential fouling substances in wine. The electrophoretic profiles of white wines, obtained using this methodology, have proven to be useful for the classification of these wines by means of chemometric multivariate techniques. Principal component analysis and discriminant analysis allowed accurate classification of wine samples on the basis of their grape varietal (verdejo and airén) using the information contained in selected zones of the electropherogram. The utility of the proposed CZE methodology based on the electrochemical response of CNT-modified electrodes appears to be promising in the field of wine industry and it is expected to be successfully extended to classification of a wider range of wines made of other grape varietals. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Contribution of European research to risk analysis.

    PubMed

    Boenke, A

    2001-12-01

    The European Commission's Quality of Life Research Programme, Key Action 1-Health, Food & Nutrition, is mission-oriented and aims, amongst other things, at providing a healthy, safe and high-quality food supply leading to reinforced consumer confidence in the safety of European food. Its objectives also include enhancing the competitiveness of the European food supply. Key Action 1 is currently supporting a number of different types of European collaborative projects in the area of risk analysis. The objectives of these projects include: development and validation of prevention strategies, including the reduction of consumer risks; development and validation of new modelling approaches; harmonization of risk assessment principles, methodologies and terminology; standardization of methods and systems used for the safety evaluation of transgenic food; provision of tools for the evaluation of human viral contamination of shellfish and quality control; new methodologies for assessing the potential unintended effects of genetically modified (GM) foods; development of a risk assessment model for Cryptosporidium parvum related to the food and water industries; development of a communication platform for GM organism producers, retailers, regulatory authorities and consumer groups to improve safety assessment procedures, risk management strategies and risk communication; development and validation of new methods for safety testing of transgenic food; evaluation of the safety and efficacy of iron supplementation in pregnant women; and evaluation of the potential cancer-preventing activity of pro- and pre-biotic ('synbiotic') combinations in human volunteers. An overview of these projects is presented here.

  3. Uptake Mechanism of ApoE-Modified Nanoparticles on Brain Capillary Endothelial Cells as a Blood-Brain Barrier Model

    PubMed Central

    Wagner, Sylvia; Zensi, Anja; Wien, Sascha L.; Tschickardt, Sabrina E.; Maier, Wladislaw; Vogel, Tikva; Worek, Franz; Pietrzik, Claus U.; Kreuter, Jörg; von Briesen, Hagen

    2012-01-01

    Background The blood-brain barrier (BBB) represents an insurmountable obstacle for most drugs thus obstructing an effective treatment of many brain diseases. One solution for overcoming this barrier is a transport by binding of these drugs to surface-modified nanoparticles. Especially apolipoprotein E (ApoE) appears to play a major role in the nanoparticle-mediated drug transport across the BBB. However, at present the underlying mechanism is incompletely understood. Methodology/Principal Findings In this study, the uptake of the ApoE-modified nanoparticles into the brain capillary endothelial cells was investigated to differentiate between active and passive uptake mechanism by flow cytometry and confocal laser scanning microscopy. Furthermore, different in vitro co-incubation experiments were performed with competing ligands of the respective receptor. Conclusions/Significance This study confirms an active endocytotic uptake mechanism and shows the involvement of low density lipoprotein receptor family members, notably the low density lipoprotein receptor related protein, on the uptake of the ApoE-modified nanoparticles into the brain capillary endothelial cells. This knowledge of the uptake mechanism of ApoE-modified nanoparticles enables future developments to rationally create very specific and effective carriers to overcome the blood-brain barrier. PMID:22396775

  4. Strategies for an enzyme immobilization on electrodes: Structural and electrochemical characterizations

    NASA Astrophysics Data System (ADS)

    Ganesh, V.; Muthurasu, A.

    2012-04-01

    In this paper, we propose various strategies for enzyme immobilization on electrodes (both metal and semiconductor electrodes). In general, the proposed methodology involves two critical steps: (1) chemical modification of substrates using functional monolayers [Langmuir-Blodgett (LB) films and/or self-assembled monolayers (SAMs)] and (2) anchoring of a target enzyme through specific chemical and physical interactions with the terminal functionality of the modified films. Basically, there are three ways to immobilize an enzyme on chemically modified electrodes. The first method consists of an electrostatic interaction between the enzyme and the terminal functional groups present within the chemically modified films. The second and third methods involve the introduction of nanomaterials followed by enzyme immobilization using both physical and chemical adsorption processes. As a proof of principle, in this work we demonstrate the sensing and catalytic activity of horseradish peroxidase (HRP) anchored onto SAM-modified indium tin oxide (ITO) electrodes towards hydrogen peroxide (H2O2). Structural characterization of the modified electrodes is performed using X-ray photoelectron spectroscopy (XPS), atomic force microscopy (AFM) and contact angle measurements. The binding events and the enzymatic reactions are monitored using electrochemical techniques, mainly cyclic voltammetry (CV).

  5. Development and application of stir bar sorptive extraction with polyurethane foams for the determination of testosterone and methenolone in urine matrices.

    PubMed

    Sequeiros, R C P; Neng, N R; Portugal, F C M; Pinto, M L; Pires, J; Nogueira, J M F

    2011-04-01

    This work describes the development, validation, and application of a novel methodology for the determination of testosterone and methenolone in urine matrices by stir bar sorptive extraction using polyurethane foams [SBSE(PU)], followed by liquid desorption and high-performance liquid chromatography with diode array detection. The methodology was optimized in terms of extraction time, agitation speed, pH, ionic strength and organic modifier, as well as back-extraction solvent and desorption time. Under optimized experimental conditions, convenient accuracy was achieved, with average recoveries of 49.7 ± 8.6% for testosterone and 54.2 ± 4.7% for methenolone. Additionally, the methodology showed good precision (<9%), excellent linear dynamic ranges (>0.9963) and convenient detection limits (0.2-0.3 μg/L). When comparing the efficiency obtained by SBSE(PU) with that of the conventional polydimethylsiloxane phase [SBSE(PDMS)], yields up to four-fold higher are attained for the former under the same experimental conditions. The application of the proposed methodology to the analysis of testosterone and methenolone in urine matrices showed negligible matrix effects and good analytical performance.

  6. Using Mathematical Algorithms to Modify Glomerular Filtration Rate Estimation Equations

    PubMed Central

    Zhu, Bei; Wu, Jianqing; Zhu, Jin; Zhao, Weihong

    2013-01-01

    Background Estimation equations provide a rapid and low-cost method of evaluating glomerular filtration rate (GFR). Previous studies indicated that the Modification of Diet in Renal Disease (MDRD), Chronic Kidney Disease-Epidemiology (CKD-EPI) and MacIsaac equations need further modification for application in the Chinese population. Thus, this study was designed to modify the three equations and to compare the diagnostic accuracy of the equations before and after modification. Methodology With the use of 99mTc-DTPA renal dynamic imaging as the reference GFR (rGFR), the MDRD, CKD-EPI and MacIsaac equations were modified by two mathematical algorithms: the hill-climbing and the simulated-annealing algorithms. Results A total of 703 Chinese subjects were recruited, with an average rGFR of 77.14±25.93 ml/min. The entire modification process was based on a random sample of 80% of subjects in each GFR level as a training sample set, with the remaining 20% of subjects as a validation sample set. After modification, all three equations showed significant improvements in slope, intercept, correlation coefficient, root mean square error (RMSE), total deviation index (TDI), and the proportion of estimated GFR (eGFR) within 10% and 30% deviation of rGFR (P10 and P30). Of the three modified equations, the modified CKD-EPI equation showed the best accuracy. Conclusions Mathematical algorithms can be a valuable tool for modifying GFR equations. The accuracy of all three modified equations was significantly improved, with the modified CKD-EPI equation being the optimal one. PMID:23472113
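    The simulated-annealing modification described above can be sketched as tuning the coefficients of an MDRD-style power-law equation to minimize RMSE against reference GFR values. Everything below (the cohort, the coefficients, the step sizes, the cooling schedule) is invented for illustration; the study tuned the real equations against 99mTc-DTPA rGFR.

    ```python
    import math
    import random

    random.seed(0)

    # Synthetic cohort: "reference GFR" generated from a known MDRD-like law
    # eGFR = a * Scr^b * Age^c, so the annealer has a recoverable target.
    TRUE = (175.0, -1.154, -0.203)

    def egfr(coef, scr, age):
        a, b, c = coef
        return a * scr ** b * age ** c

    cohort = [(random.uniform(0.6, 3.0), random.uniform(20, 80)) for _ in range(200)]
    rgfr = [egfr(TRUE, s, a) for s, a in cohort]

    def rmse(coef):
        return math.sqrt(sum((egfr(coef, s, a) - r) ** 2
                             for (s, a), r in zip(cohort, rgfr)) / len(cohort))

    def anneal(start, steps=5000, t0=5.0):
        cur, cur_cost = list(start), rmse(start)
        best, best_cost = list(cur), cur_cost
        for k in range(steps):
            t = t0 * (1.0 - k / steps) + 1e-9            # linear cooling schedule
            cand = [x + random.gauss(0, 0.01 * max(abs(x), 1.0)) for x in cur]
            cost = rmse(cand)
            # accept improvements always, worse moves with Boltzmann probability
            if cost < cur_cost or random.random() < math.exp((cur_cost - cost) / t):
                cur, cur_cost = cand, cost
                if cost < best_cost:
                    best, best_cost = list(cand), cost
        return best, best_cost

    start = (186.0, -1.0, -0.15)   # deliberately detuned starting coefficients
    coef, err = anneal(start)
    ```

    The hill-climbing variant reported in the paper is the same loop with the Boltzmann acceptance removed, so only improving moves are taken.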

  7. Use of the single-breath method of estimating cardiac output during exercise-stress testing.

    NASA Technical Reports Server (NTRS)

    Buderer, M. C.; Rummel, J. A.; Sawin, C. F.; Mauldin, D. G.

    1973-01-01

    The single-breath cardiac output measurement technique of Kim et al. (1966) has been modified for use in obtaining cardiac output measurements during exercise-stress tests on Apollo astronauts. The modifications involve the use of a respiratory mass spectrometer for data acquisition and a digital computer program for data analysis. The variation of the modified method for triplicate steady-state cardiac output measurements was plus or minus 1 liter/min. The combined physiological and methodological variation seen during a set of three exercise tests on a series of subjects was 1 to 2.5 liter/min. Comparison of the modified method with the direct Fick technique showed that although the single-breath values were consistently low, the scatter of data was small and the correlation between the two methods was high. Possible reasons for the low single-breath cardiac output values are discussed.

  8. Modified Methodology for Projecting Coastal Louisiana Land Changes over the Next 50 Years

    USGS Publications Warehouse

    Hartley, Steve B.

    2009-01-01

    The coastal Louisiana landscape is continually undergoing geomorphologic changes (in particular, land loss); however, after the 2005 hurricane season, the changes were intensified because of Hurricanes Katrina and Rita. The amount of land loss caused by the 2005 hurricane season was 42 percent (562 km2) of the total land loss (1,329 km2) that was projected for the next 50 years in the Louisiana Coastal Area (LCA), Louisiana Ecosystem Restoration Study. The purpose of this study is to provide information on potential changes to coastal Louisiana by using a revised LCA study methodology. In the revised methodology, we used classified Landsat TM satellite imagery from 1990, 2001, 2004, and 2006 to calculate the 'background' or ambient land-water change rates but divided the Louisiana coastal area differently on the basis of (1) geographic regions ('subprovinces') and (2) specific homogeneous habitat types. Defining polygons by subprovinces (1, Pontchartrain Basin; 2, Barataria Basin; 3, Vermilion/Terrebonne Basins; and 4, the Chenier Plain area) allows a specific erosion rate to be applied to each area. Further subdividing the subprovinces by habitat type allows specific erosion rates for a particular vegetation type to be applied. Our modified methodology resulted in 24 polygons rather than the 183 that were used in the LCA study; further, actively managed areas and the CWPPRA areas were not masked out and dealt with separately as in the LCA study. This revised methodology assumes that erosion rates for habitat types by subprovince are under the influence of similar environmental conditions (sediment depletion, subsidence, and saltwater intrusion). Background change rates for three time periods (1990-2001, 1990-2004, and 1990-2006) were calculated by taking the difference in water or land between each pair of dates and dividing it by the time interval. This calculation gives an annual change rate for each polygon per time period. 
Change rates for each time period were then used to compute the projected change in each subprovince and habitat type over 50 years by using the same compound rate functions used in the LCA study. The resulting maps show projected land changes based on the revised methodology and inclusion of damage by Hurricanes Katrina and Rita. Comparison of projected land change values between the LCA study and this study shows that this revised methodology - that is, using a reduced polygon subset (reduced from 183 to 24) based on habitat type and subprovince - can be used as a quick projection of land loss.
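
    The background-rate and projection calculations described above reduce to a per-polygon compound-rate computation. A minimal sketch, with the compound-rate form assumed from the study's description and with illustrative (invented) areas:

```python
def annual_change_rate(area_t1, area_t2, years):
    """Compound annual rate of land change between two classified images
    (assumed form of the LCA study's compound rate function)."""
    return (area_t2 / area_t1) ** (1.0 / years) - 1.0

def project_area(area_now, rate, horizon_years=50):
    """Project a polygon's land area forward by compounding the annual rate."""
    return area_now * (1.0 + rate) ** horizon_years
```

    For example, a polygon shrinking from 100 km2 to 95.5 km2 between 1990 and 2006 (16 years) yields a small negative annual rate, which compounds to a further loss over the 50-year horizon.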

  9. NMR-Metabolic Methodology in the Study of GM Foods

    PubMed Central

    Sobolev, Anatoly P.; Capitani, Donatella; Giannino, Donato; Nicolodi, Chiara; Testone, Giulio; Santoro, Flavio; Frugis, Giovanna; Iannelli, Maria A.; Mattoo, Autar K.; Brosio, Elvino; Gianferri, Raffaella; D’Amico, Irene; Mannina, Luisa

    2010-01-01

    The 1H-NMR methodology used in the study of genetically modified (GM) foods is discussed. Transgenic lettuce (Lactuca sativa cv "Luxor") over-expressing the Arabidopsis KNAT1 gene is presented as a case study. Twenty-two water-soluble metabolites (amino acids, organic acids, sugars) present in leaves of conventional and GM lettuce were monitored by NMR and quantified at two developmental stages. The NMR spectra did not reveal any difference in metabolite composition between the GM lettuce and its wild-type counterpart. Statistical analyses of the metabolite variables highlighted variation in metabolism as a function of leaf development as well as of the transgene. A main effect of the transgene was an alteration of sugar metabolism. PMID:22253988

  10. Model-based testing with UML applied to a roaming algorithm for bluetooth devices.

    PubMed

    Dai, Zhen Ru; Grabowski, Jens; Neukirchen, Helmut; Pals, Holger

    2004-11-01

    In late 2001, the Object Management Group issued a Request for Proposal to develop a testing profile for UML 2.0. In June 2003, the UML 2.0 Testing Profile was adopted by the OMG, and since March 2004 it has been an official OMG standard. The UML 2.0 Testing Profile provides support for UML-based model-driven testing. This paper introduces a methodology for using the testing profile to modify and extend an existing UML design model for test issues. The application of the methodology is explained by applying it to an existing UML model for a Bluetooth device.

  11. A system and methodology for measuring volatile organic compounds produced by hydroponic lettuce in a controlled environment

    NASA Technical Reports Server (NTRS)

    Charron, C. S.; Cantliffe, D. J.; Wheeler, R. M.; Manukian, A.; Heath, R. R.

    1996-01-01

    A system and methodology were developed for the nondestructive qualitative and quantitative analysis of volatile emissions from hydroponically grown 'Waldmann's Green' leaf lettuce (Lactuca sativa L.). Photosynthetic photon flux (PPF), photoperiod, and temperature were automatically controlled and monitored in a growth chamber modified for the collection of plant volatiles. The lipoxygenase pathway products (Z)-3-hexenal, (Z)-3-hexenol, and (Z)-3-hexenyl acetate were emitted by lettuce plants after the transition from the light period to the dark period. The volatile collection system developed in this study enabled measurements of volatiles emitted by intact plants, from planting to harvest, under controlled environmental conditions.

  12. Methodologic quality of meta-analyses and systematic reviews on the Mediterranean diet and cardiovascular disease outcomes: a review.

    PubMed

    Huedo-Medina, Tania B; Garcia, Marissa; Bihuniak, Jessica D; Kenny, Anne; Kerstetter, Jane

    2016-03-01

    Several systematic reviews/meta-analyses published within the past 10 y have examined the associations of Mediterranean-style diets (MedSDs) on cardiovascular disease (CVD) risk. However, these reviews have not been evaluated for satisfying contemporary methodologic quality standards. This study evaluated the quality of recent systematic reviews/meta-analyses on MedSD and CVD risk outcomes by using an established methodologic quality scale. The relation between review quality and impact per publication value of the journal in which the article had been published was also evaluated. To assess compliance with current standards, we applied a modified version of the Assessment of Multiple Systematic Reviews (AMSTARMedSD) quality scale to systematic reviews/meta-analyses retrieved from electronic databases that had met our selection criteria: 1) used systematic or meta-analytic procedures to review the literature, 2) examined MedSD trials, and 3) had MedSD interventions independently or combined with other interventions. Reviews completely satisfied from 8% to 75% of the AMSTARMedSD items (mean ± SD: 31.2% ± 19.4%), with those published in higher-impact journals having greater quality scores. At a minimum, 60% of the 24 reviews did not disclose full search details or apply appropriate statistical methods to combine study findings. Only 5 of the reviews included participant or study characteristics in their analyses, and none evaluated MedSD diet characteristics. These data suggest that current meta-analyses/systematic reviews evaluating the effect of MedSD on CVD risk do not fully comply with contemporary methodologic quality standards. As a result, there are more research questions to answer to enhance our understanding of how MedSD affects CVD risk or how these effects may be modified by the participant or MedSD characteristics. 
To clarify the associations between MedSD and CVD risk, future meta-analyses and systematic reviews should not only follow methodologic quality standards but also include more statistical modeling results when data allow. © 2016 American Society for Nutrition.

  13. Development and Application of a Systems Engineering Framework to Support Online Course Design and Delivery

    ERIC Educational Resources Information Center

    Bozkurt, Ipek; Helm, James

    2013-01-01

    This paper develops a systems engineering-based framework to assist in the design of an online engineering course. Specifically, the purpose of the framework is to provide a structured methodology for the design, development and delivery of a fully online course, either brand new or modified from an existing face-to-face course. The main strength…

  14. Assessing the Availability of Fertility Regulation Methods: Report on a Methodological Study. Scientific Reports, No. 1, February 1977.

    ERIC Educational Resources Information Center

    Rodriguez, German

    The report investigates the problems of assessing the availability of fertility regulation methods in the household and the community. The study originated from the need to evaluate a number of proposed additions to the data collection instruments used by the World Fertility Survey (WFS). The core questionnaire was modified to add the following…

  15. How Homeless Sector Workers Deal with the Death of Service Users: A Grounded Theory Study

    ERIC Educational Resources Information Center

    Lakeman, Richard

    2011-01-01

    Homeless sector workers often encounter the deaths of service users. A modified grounded theory methodology project was used to explore how workers make sense of, respond to, and cope with sudden death. In-depth interviews were undertaken with 16 paid homeless sector workers who had experienced the death of someone with whom they worked.…

  16. An Analysis of Factors that Inhibit Business Use of User-Centered Design Principles: A Delphi Study

    ERIC Educational Resources Information Center

    Hilton, Tod M.

    2010-01-01

    The use of user-centered design (UCD) principles has a positive impact on the use of web-based interactive systems in customer-centric organizations. User-centered design methodologies are not widely adopted in organizations due to intraorganizational factors. A qualitative study using a modified Delphi technique was used to identify the factors…

  17. Teaching History of Architecture--Moving from a Knowledge Transfer to a Multi-Participative Methodology Based on IT Tools

    ERIC Educational Resources Information Center

    Cimadomo, Guido

    2014-01-01

    The changes that the European Higher Education Area (EHEA) framework obliged the School of Architecture of Malaga, University of Malaga, to make to its "History of Architecture" course are discussed in this paper. It was taken up as an opportunity to modify the whole course, introducing creative teaching and "imaginative…

  18. Inclusion of Radiation Environment Variability in Total Dose Hardness Assurance Methodology

    PubMed Central

    Xapsos, M.A.; Stauffer, C.; Phan, A.; McClure, S.S.; Ladbury, R.L.; Pellish, J.A.; Campola, M.J.; LaBel, K.A.

    2017-01-01

    Variability of the space radiation environment is investigated with regard to parts categorization for total dose hardness assurance methods. It is shown that it can have a significant impact. A modified approach is developed that uses current environment models more consistently and replaces the radiation design margin concept with one of failure probability during a mission. PMID:28804156
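
    The shift from a fixed radiation design margin to a mission failure probability can be illustrated as follows. The lognormal dose model and all numbers are assumptions for the sketch, not values from the paper:

```python
import math

def normal_cdf(x):
    """Standard normal CDF computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def failure_probability(part_capability_krad, median_mission_dose_krad, sigma_ln):
    """P(mission total dose exceeds the part's tolerance), with the dose
    modeled as lognormal to capture environment variability."""
    z = (math.log(part_capability_krad) - math.log(median_mission_dose_krad)) / sigma_ln
    return 1.0 - normal_cdf(z)
```

    Under these assumptions, a part tolerant to 30 krad flown in a mission with a 10 krad median dose and sigma_ln = 0.5 carries a failure probability of roughly 1.4%, even though its nominal design margin is 3x; increasing the environment variability raises that probability.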

  19. Empathic Technologies for Distance/Mobile Learning: An Empirical Research Based on the Unified Theory of Acceptance and Use of Technology (UTAUT)

    ERIC Educational Resources Information Center

    Isaias, Pedro; Reis, Francisco; Coutinho, Clara; Lencastre, Jose Alberto

    2017-01-01

    Purpose: This paper examines the acceptance, of a group of 79 students, of an educational forum, used for mobile and distance learning, that has been modified to include empathic characteristics and affective principles. Design/Methodology/Approach: With this study is proposed that the introduction of empathic and affective principles in…

  20. Health Status of Immigrants from Nepal in the United States: Preliminary Findings and Methodological Issues

    ERIC Educational Resources Information Center

    Bajracharya, Srijana M.; Bentley, Mary K.

    2006-01-01

    The purpose of this study was to examine the perceived health status, health behaviors, family values and relationships of a select group of Nepalese in the US. Nepalese are a small minority group most often categorized in the U.S. under the Asian and Pacific Islanders cluster. A modified BRFSS (Behavioral Risk Factor Surveillance System) survey…

  1. What is a Baseline for Effective Information Technology Governance for Higher Education Institutions that are Members of Research University CIO Conclave in United States?

    ERIC Educational Resources Information Center

    Mohseni, Maryam

    2012-01-01

    This research study provides the findings of a modified Delphi methodology conducted to define the components of, and a baseline for, effective information technology governance for higher education institutions that are members of the Research University CIO Conclave (RUCC) in the United States. The participating experts are Chief Information Officers (CIOs) of…

  2. Exploring the Importance of Soft and Hard Skills as Perceived by IT Internship Students and Industry: A Gap Analysis

    ERIC Educational Resources Information Center

    Patacsil, Frederick F.; Tablatin, Christine Lourrine S.

    2017-01-01

    The research paper proposes a skills gap methodology that utilized the respondents' experiences in the internship program to measure the importance of the Information Technology (IT) skills gap as perceived by IT students and the industry. The questionnaires were formulated based on previous studies; however, they were slightly modified, validated and…

  3. Analysis of recent failures of disease modifying therapies in Alzheimer's disease suggesting a new methodology for future studies.

    PubMed

    Amanatkar, Hamid Reza; Papagiannopoulos, Bill; Grossberg, George Thomas

    2017-01-01

    Pharmaceutical companies and the NIH have invested heavily in a variety of potential disease-modifying therapies for Alzheimer's disease (AD), but unfortunately all double-blind, placebo-controlled Phase III studies of these drugs have failed to show statistically significant results supporting their clinical efficacy on cognitive measures. These negative results are surprising, as most of these medications have the capability to impact the biomarkers associated with progression of Alzheimer's disease. Areas covered: This contradiction prompted us to review all study phases of Intravenous Immunoglobulin (IVIG), Bapineuzumab, Solanezumab, Avagacestat and Dimebolin to shed more light on these recent failures. We critically analyzed these studies, recommending seven lessons from these failures that should not be overlooked. Expert commentary: We suggest a new methodology for future treatment research in Alzheimer's disease considering early intervention with more focus on cognitive decline as a screening tool, more sophisticated exclusion criteria with more reliance on biomarkers, stratification of subjects based on the rate of cognitive decline aiming for less heterogeneity, and a longer study duration with periodic assessment of cognition and activities of daily living during the study and also after a washout period.

  4. Modifiers of breast and ovarian cancer risks for BRCA1 and BRCA2 mutation carriers.

    PubMed

    Milne, Roger L; Antoniou, Antonis C

    2016-10-01

    Pathogenic mutations in BRCA1 and BRCA2 are associated with high risks of breast and ovarian cancer. However, penetrance estimates for mutation carriers have been found to vary substantially between studies, and the observed differences in risk are consistent with the hypothesis that genetic and environmental factors modify cancer risks for women with these mutations. Direct evidence that this is the case has emerged in the past decade, through large-scale international collaborative efforts. Here, we describe the methodological challenges in the identification and characterisation of these risk-modifying factors, review the latest evidence on genetic and lifestyle/hormonal risk factors that modify breast and ovarian cancer risks for women with BRCA1 and BRCA2 mutations and outline the implications of these findings for cancer risk prediction. We also review the unresolved issues in this area of research and identify strategies of clinical implementation so that women with BRCA1 and BRCA2 mutations are no longer counselled on the basis of 'average' risk estimates. © 2016 Society for Endocrinology.

  5. Layer-by-layer assembly surface modified microbial biomass for enhancing biorecovery of secondary gold.

    PubMed

    Zhou, Ying; Zhu, Nengwu; Kang, Naixin; Cao, Yanlan; Shi, Chaohong; Wu, Pingxiao; Dang, Zhi; Zhang, Xiaoping; Qin, Benqian

    2017-02-01

    Enhancement of the biosorption capacity for gold is highly desirable for the biorecovery of secondary gold resources. In this study, polyethylenimine (PEI) was grafted onto the Shewanella haliotis surface through a layer-by-layer assembly approach so as to improve the biosorption capacity for Au(III). Results showed that the relative contribution of the amino group to the biosorption of Au(III) was the largest (about 44%). After successfully grafting 1, 2 and 3 layers of PEI onto the surface of the biomass, the biosorption capacity was significantly enhanced from 143.8 mg/g to 597.1, 559.1, and 536.8 mg/g, respectively. Interestingly, the biomass modified with 1 layer of PEI exhibited a biosorption capacity 4.2 times higher than that of the untreated control. When the conditions for the 1-layer modified biomass were optimized by response surface methodology, the theoretical maximum adsorption capacity reached 727.3 mg/g. All findings demonstrated that PEI-modified S. haliotis was effective for enhancing gold biorecovery. Copyright © 2016 Elsevier Ltd. All rights reserved.
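
    The response-surface optimization step can be sketched in one factor: fit a second-order model to measured capacities and locate its stationary point. The pH/capacity data below are hypothetical; the actual study optimized several conditions jointly:

```python
import numpy as np

def quadratic_optimum(x, y):
    """Fit a second-order polynomial (the one-factor core of response
    surface methodology) and return its stationary point (x_opt, y_opt)."""
    c2, c1, c0 = np.polyfit(x, y, 2)
    x_opt = -c1 / (2.0 * c2)
    return x_opt, np.polyval([c2, c1, c0], x_opt)
```

    With hypothetical capacities peaking near pH 3, the fitted surface recovers that optimum; the multi-factor case replaces the polynomial with a full second-order model in all conditions.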

  6. Modified constraint-induced movement therapy or bimanual occupational therapy following injection of Botulinum toxin-A to improve bimanual performance in young children with hemiplegic cerebral palsy: a randomised controlled trial methods paper

    PubMed Central

    2010-01-01

    Background Use of Botulinum toxin-A (BoNT-A) for treatment of upper limb spasticity in children with cerebral palsy has become routine clinical practice in many paediatric treatment centres worldwide. There is now high-level evidence that upper limb BoNT-A injection, in combination with occupational therapy, improves outcomes in children with cerebral palsy at both the body function/structure and activity level domains of the International Classification of Functioning, Disability and Health. Investigation is now required to establish what amount and specific type of occupational therapy will further enhance functional outcomes and prolong the beneficial effects of BoNT-A. Methods/Design A randomised, controlled, evaluator-blinded, prospective parallel-group trial. Eligible participants were children aged 18 months to 6 years, diagnosed with spastic hemiplegic cerebral palsy, who were able to demonstrate selective motor control of the affected upper limb. Both groups received upper limb injections of BoNT-A. Children were randomised to either the modified constraint-induced movement therapy group (experimental) or the bimanual occupational therapy group (control). Outcome assessments were undertaken pre-injection and at 1, 3 and 6 months following injection of BoNT-A. The primary outcome measure was the Assisting Hand Assessment. Secondary outcomes included the Quality of Upper Extremity Skills Test; the Pediatric Evaluation of Disability Inventory; the Canadian Occupational Performance Measure; Goal Attainment Scaling; the Pediatric Motor Activity Log; the modified Ashworth Scale; and the modified Tardieu Scale. 
Discussion The aim of this paper is to describe the methodology of a randomised controlled trial comparing the effects of modified constraint-induced movement therapy (a uni-manual therapy) versus bimanual occupational therapy (a bimanual therapy) on improving bimanual upper limb performance of children with hemiplegic cerebral palsy following upper limb injection of BoNT-A. The paper outlines the background to the study, the study hypotheses, outcome measures and trial methodology. It also provides a comprehensive description of the interventions provided. Trial Registration ACTRN12605000002684 PMID:20602795

  7. A methodology for identification and control of electro-mechanical actuators

    PubMed Central

    Tutunji, Tarek A.; Saleem, Ashraf

    2015-01-01

    Mechatronic systems are fully-integrated engineering systems that are composed of mechanical, electronic, and computer control sub-systems. These integrated systems use electro-mechanical actuators to cause the required motion. Therefore, the design of appropriate controllers for these actuators is an essential step in mechatronic system design. In this paper, a three-stage methodology for real-time identification and control of electro-mechanical actuator plants is presented, tested, and validated. First, identification models are constructed from experimental data to approximate the plants' response. Second, the identified model is used in a simulation environment for the purpose of designing a suitable controller. Finally, the designed controller is applied and tested on the real plant through a Hardware-in-the-Loop (HIL) environment. The described three-stage methodology provides the following practical contributions: • Establishes an easy-to-follow methodology for controller design of electro-mechanical actuators. • Combines off-line and on-line controller design for practical performance. • Modifies the HIL concept by using physical plants with computer control (rather than virtual plants with physical controllers). Simulated and experimental results for two case studies, an induction motor and a vehicle drive system, are presented in order to validate the proposed methodology. These results showed that electro-mechanical actuators can be identified and controlled using an easy-to-duplicate and flexible procedure. PMID:26150992
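
    Stage one of the methodology (fitting an identification model to experimental data) can be sketched with ordinary least squares on an assumed first-order discrete-time model y[k+1] ≈ a*y[k] + b*u[k]; the actual model structures used for the motor and drive case studies may differ:

```python
import numpy as np

def identify_first_order(u, y):
    """Least-squares fit of y[k+1] = a*y[k] + b*u[k] from an input record u
    (length N) and an output record y (length N+1). Returns (a, b)."""
    X = np.column_stack([y[:-1], u])
    theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return theta[0], theta[1]
```

    The identified (a, b) pair then feeds stage two (controller design in simulation) before stage three's hardware-in-the-loop test of the designed controller on the physical plant.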

  9. Assessment of intracranial collaterals on CT angiography in anterior circulation acute ischemic stroke.

    PubMed

    Yeo, L L L; Paliwal, P; Teoh, H L; Seet, R C; Chan, B P; Ting, E; Venketasubramanian, N; Leow, W K; Wakerley, B; Kusama, Y; Rathakrishnan, R; Sharma, V K

    2015-02-01

    Intracranial collaterals influence the prognosis of patients treated with intravenous tissue plasminogen activator in acute anterior circulation ischemic stroke. We compared the methods of scoring collaterals on pre-tPA brain CT angiography for predicting functional outcomes in acute anterior circulation ischemic stroke. Two hundred consecutive patients with acute anterior circulation ischemic stroke treated with IV-tPA during 2010-2012 were included. Two independent neuroradiologists evaluated intracranial collaterals by using the Miteff system, Maas system, the modified Tan scale, and the Alberta Stroke Program Early CT Score 20-point methodology. Good and extremely poor outcomes at 3 months were defined by modified Rankin Scale scores of 0-1 and 5-6 points, respectively. Factors associated with good outcome on univariable analysis were younger age, female sex, hypertension, diabetes mellitus, atrial fibrillation, small infarct core (ASPECTS ≥8), vessel recanalization, lower pre-tPA NIHSS scores, and good collaterals according to Tan methodology, ASPECTS methodology, and Miteff methodology. On multivariable logistic regression, only lower NIHSS scores (OR, 1.186 per point; 95% CI, 1.079-1.302; P = .001), recanalization (OR, 5.599; 95% CI, 1.560-20.010; P = .008), and good collaterals by the Miteff method (OR, 3.341; 95% CI, 1.203-5.099; P = .014) were independent predictors of good outcome. Poor collaterals by the Miteff system (OR, 2.592; 95% CI, 1.113-6.038; P = .027), Maas system (OR, 2.580; 95% CI, 1.075-6.187; P = .034), and ASPECTS method ≤5 points (OR, 2.685; 95% CI, 1.156-6.237; P = .022) were independent predictors of extremely poor outcomes. Only the Miteff scoring system for intracranial collaterals is reliable for predicting favorable outcome in thrombolyzed acute anterior circulation ischemic stroke. However, poor outcomes can be predicted by most of the existing methods of scoring intracranial collaterals. 
© 2015 by American Journal of Neuroradiology.

  10. Protocol for the Solid-phase Synthesis of Oligomers of RNA Containing a 2'-O-thiophenylmethyl Modification and Characterization via Circular Dichroism.

    PubMed

    Francis, Andrew J; Resendiz, Marino J E

    2017-07-28

    Solid-phase synthesis has been used to obtain canonical and modified polymers of nucleic acids, specifically of DNA or RNA, which has made it a popular methodology for applications in various fields and for different research purposes. The procedure described herein focuses on the synthesis, purification, and characterization of dodecamers of RNA 5'-[CUA CGG AAU CAU]-3' containing zero, one, or two modifications located at the C2'-O-position. The probes are based on 2-thiophenylmethyl groups, incorporated into RNA nucleotides via standard organic synthesis and introduced into the corresponding oligonucleotides via their respective phosphoramidites. This report makes use of phosphoramidite chemistry via the four canonical nucleobases (Uridine (U), Cytosine (C), Guanosine (G), Adenosine (A)), as well as 2-thiophenylmethyl functionalized nucleotides modified at the 2'-O-position; however, the methodology is amenable for a large variety of modifications that have been developed over the years. The oligonucleotides were synthesized on a controlled-pore glass (CPG) support followed by cleavage from the resin and deprotection under standard conditions, i.e., a mixture of ammonia and methylamine (AMA) followed by hydrogen fluoride/triethylamine/N-methylpyrrolidinone. The corresponding oligonucleotides were purified via polyacrylamide electrophoresis (20% denaturing) followed by elution, desalting, and isolation via reversed-phase chromatography (Sep-pak, C18-column). Quantification and structural parameters were assessed via ultraviolet-visible (UV-vis) and circular dichroism (CD) photometric analysis, respectively. This report aims to serve as a resource and guide for beginner and expert researchers interested in embarking in this field. It is expected to serve as a work-in-progress as new technologies and methodologies are developed. 
The description of the methodologies and techniques within this document corresponds to a DNA/RNA synthesizer (refurbished and purchased in 2013) that uses phosphoramidite chemistry.
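
    The UV-vis quantification step rests on the Beer-Lambert law; a minimal sketch, with the dodecamer's extinction coefficient taken as an assumed illustrative value rather than the strand's actual figure:

```python
def oligo_concentration_uM(a260, epsilon_M_cm, path_cm=1.0):
    """Beer-Lambert estimate of strand concentration (in micromolar) from
    absorbance at 260 nm and the strand's molar extinction coefficient."""
    return a260 / (epsilon_M_cm * path_cm) * 1e6
```

    With an assumed epsilon of 120,000 M^-1 cm^-1 for a 12-mer and a 1 cm path, an A260 of 0.6 corresponds to 5 uM of strand.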

  11. Curated Collection for Educators: Five Key Papers about the Flipped Classroom Methodology

    PubMed Central

    Boysen-Osborn, Megan; Cooney, Robert; Mitzman, Jennifer; Misra, Asit; Williams, Jennifer; Dulani, Tina; Gottlieb, Michael

    2017-01-01

    The flipped classroom (FC) pedagogy is becoming increasingly popular in medical education due to its appeal to the millennial learner and potential benefits in knowledge acquisition. Despite its popularity and effectiveness, the FC educational method is not without challenges. In this article, we identify and summarize several key papers relevant to medical educators interested in exploring the FC teaching methodology. The authors identified an extensive list of papers relevant to FC pedagogy via online discussions within the Academic Life in Emergency Medicine (ALiEM) Faculty Incubator. This list was augmented by an open call on Twitter (utilizing the #meded, #FOAMed, and #flippedclassroom hashtags) yielding a list of 33 papers. We then conducted a three-round modified Delphi process within the authorship group, which included both junior and senior clinician educators, to identify the most impactful papers for educators interested in FC pedagogy. The three-round modified Delphi process ranked all of the selected papers and selected the five most highly-rated papers for inclusion. The authorship group reviewed and summarized these papers with specific consideration given to their value to junior faculty educators and faculty developers interested in the flipped classroom approach. The list of papers featured in this article serves as a key reading list for junior clinician educators and faculty developers interested in the flipped classroom technique. The associated commentaries contextualize the importance of these papers for medical educators aiming to optimize their understanding and implementation of the flipped classroom methodology in their teaching and through faculty development. PMID:29282445
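
    The scoring-and-selection arithmetic of a modified Delphi process reduces to averaging ratings across rounds and ranking; a simplified sketch (paper names and scores are invented), not the authors' actual instrument:

```python
from statistics import mean

def top_papers(round_scores, n=5):
    """Average each paper's score across Delphi rounds and return the n
    highest-rated papers (a simplified stand-in for the three-round process)."""
    papers = round_scores[0].keys()
    combined = {p: mean(r[p] for r in round_scores) for p in papers}
    return sorted(papers, key=combined.get, reverse=True)[:n]
```

    In the real process, panelists also see the group's prior-round scores before re-rating, which is what drives convergence between rounds.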

  12. Prioritization methodology for the decommissioning of nuclear facilities: a study case on the Iraq former nuclear complex.

    PubMed

    Jarjies, Adnan; Abbas, Mohammed; Monken Fernandes, Horst; Wong, Melanie; Coates, Roger

    2013-05-01

    There are a number of sites in Iraq which have been used for nuclear activities and which contain potentially significant amounts of radioactive waste; the principal nuclear site is Al-Tuwaitha. Many of these sites suffered substantial physical damage during the Gulf Wars and have been subjected to subsequent looting. All require decommissioning in order to ensure both radiological and non-radiological safety. However, it is not possible to undertake the decommissioning of all sites and facilities at the same time. Therefore, a prioritization methodology has been developed in order to aid the decision-making process. The methodology comprises three principal stages of assessment: i) a quantitative surrogate risk assessment, ii) a range of sensitivity analyses, and iii) the inclusion of qualitative modifying factors. A group of Tuwaitha facilities presented the highest risk among those evaluated, followed by a middle-ranking group of Tuwaitha facilities and some other sites, and a relatively large group of lower-risk facilities and sites. The initial order of priority changes when modifying factors are taken into account. Iraq's isolation from the international nuclear community over the last two decades, and the resulting lack of experienced personnel, must also be considered. It is therefore appropriate to initiate decommissioning operations on selected low-risk facilities at Tuwaitha in order to build capacity and prepare for work to be carried out in more complex and potentially high-hazard facilities. In addition, it is appropriate to initiate some prudent precautionary actions relating to some of the higher-risk facilities. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. The Use of Ammonium Formate as a Mobile-Phase Modifier for LC-MS/MS Analysis of Tryptic Digests

    PubMed Central

    Johnson, Darryl; Boyes, Barry; Orlando, Ron

    2013-01-01

    A major challenge facing current mass spectrometry (MS)-based proteomics research is the large concentration range displayed in biological systems, which far exceeds the dynamic range of commonly available mass spectrometers. One approach to overcome this limitation is to improve online reversed-phase liquid chromatography (RP-LC) separation methodologies. LC mobile-phase modifiers are used to improve peak shape and increase sample load tolerance. Trifluoroacetic acid (TFA) is a commonly used mobile-phase modifier, as it produces peptide separations that are far superior to those obtained with other additives. However, TFA leads to signal suppression when incorporated with electrospray ionization (ESI), and thus, other modifiers, such as formic acid (FA), are used for LC-MS applications. FA exhibits significantly less signal suppression, but is not as effective a modifier as TFA. An alternative mobile-phase modifier is the combination of FA and ammonium formate (AF), which has been shown to improve peptide separations. The ESI-MS compatibility of this modifier has not been investigated, particularly for proteomic applications. This work compares the separation metrics of mobile phases modified with FA and FA/AF and explores the use of FA/AF for the LC-MS analysis of tryptic digests. Standard tryptic-digest peptides were used for comparative analysis of peak capacity and sample load tolerance. The compatibility of FA/AF in proteomic applications was examined with the analysis of soluble proteins from canine prostate carcinoma tissue. Overall, the use of FA/AF improved online RP-LC separations and led to significant increases in peptide identifications with improved protein sequence coverage. PMID:24294112

  14. The use of ammonium formate as a mobile-phase modifier for LC-MS/MS analysis of tryptic digests.

    PubMed

    Johnson, Darryl; Boyes, Barry; Orlando, Ron

    2013-12-01

    A major challenge facing current mass spectrometry (MS)-based proteomics research is the large concentration range displayed in biological systems, which far exceeds the dynamic range of commonly available mass spectrometers. One approach to overcome this limitation is to improve online reversed-phase liquid chromatography (RP-LC) separation methodologies. LC mobile-phase modifiers are used to improve peak shape and increase sample load tolerance. Trifluoroacetic acid (TFA) is a commonly used mobile-phase modifier, as it produces peptide separations that are far superior to those obtained with other additives. However, TFA leads to signal suppression when incorporated with electrospray ionization (ESI), and thus, other modifiers, such as formic acid (FA), are used for LC-MS applications. FA exhibits significantly less signal suppression, but is not as effective a modifier as TFA. An alternative mobile-phase modifier is the combination of FA and ammonium formate (AF), which has been shown to improve peptide separations. The ESI-MS compatibility of this modifier has not been investigated, particularly for proteomic applications. This work compares the separation metrics of mobile phases modified with FA and FA/AF and explores the use of FA/AF for the LC-MS analysis of tryptic digests. Standard tryptic-digest peptides were used for comparative analysis of peak capacity and sample load tolerance. The compatibility of FA/AF in proteomic applications was examined with the analysis of soluble proteins from canine prostate carcinoma tissue. Overall, the use of FA/AF improved online RP-LC separations and led to significant increases in peptide identifications with improved protein sequence coverage.

  15. Developing a Composite Aquifer Vulnerability Assessment Model Combining DRASTIC with Agricultural Land Use in Choushui River Alluvial Fan, Central Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, Shih-Kai; Hsieh, Chih-Heng; Tsai, Cheng-Bin

    2017-04-01

    Aquifer vulnerability assessment is considered an effective tool for controlling potential pollution, which is critical for groundwater management. The Choushui River alluvial fan, located in central Taiwan, is an agricultural area with complex crop patterns and various irrigation schemes, which increases the difficulty of groundwater resource management. The aim of this study is to propose an integrated methodology to assess shallow groundwater vulnerability by including the impact of land use on potential groundwater pollution. The original groundwater vulnerability methodology, DRASTIC, was modified by adding a land-use parameter in order to assess groundwater vulnerability under intense agricultural activities. To examine the pollution prediction capacity of the modified DRASTIC model, the various risk categories of contamination potential were compared with observed nitrate-N concentrations obtained from the groundwater monitoring network. It was found that in the original DRASTIC vulnerability map, some areas with low nitrate-N concentrations fall within the high-vulnerability areas, especially in the northern part of the mid-fan areas, where rice paddy is the main crop and is planted for two crop seasons per year. The low nitrate-N contamination potential of rice paddies may result from denitrification in the reduced root zone. By reducing the rating for rice paddies, the modified model was shown to improve the precision of prediction in the study area. The results can provide a basis for groundwater monitoring network design and for the formulation of effective preservation measures in this mixed agricultural area. Keywords: Aquifer Vulnerability, Groundwater, DRASTIC, Nitrate-N
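    A DRASTIC-style index of the kind modified above is simply a weighted sum of parameter ratings, so the land-use extension and the rice-paddy rating reduction are easy to sketch. The D/R/A/S/T/I/C weights below are the standard DRASTIC values; the land-use weight and the rice-paddy rating cap are illustrative assumptions, not the study's calibrated numbers.

    ```python
    # Sketch of a DRASTIC-style vulnerability index: a weighted sum of
    # parameter ratings (1-10), extended with a land-use term "L" as in
    # the modified model. The D/R/A/S/T/I/C weights are the standard
    # DRASTIC ones; the L weight and the rice-paddy cap are illustrative.

    WEIGHTS = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3, "L": 5}

    def drastic_l_index(ratings, rice_paddy=False):
        """ratings: {parameter: 1-10 rating} -> vulnerability index."""
        r = dict(ratings)
        if rice_paddy:
            # Reduced land-use rating, reflecting denitrification in the
            # flooded root zone (illustrative cap, not the study's value).
            r["L"] = min(r["L"], 4)
        return sum(WEIGHTS[p] * r[p] for p in WEIGHTS)

    worst_case = {p: 10 for p in WEIGHTS}
    idx = drastic_l_index(worst_case)                      # 280 with these weights
    idx_paddy = drastic_l_index(worst_case, rice_paddy=True)
    ```

    Mapping the index across grid cells and binning it into risk categories then yields the vulnerability map that the study compares against observed nitrate-N.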

  16. Modified Ashworth Scale (MAS) Model based on Clinical Data Measurement towards Quantitative Evaluation of Upper Limb Spasticity

    NASA Astrophysics Data System (ADS)

    Puzi, A. Ahmad; Sidek, S. N.; Mat Rosly, H.; Daud, N.; Yusof, H. Md

    2017-11-01

    Spasticity is a common symptom among people with sensorimotor disabilities. Imbalanced signals from the central nervous system (CNS), which comprises the brain and spinal cord, to the muscles ultimately lead to the injury and death of motor neurons. In clinical practice, the therapist assesses muscle spasticity using a standard assessment tool such as the Modified Ashworth Scale (MAS), Modified Tardieu Scale (MTS) or Fugl-Meyer Assessment (FMA). This is done subjectively, based on the experience and perception of the therapist, and is subject to the patient's fatigue level and body posture. Inconsistency in the assessment is therefore prevalent and could affect the efficacy of the rehabilitation process. Thus, the aim of this paper is to describe the methodology of data collection and the quantitative model of MAS developed to satisfy its description. Two subjects with MAS spasticity levels of 2 and 3 were involved in the clinical data measurement; their levels of spasticity were verified by an expert therapist using current practice. Data collection was established using a mechanical system equipped with a data acquisition system and LabVIEW software. The procedure involved a repeated series of flexions of the affected arm, which was moved against the platform using a lever mechanism operated by the therapist. The data were then analyzed to investigate the characteristics of the spasticity signal in correspondence with the MAS description. Experimental results revealed that the methodology used to quantify spasticity satisfied the MAS tool requirement according to its description. The result is therefore a useful step toward the development of a formal spasticity quantification model.

  17. Intervening in disease through genetically-modified bacteria.

    PubMed

    Ferreira, Adilson K; Mambelli, Lisley I; Pillai, Saravanan Y

    2017-12-01

    The molecular basis of different diseases is rapidly being dissected as a consequence of advancing technology. Consequently, proteins with potential therapeutic usefulness, including cytokines and signaling molecules, have been identified in recent decades. However, their clinical use is hampered by functional and economic obstacles. Chief among these are targeted topical delivery and the synthesis of such proteins: intravenous use requires rigorous purification, while proteins often do not withstand digestive degradation and thus cannot be given per os. Recently, the idea of using genetically modified bacteria has emerged as an attempt to evade these important barriers. Such bacteria can deliver therapeutic proteins or other molecules at the site of disease, especially when the disease involves a mucosal surface. Further, whereas intravenously applied therapeutic proteins require expensive processing to become endotoxin-free, this is not necessary for local application of therapeutic proteins in the intestine. In addition, once created, genetically modified bacteria are cheap to propagate and require relatively little conditioning for transport of the medication, making such organisms also suitable for combating disease in developing countries with poor infrastructure. Although the first human trials with such bacteria were performed more than a decade ago, the recent revolution in our understanding of the role of the human gut microbiome in health and disease has transformed this field, yielding a plethora of potential novel prophylactic and therapeutic interventions against disease onset and development that employ such organisms. Today, the engineering of the human microbiome for health benefits and related applications engages many aspects of biology, nanotechnology and chemistry. 
    Here, we review genetically modified bacteria as possible drug-delivery vehicles and describe the origins of, and inspirations for, new drug delivery systems. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Methodologic quality and relevance of references in pharmaceutical advertisements in a Canadian medical journal.

    PubMed

    Lexchin, J; Holbrook, A

    1994-07-01

    To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). Analytic study. All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically lower than the acceptable score of 80% (p < 0.05), and the methodologic quality score was outside the preset clinically significant difference of 15%. The poor rating for methodologic quality was primarily because of the citation of references to low-quality review articles and "other" sources (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion.

  19. Methodologic quality and relevance of references in pharmaceutical advertisements in a Canadian medical journal.

    PubMed Central

    Lexchin, J; Holbrook, A

    1994-01-01

    OBJECTIVE: To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). DESIGN: Analytic study. DATA SOURCE: All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. MAIN OUTCOME MEASURES: Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. MAIN RESULTS: Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically lower than the acceptable score of 80% (p < 0.05), and the methodologic quality score was outside the preset clinically significant difference of 15%. The poor rating for methodologic quality was primarily because of the citation of references to low-quality review articles and "other" sources (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. CONCLUSIONS: Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion. PMID:8004560

  20. Demonstration Of A Nanomaterial-Modified Primer For Use In Corrosion-Inhibiting Coating Systems

    DTIC Science & Technology

    2011-11-01

    abrasive blasting or other means. This report documents the materials and methodologies used for testing and application of the new coating systems on the … method with improved corrosion-resistant coatings will provide the DoD with a means to cost-effectively rehabilitate the outer metal surfaces of … [Figure 6: abrasive blast-cleaned tank surface]

  1. Defining, Developing, and Implementing a New Design for the Technology Component of a Human Resource Development Undergraduate Program

    ERIC Educational Resources Information Center

    Byers, Celina

    2005-01-01

    Purpose: To describe an approach to course redesign that may provide others in the field with a "template" to follow or modify when course redesign is necessary. Design/methodology/approach: Action research implies making a change and then observing and responding to the consequences of that change. Making the change in this course involved:…

  2. Validating a faster method for reconstitution of Crotalidae Polyvalent Immune Fab (ovine).

    PubMed

    Gerring, David; King, Thomas R; Branton, Richard

    2013-07-01

    Reconstitution of CroFab(®) (Crotalidae Polyvalent Immune Fab [ovine]) lyophilized drug product was previously performed using 10 mL sterile water for injection (WFI) followed by up to 36 min of gentle swirling of the vial. CroFab has been clinically demonstrated to be most effective when administered within 6 h of snake envenomation, and improved clinical outcomes are correlated with quicker timing of administration. An alternate reconstitution method was devised, using 18 mL 0.9% saline with manual inversion, with the goal of shortening reconstitution time while maintaining a high quality, efficacious product. An analytical study was designed to compare the physicochemical properties of 3 separate batches of CroFab when reconstituted using the standard procedure (10 mL WFI with gentle swirling) and a modified rapid procedure using 18 mL 0.9% saline and manual inversion. The physical and chemical characteristics of the same 3 batches were assessed using various analytic methodologies associated with routine quality control release testing. In addition, further analytical methodologies were applied in order to elucidate possible structural changes that might be induced by the changed reconstitution procedure. Batches A, B, and C required mean reconstitution times of 25 min 51 s using the labeled method and 3 min 07 s (an 88.0% mean decrease) using the modified method. Physicochemical characteristics (color and clarity, pH, purity, protein content, potency) were found to be highly comparable. Characterization assays (dynamic light scattering, analytical ultracentrifugation, LC-MS, SDS-PAGE and circular dichroism spectroscopy) were also all found to be comparable between methods. 
When comparing CroFab batches that were reconstituted using the labeled and modified methods, the physicochemical and biological (potency) characteristics of CroFab were not significantly changed when challenged by the various standard analytical methodologies applied in routine quality control analysis. Additionally, no changes in the CroFab molecule regarding degradation, aggregation, purity, structure, or mass were observed. The analyses performed validated the use of the more rapid reconstitution method using 18 mL 0.9% saline in order to allow a significantly reduced time to administration of CroFab to patients in need. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Maternal and perinatal health research priorities beyond 2015: an international survey and prioritization exercise.

    PubMed

    Souza, Joao Paulo; Widmer, Mariana; Gülmezoglu, Ahmet Metin; Lawrie, Theresa Anne; Adejuyigbe, Ebunoluwa Aderonke; Carroli, Guillermo; Crowther, Caroline; Currie, Sheena M; Dowswell, Therese; Hofmeyr, Justus; Lavender, Tina; Lawn, Joy; Mader, Silke; Martinez, Francisco Eulógio; Mugerwa, Kidza; Qureshi, Zahida; Silvestre, Maria Asuncion; Soltani, Hora; Torloni, Maria Regina; Tsigas, Eleni Z; Vowles, Zoe; Ouedraogo, Léopold; Serruya, Suzanne; Al-Raiby, Jamela; Awin, Narimah; Obara, Hiromi; Mathai, Matthews; Bahl, Rajiv; Martines, José; Ganatra, Bela; Phillips, Sharon Jelena; Johnson, Brooke Ronald; Vogel, Joshua P; Oladapo, Olufemi T; Temmerman, Marleen

    2014-08-07

    Maternal mortality has declined by nearly half since 1990, but over a quarter million women still die every year of causes related to pregnancy and childbirth. Maternal-health related targets are falling short of the 2015 Millennium Development Goals and a post-2015 Development Agenda is emerging. In connection with this, setting global research priorities for the next decade is now required. We adapted the methods of the Child Health and Nutrition Research Initiative (CHNRI) to identify and set global research priorities for maternal and perinatal health for the period 2015 to 2025. Priority research questions were received from various international stakeholders constituting a large reference group, and consolidated into a final list of research questions by a technical working group. Questions on this list were then scored by the reference working group according to five independent and equally weighted criteria. Normalized research priority scores (NRPS) were calculated, and research priority questions were ranked accordingly. A list of 190 priority research questions for improving maternal and perinatal health was scored by 140 stakeholders. Most priority research questions (89%) were concerned with the evaluation of implementation and delivery of existing interventions, with research subthemes frequently concerned with training and/or awareness interventions (11%), and access to interventions and/or services (14%). Twenty-one questions (11%) involved the discovery of new interventions or technologies. Key research priorities in maternal and perinatal health were identified. The resulting ranked list of research questions provides a valuable resource for health research investors, researchers and other stakeholders. 
We are hopeful that this exercise will inform the post-2015 Development Agenda and assist donors, research-policy decision makers and researchers to invest in research that will ultimately make the most significant difference in the lives of mothers and babies.
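    The normalized research priority score (NRPS) computation described above can be sketched under the common CHNRI formulation: each scorer answers each criterion 1 (yes), 0 (no) or 0.5 (undecided), a criterion's score is the mean answer, and a question's score is the (optionally weighted) mean over criteria. The criterion names and answers below are illustrative, not the exercise's data.

    ```python
    # Minimal sketch of CHNRI-style scoring, assuming the common formulation:
    # scorers answer each criterion 1 (yes), 0 (no) or 0.5 (undecided); a
    # question's priority score is the (optionally weighted) mean over
    # criteria of the mean scorer answer. Names and data are illustrative.

    def research_priority_score(answers, weights=None):
        """answers: {criterion: [answers in {0, 0.5, 1}]} -> score in [0, 1]."""
        if weights is None:
            weights = {c: 1.0 for c in answers}   # equally weighted criteria
        crit_mean = {c: sum(a) / len(a) for c, a in answers.items()}
        total = sum(weights[c] for c in answers)
        return sum(weights[c] * crit_mean[c] for c in answers) / total

    answers = {
        "answerability":  [1, 1, 0.5],
        "effectiveness":  [1, 0.5, 0.5],
        "deliverability": [0, 1, 1],
        "impact":         [1, 1, 1],
        "equity":         [0.5, 0.5, 1],
    }
    rps = research_priority_score(answers)
    ```

    Ranking the question list by this score, question by question, produces the priority ordering reported in such exercises.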

  4. Setting Priorities in Global Child Health Research Investments: Addressing Values of Stakeholders

    PubMed Central

    Kapiriri, Lydia; Tomlinson, Mark; Gibson, Jennifer; Chopra, Mickey; El Arifeen, Shams; Black, Robert E.; Rudan, Igor

    2007-01-01

    Aim To identify main groups of stakeholders in the process of health research priority setting and propose strategies for addressing their systems of values. Methods In three separate exercises that took place between March and June 2006 we interviewed three different groups of stakeholders: 1) members of the global research priority setting network; 2) a diverse group of national-level stakeholders from South Africa; and 3) participants at the conference related to international child health held in Washington, DC, USA. Each of the groups was administered a different version of the questionnaire in which they were asked to set weights to criteria (and also minimum required thresholds, where applicable) that were a priori defined as relevant to health research priority setting by the consultants of the Child Health and Nutrition Research Initiative (CHNRI). Results At the global level, the wide and diverse group of respondents placed the greatest importance (weight) on the criterion of maximum potential for disease burden reduction, while the most stringent threshold was placed on the criterion of answerability in an ethical way. Among the stakeholders’ representatives attending the international conference, the criterion of deliverability, answerability, and sustainability of health research results was proposed as the most important one. At the national level in South Africa, the greatest weight was placed on the criterion addressing the predicted impact on equity of the proposed health research. Conclusions Involving a large group of stakeholders when setting priorities in health research investments is important because the criteria of relevance to scientists and technical experts, whose knowledge and technical expertise is usually central to the process, may not be appropriate to specific contexts and in accordance with the views and values of those who invest in health research, those who benefit from it, or wider society as a whole. PMID:17948948

  5. Research priority setting for integrated early child development and violence prevention (ECD+) in low and middle income countries: An expert opinion exercise.

    PubMed

    Tomlinson, Mark; Jordans, Mark; MacMillan, Harriet; Betancourt, Theresa; Hunt, Xanthe; Mikton, Christopher

    2017-10-01

    Child development in low and middle income countries (LMIC) is compromised by multiple risk factors. Reducing children's exposure to harmful events is essential for early childhood development (ECD). In particular, preventing violence against children - a highly prevalent risk factor that negatively affects optimal child development - should be an intervention priority. We used the Child Health and Nutrition Research Initiative (CHNRI) method for the setting of research priorities in integrated Early Childhood Development and violence prevention programs (ECD+). An expert group was identified and invited to systematically list and score research questions. A total of 186 stakeholders were asked to contribute five research questions each, and contributions were received from 81 respondents. These were subsequently evaluated using a set of five criteria: answerability; effectiveness; feasibility and/or affordability; applicability and impact; and equity. Of the 400 questions generated, a composite group of 50 were scored by 55 respondents. The highest scoring research questions related to the training of Community Health Workers (CHWs) to deliver ECD+ interventions effectively and whether ECD+ interventions could be integrated within existing delivery platforms such as HIV, nutrition or mental health platforms. The priority research questions can direct new research initiatives, mainly in focusing on the effectiveness of an ECD+ approach, as well as on service delivery questions. To the best of our knowledge, this is the first systematic exercise of its kind in the field of ECD+. The findings from this research priority setting exercise can help guide donors and other development actors towards funding priorities for important future research related to ECD and violence prevention. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  6. How Are Health Research Priorities Set in Low and Middle Income Countries? A Systematic Review of Published Reports

    PubMed Central

    McGregor, Skye; Henderson, Klara J.; Kaldor, John M.

    2014-01-01

    Background Priority setting is increasingly recognised as essential for directing finite resources to support research that maximizes public health benefits and drives health equity. Priority setting processes have been undertaken in a number of low- and middle-income country (LMIC) settings, using a variety of methods. We undertook a critical review of reports of these processes. Methods and Findings We searched electronic databases and online for peer reviewed and non-peer reviewed literature. We found 91 initiatives that met inclusion criteria. The majority took place at the global level (46%). For regional or national initiatives, most focused on Sub Saharan Africa (49%), followed by East Asia and Pacific (20%) and Latin America and the Caribbean (18%). A quarter of initiatives aimed to cover all areas of health research, with a further 20% covering communicable diseases. The most frequently used process was a conference or workshop to determine priorities (24%), followed by the Child Health and Nutrition Research Initiative (CHNRI) method (18%). The majority were initiated by an international organization or collaboration (46%). Researchers and government were the most frequently represented stakeholders. There was limited evidence of any implementation or follow-up strategies. Challenges in priority setting included engagement with stakeholders, data availability, and capacity constraints. Conclusions Health research priority setting (HRPS) has been undertaken in a variety of LMIC settings. While not consistently used, the application of established methods provides a means of identifying health research priorities in a repeatable and transparent manner. In the absence of published information on implementation or evaluation, it is not possible to assess what the impact and effectiveness of health research priority setting may have been. PMID:25275315

  7. Development of methodologies for the estimation of thermal properties associated with aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1994-01-01

    Thermal stress analyses are an important aspect in the development of aerospace vehicles at NASA-LaRC. These analyses require knowledge of the temperature distributions within the vehicle structures, which consequently necessitates accurate thermal property data. The overall goal of this ongoing research effort is to develop methodologies for the estimation of the thermal property data needed to describe the temperature responses of these complex structures. The research strategy undertaken utilizes a building block approach. The idea here is to first focus on the development of property estimation methodologies for relatively simple conditions, such as isotropic materials at constant temperatures, and then systematically modify the technique for the analysis of more and more complex systems, such as anisotropic multi-component systems. The estimation methodology utilized is a statistically based method which incorporates experimental data and a mathematical model of the system. Several aspects of this overall research effort were investigated during the time of the ASEE summer program. One important aspect involved the calibration of the estimation procedure for the estimation of the thermal properties through the thickness of a standard material. Transient experiments were conducted using a Pyrex standard at various temperatures, and then the thermal properties (thermal conductivity and volumetric heat capacity) were estimated at each temperature. Confidence regions for the estimated values were also determined. These results were then compared to documented values. Another set of experimental tests was conducted on carbon composite samples at different temperatures. Again, the thermal properties were estimated for each temperature, and the results were compared with values obtained using another technique. In both sets of experiments, a 10-15 percent offset between the estimated values and the previously determined values was found. 
Another effort was related to the development of the experimental techniques. Initial experiments required a resistance heater placed between two samples. The design was modified such that the heater was placed on the surface of only one sample, as would be necessary in the analysis of built up structures. Experiments using the modified technique were conducted on the composite sample used previously at different temperatures. The results were within 5 percent of those found using two samples. Finally, an initial heat transfer analysis, including conduction, convection and radiation components, was completed on a titanium sandwich structural sample. Experiments utilizing this sample are currently being designed and will be used to first estimate the material's effective thermal conductivity and later to determine the properties associated with each individual heat transfer component.
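    The statistical estimation idea underlying this work — choose the property value that minimizes the squared mismatch between model-predicted and measured temperatures — can be sketched generically. The lumped exponential response and the synthetic "measurements" below are stand-ins for the real conduction model and experimental data, not the study's actual model.

    ```python
    import math

    # Minimal sketch of the estimation idea: pick the property value that
    # minimizes the sum of squared errors between a thermal model's
    # predicted temperatures and measured ones. The lumped exponential
    # response and synthetic data are illustrative stand-ins.

    T_INF = 100.0  # imposed temperature rise (illustrative)

    def model(alpha, t):
        """Lumped transient response approaching T_INF at rate alpha."""
        return T_INF * (1.0 - math.exp(-alpha * t))

    def estimate_alpha(times, temps, candidates):
        def sse(a):
            return sum((model(a, t) - y) ** 2 for t, y in zip(times, temps))
        return min(candidates, key=sse)   # brute-force least squares

    times = [1.0, 2.0, 4.0, 8.0]
    temps = [model(0.3, t) for t in times]          # noise-free synthetic data
    alpha_hat = estimate_alpha(times, temps, [k / 100 for k in range(10, 60)])
    ```

    A real implementation would use a gradient-based least-squares solver and propagate measurement noise into confidence regions for the estimates, as the abstract describes.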

  8. Development of an adaptive failure detection and identification system for detecting aircraft control element failures

    NASA Technical Reports Server (NTRS)

    Bundick, W. Thomas

    1990-01-01

    A methodology for designing a failure detection and identification (FDI) system to detect and isolate control element failures in aircraft control systems is reviewed. An FDI system design for a modified B-737 aircraft resulting from this methodology is also reviewed, and the results of evaluating this system via simulation are presented. The FDI system performed well in a no-turbulence environment, but it experienced an unacceptable number of false alarms in atmospheric turbulence. An adaptive FDI system, which adjusts thresholds and other system parameters based on the estimated turbulence level, was developed and evaluated. The adaptive system performed well over all turbulence levels simulated, reliably detecting all but the smallest magnitude partially-missing-surface failures.
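    The adaptive-threshold idea described above can be sketched with a toy residual monitor. This is a minimal illustration, not the B-737 FDI system: the residual statistics, the 3-sigma rule, and the turbulence-level estimator (here just a sample standard deviation) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def false_alarms(residual, threshold):
    """Count threshold crossings of a fault-free FDI residual."""
    return int(np.sum(np.abs(residual) > threshold))

# fault-free residual sequences: calm air vs. turbulence (assumed noise levels)
calm = rng.normal(0.0, 0.1, 2000)
turb = rng.normal(0.0, 0.5, 2000)

# fixed threshold tuned for the calm-air noise level: many false alarms in turbulence
fixed_thr = 3 * 0.1
fa_fixed = false_alarms(turb, fixed_thr)

# adaptive threshold: scale with an online estimate of the turbulence level
sigma_est = turb.std()            # stand-in for a turbulence-level estimator
fa_adaptive = false_alarms(turb, 3 * sigma_est)
```

    The fixed threshold trips constantly once the noise level rises, while the adaptive threshold keeps the false-alarm rate near its calm-air design value.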

  9. Fracture mechanism maps in unirradiated and irradiated metals and alloys

    NASA Astrophysics Data System (ADS)

    Li, Meimei; Zinkle, S. J.

    2007-04-01

    This paper presents a methodology for computing a fracture mechanism map in two-dimensional space of tensile stress and temperature using physically-based constitutive equations. Four principal fracture mechanisms were considered: cleavage fracture, low temperature ductile fracture, transgranular creep fracture, and intergranular creep fracture. The methodology was applied to calculate fracture mechanism maps for several selected reactor materials, CuCrZr, 316 type stainless steel, F82H ferritic-martensitic steel, V4Cr4Ti and Mo. The calculated fracture maps are in good agreement with empirical maps obtained from experimental observations. The fracture mechanism maps of unirradiated metals and alloys were modified to include radiation hardening effects on cleavage fracture and high temperature helium embrittlement. Future refinement of fracture mechanism maps is discussed.
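    A fracture mechanism map of the kind described can be computed by evaluating a dominance rule over a grid of normalized stress and homologous temperature. The decision rules and thresholds below are invented placeholders, not the paper's physically based constitutive equations; the sketch only shows the map-construction machinery.

```python
import numpy as np

def dominant_mechanism(t_h, s_n):
    """Pick the dominant fracture mechanism at homologous temperature t_h = T/Tm
    and normalized stress s_n = sigma/E. Thresholds are invented placeholders."""
    if t_h < 0.3 and s_n > 1e-3:
        return "cleavage"
    if t_h < 0.3:
        return "low-temperature ductile"
    if s_n > 3e-4:
        return "transgranular creep"
    return "intergranular creep"

T_grid = np.linspace(0.05, 0.95, 50)   # homologous temperature axis
S_grid = np.logspace(-5, -2, 50)       # normalized tensile stress axis
mech_map = [[dominant_mechanism(t, s) for t in T_grid] for s in S_grid]
```

    In the real methodology each cell's label would come from comparing physically based rates or strengths for the four mechanisms, with irradiation shifting the cleavage and helium-embrittlement boundaries.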

  10. Computational Simulation of Continuous Fiber-Reinforced Ceramic Matrix Composites Behavior

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Chamis, Christos C.; Mital, Subodh K.

    1996-01-01

    This report describes a methodology which predicts the behavior of ceramic matrix composites and has been incorporated in the computational tool CEMCAN (CEramic Matrix Composite ANalyzer). The approach combines micromechanics with a unique fiber substructuring concept. In this new concept, the conventional unit cell (the smallest representative volume element of the composite) of the micromechanics approach is modified by substructuring it into several slices and developing the micromechanics-based equations at the slice level. The methodology also takes into account nonlinear ceramic matrix composite (CMC) behavior due to temperature and the fracture initiation and progression. Important features of the approach and its effectiveness are described by using selected examples. Comparisons of predictions and limited experimental data are also provided.

  11. Contemporary research on parenting: conceptual, methodological, and translational issues.

    PubMed

    Power, Thomas G; Sleddens, Ester F C; Berge, Jerica; Connell, Lauren; Govig, Bert; Hennessy, Erin; Liggett, Leanne; Mallan, Kimberley; Santa Maria, Diane; Odoms-Young, Angela; St George, Sara M

    2013-08-01

    Researchers over the last decade have documented the association between general parenting style and numerous factors related to childhood obesity (e.g., children's eating behaviors, physical activity, and weight status). Many recent childhood obesity prevention programs are family focused and designed to modify parenting behaviors thought to contribute to childhood obesity risk. This article presents a brief consideration of conceptual, methodological, and translational issues that can inform future research on the role of parenting in childhood obesity. They include: (1) General versus domain specific parenting styles and practices; (2) the role of ethnicity and culture; (3) assessing bidirectional influences; (4) broadening assessments beyond the immediate family; (5) novel approaches to parenting measurement; and (6) designing effective interventions. Numerous directions for future research are offered.

  12. Synthesis and physico-chemical characterization of modified starches from banana (Musa AAB) and its biological activities in diabetic rats.

    PubMed

    Reddy, Chagam Koteswara; Suriya, M; Vidya, P V; Haripriya, Sundaramoorthy

    2017-01-01

    This study describes a simple method for preparing modified starches (type-3 resistant starches, RS3) from banana (Musa AAB) and characterizes their physico-chemical properties, evaluating the modified starches as a functional food with a beneficial effect on type-2 diabetes. RS3 was prepared using a method combining debranching (enzymatic) modification and physical modification; native and modified starches were characterized by scanning electron microscopy (SEM), powder X-ray diffraction (XRD), differential scanning calorimetry (DSC) and rapid visco analysis (RVA). Use of the combined enzymatic and physical modification methodology improved the yield of RS (26.62%) from Musa AAB. Reduced viscosity and swelling power; increased transition temperatures, water absorption capacity and solubility index; a B-type crystalline pattern; and loss of granular appearance were observed after the debranching and physical modifications. The modified starches exhibited beneficial health effects in diabetic and high-fat-diet (HFD) rats that consumed them. These results suggest that dietary feeding of RS3 was effective in regulating serum glucose and lipid profiles and in suppressing oxidative stress in rats under diabetic and HFD conditions. This study provides new bioactive starches with potential applications in the food and non-food industries. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Modifiable correlates of illness perceptions in adults with chronic somatic conditions: A systematic review.

    PubMed

    Arat, Seher; De Cock, Diederik; Moons, Philip; Vandenberghe, Joris; Westhovens, René

    2018-04-01

    When individuals become ill, they want to understand and give meaning to their illness. The interpretation of this illness experience, or illness perception, is influenced by a range of individual, contextual, and cultural factors. Some of these factors may be modifiable by nursing interventions. The purpose of this systematic review was to investigate which modifiable factors were correlated with illness perceptions across studies of adults with different chronic somatic diseases. Using search terms tailored to each of four electronic databases, studies retrieved were reviewed by two independent evaluators, and each relevant article was assessed for methodological quality. Results were standardized by calculating correlation coefficients. Fifteen papers on illness perceptions in a variety of chronic diseases met the inclusion criteria. All used standardized measures of illness perceptions. We identified five groups of modifiable correlates of illness perceptions: illness-related factors, psychosocial factors, medication beliefs, information provision and satisfaction with information received, and quality of care. Our findings add to the knowledge of modifiable factors correlated with illness perceptions, including the importance of illness-related factors and psychosocial factors such as anxiety and depression. Knowledge of these correlates can facilitate understanding of patients' illness perceptions and might be useful in tailoring patient education programs. © 2018 Wiley Periodicals, Inc.

  14. Quasi-experimental study designs series-paper 9: collecting data from quasi-experimental studies.

    PubMed

    Aloe, Ariel M; Becker, Betsy Jane; Duvendack, Maren; Valentine, Jeffrey C; Shemilt, Ian; Waddington, Hugh

    2017-09-01

    To identify variables that must be coded when synthesizing primary studies that use quasi-experimental (QE) designs. All quasi-experimental designs. When designing a systematic review of QE studies, potential sources of heterogeneity (both theory-based and methodological) must be identified. We outline key components of inclusion criteria for syntheses of quasi-experimental studies. We provide recommendations for coding content-relevant and methodological variables and outline the distinction between bivariate effect sizes and partial (i.e., adjusted) effect sizes. The designs and controls used are viewed as of greatest importance. Potential sources of bias and confounding are also addressed. Careful consideration must be given to inclusion criteria and the coding of theoretical and methodological variables during the design phase of a synthesis of quasi-experimental studies. The success of a meta-regression analysis relies on the data available to the meta-analyst. Omission of critical moderator variables (i.e., effect modifiers) will undermine the conclusions of a meta-analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Multi-Time Step Service Restoration for Advanced Distribution Systems and Microgrids

    DOE PAGES

    Chen, Bo; Chen, Chen; Wang, Jianhui; ...

    2017-07-07

    Modern power systems are facing increased risk of disasters that can cause extended outages. The presence of remote control switches (RCSs), distributed generators (DGs), and energy storage systems (ESS) provides both challenges and opportunities for developing post-fault service restoration methodologies. Inter-temporal constraints of DGs, ESS, and loads under cold load pickup (CLPU) conditions impose extra complexity on problem formulation and solution. In this paper, a multi-time step service restoration methodology is proposed to optimally generate a sequence of control actions for controllable switches, ESSs, and dispatchable DGs to assist the system operator with decision making. The restoration sequence is determined to minimize the unserved customers by energizing the system step by step without violating operational constraints at each time step. The proposed methodology is formulated as a mixed-integer linear programming (MILP) model and can adapt to various operation conditions. Furthermore, the proposed method is validated through several case studies that are performed on modified IEEE 13-node and IEEE 123-node test feeders.
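    The multi-time step restoration formulation can be illustrated on a toy instance. This is a hedged sketch, not the paper's model: the network, DG/ESS dynamics, and CLPU constraints are omitted, leaving only binary load pickup decisions, a per-step generation capacity, and the no-shedding (monotonic restoration) constraint. It assumes scipy.optimize.milp (SciPy >= 1.9) is available, and all numbers are invented.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

demand = np.array([3.0, 2.0, 2.0, 1.0])   # load demands [MW] (invented)
weight = np.array([10.0, 4.0, 2.0, 2.0])  # restoration priority weights (invented)
cap = [4.0, 7.0]                          # generation available at time steps 0 and 1
n, T = len(demand), len(cap)

# binary variable x[t*n + l] = 1 if load l is energized at step t
c = -np.concatenate([weight] * T)         # maximize total weighted served load

A = np.zeros((T + n, n * T))
ub = np.zeros(T + n)
for t in range(T):                        # capacity limit at each time step
    A[t, t*n:(t+1)*n] = demand
    ub[t] = cap[t]
for l in range(n):                        # once restored, a load stays restored
    A[T + l, l] = 1.0                     # x[l, step 0]
    A[T + l, n + l] = -1.0                # -x[l, step 1]  =>  x[l,0] - x[l,1] <= 0

res = milp(c, constraints=LinearConstraint(A, -np.inf, ub),
           integrality=np.ones(n * T), bounds=Bounds(0, 1))
```

    In this instance, step 0 picks the high-priority 3 MW load plus the 1 MW load (filling the 4 MW capacity), and step 1 keeps both and adds a 2 MW load, for a weighted objective of 28.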

  16. The Berlin 2016 process: a summary of methodology for the 5th International Consensus Conference on Concussion in Sport.

    PubMed

    Meeuwisse, Willem H; Schneider, Kathryn J; Dvořák, Jiří; Omu, Onutobor Tobi; Finch, Caroline F; Hayden, K Alix; McCrory, Paul

    2017-06-01

    The purpose of this paper is to summarise the methodology for the 5th International Consensus Conference on Concussion in Sport. The 18 months of preparation included engagement of a scientific committee, an expert panel of 33 individuals in the field of concussion and a modified Delphi technique to determine the primary questions to be answered. The methodology also involved the writing of 12 systematic reviews to inform the consensus conference and submission and review of scientific abstracts. The meeting itself followed a 2-day open format, a 1-day closed expert panel meeting and two additional half day meetings to develop the Concussion Recognition Tool 5 (Pocket CRT5), Sport Concussion Assessment Tool 5 (SCAT5) and Child SCAT5. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  17. Accounting Methodology for Source Energy of Non-Combustible Renewable Electricity Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Donohoo-Vallett, Paul

    As non-combustible sources of renewable power (wind, solar, hydro, and geothermal) do not consume fuel, the “source” (or “primary”) energy from these sources cannot be accounted for in the same manner as it is for fossil fuel sources. The methodology chosen for these technologies is important as it affects the perception of the relative size of renewable source energy to fossil energy, affects estimates of source-based building energy use, and affects overall source-energy-based metrics such as energy productivity. This memo reviews the methodological choices, outlines the implications of each choice, summarizes responses to a request for information on this topic, and presents guiding principles for the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE) to use to determine where modifying the current renewable source energy accounting method used in EERE products and analyses would be appropriate to address the issues raised above.
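    The two accounting conventions compared in the memo can be expressed as a few lines of arithmetic. The heat-rate figure below (9510 Btu/kWh) is an illustrative assumption for the fossil-fuel-equivalency convention, not a value taken from the memo.

```python
KWH_TO_BTU = 3412          # physical energy content of 1 kWh of electricity
FOSSIL_HEAT_RATE = 9510    # assumed fossil heat rate [Btu fuel / kWh] (illustrative)

def source_energy_btu(kwh, method):
    """Source energy attributed to non-combustible renewable generation."""
    if method == "captured":             # physical-content convention
        return kwh * KWH_TO_BTU
    if method == "fossil_equivalent":    # fuel a displaced fossil plant would burn
        return kwh * FOSSIL_HEAT_RATE
    raise ValueError(f"unknown method: {method}")
```

    Under these assumed figures the convention choice changes the attributed source energy by nearly a factor of three, which is exactly the perception issue the memo discusses.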

  18. A method to preserve trends in quantile mapping bias correction of climate modeled temperature

    NASA Astrophysics Data System (ADS)

    Grillakis, Manolis G.; Koutroulis, Aristeidis G.; Daliakopoulos, Ioannis N.; Tsanis, Ioannis K.

    2017-09-01

    Bias correction of climate variables is a standard practice in climate change impact (CCI) studies. Various methodologies have been developed within the framework of quantile mapping. However, it is well known that quantile mapping may significantly modify the long-term statistics due to the time dependency of the temperature bias. Here, a method to overcome this issue without compromising the day-to-day correction statistics is presented. The methodology separates the modeled temperature signal into a normalized and a residual component relative to the modeled reference period climatology, in order to adjust the biases only for the former and preserve the signal of the latter. The results show that this method allows for the preservation of the originally modeled long-term signal in the mean, the standard deviation, and higher and lower percentiles of temperature. To illustrate the improvements, the methodology is tested on daily time series obtained from five EURO-CORDEX regional climate models (RCMs).
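    The signal-separation idea can be sketched as follows. This is a minimal illustration of the concept, not the authors' exact formulation: the long-term "signal" is reduced here to a mean shift, and the residual is corrected with plain empirical quantile mapping.

```python
import numpy as np

def qm_preserve_trend(obs_ref, mod_ref, mod_fut, n_q=100):
    """Quantile-map mod_fut onto obs_ref statistics while preserving the
    modeled long-term signal (reduced here to a mean shift)."""
    signal = mod_fut.mean() - mod_ref.mean()   # modeled climate-change signal
    resid = mod_fut - signal                   # residual variability to correct
    q = np.linspace(0.0, 1.0, n_q)
    mod_q = np.quantile(mod_ref, q)            # transfer function trained on
    obs_q = np.quantile(obs_ref, q)            # the reference period
    return np.interp(resid, mod_q, obs_q) + signal

rng = np.random.default_rng(42)
obs_ref = rng.normal(10.0, 2.0, 5000)   # "observed" reference temperatures
mod_ref = rng.normal(12.0, 3.0, 5000)   # biased model, reference period
mod_fut = mod_ref + 2.0                 # future period with a +2 K signal

corr_fut = qm_preserve_trend(obs_ref, mod_ref, mod_fut)
corr_ref = qm_preserve_trend(obs_ref, mod_ref, mod_ref)
signal_preserved = corr_fut.mean() - corr_ref.mean()
```

    The reference-period bias is removed, yet the corrected future minus corrected reference still shows the model's original +2 K signal, which plain quantile mapping would have distorted.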

  19. A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies the current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
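    The role of response correlation in the Monte Carlo step can be sketched for two components. All distributions and parameters below (lognormal demands and capacities, medians, dispersions) are invented for illustration; the point is only that correlated demands raise the probability of joint failure relative to an independence assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

def p_joint_failure(rho):
    """Monte Carlo probability that two components fail together when their
    (lognormal) demands are correlated with coefficient rho."""
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=N)
    demand = np.exp(0.5 * z)                   # lognormal demands, median 1.0
    # response-based fragility: lognormal capacities, median 1.5, dispersion 0.3
    capacity = np.exp(np.log(1.5) + 0.3 * rng.standard_normal((N, 2)))
    return float(np.mean((demand > capacity).all(axis=1)))

p_indep = p_joint_failure(0.0)
p_corr = p_joint_failure(0.9)
```

    Ignoring the correlation would noticeably understate the probability that both components are damaged in the same event, which is why the procedure carries response correlation directly into the risk assessment.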

  1. Extended cooperative control synthesis

    NASA Technical Reports Server (NTRS)

    Davidson, John B.; Schmidt, David K.

    1994-01-01

    This paper reports on research extending the Cooperative Control Synthesis methodology to include a more accurate modeling of the pilot's controller dynamics. Cooperative Control Synthesis (CCS) is a methodology that addresses the problem of how to design control laws for piloted, high-order, multivariate systems and/or non-conventional dynamic configurations in the absence of flying qualities specifications. This is accomplished by emphasizing the parallel structure inherent in any pilot-controlled, augmented vehicle. The original CCS methodology is extended to include the Modified Optimal Control Model (MOCM), which is based upon the optimal control model of the human operator developed by Kleinman, Baron, and Levison in 1970. This model provides a modeling of the pilot's compensation dynamics that is more accurate than the simplified pilot dynamic representation currently in the CCS methodology. Inclusion of the MOCM into the CCS also enables the modeling of pilot-observation perception thresholds and pilot-observation attention allocation effects. This Extended Cooperative Control Synthesis (ECCS) allows for the direct calculation of pilot and system open- and closed-loop transfer functions in pole/zero form and is readily implemented in current software capable of analysis and design for dynamic systems. Example results based upon synthesizing an augmentation control law for an acceleration command system in a compensatory tracking task using the ECCS are compared with a similar synthesis performed using the original CCS methodology. The ECCS is shown to provide augmentation control laws that yield more favorable predicted closed-loop flying qualities and tracking performance than those synthesized using the original CCS methodology.

  2. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Astrophysics Data System (ADS)

    Wray, Richard B.

    1991-12-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.

  3. Integrated Control Using the SOFFT Control Structure

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1996-01-01

    The need for integrated/constrained control systems has become clearer as advanced aircraft introduced new coupled subsystems, such as new propulsion subsystems with thrust vectoring and new aerodynamic designs. In this study, we develop an integrated control design methodology which accommodates constraints among subsystem variables while using the Stochastic Optimal Feedforward/Feedback Control Technique (SOFFT), thus maintaining all the advantages of the SOFFT approach. The Integrated SOFFT Control methodology uses a centralized feedforward control and a constrained feedback control law. The control thus takes advantage of the known coupling among the subsystems while maintaining the identity of subsystems for validation purposes and the simplicity of the feedback law, making it easier to understand the system response in complicated nonlinear scenarios. The Variable-Gain Output Feedback Control methodology (including constant-gain output feedback) is extended to accommodate equality constraints. A gain computation algorithm is developed. The designer can set the cross-gains between two variables or subsystems to zero or another value and optimize the remaining gains subject to the constraint. An integrated control law is designed for a modified F-15 SMTD aircraft model with coupled airframe and propulsion subsystems using the Integrated SOFFT Control methodology to produce a set of desired flying qualities.

  4. Control Law Design in a Computational Aeroelasticity Environment

    NASA Technical Reports Server (NTRS)

    Newsom, Jerry R.; Robertshaw, Harry H.; Kapania, Rakesh K.

    2003-01-01

    A methodology for designing active control laws in a computational aeroelasticity environment is given. The methodology involves employing a system identification technique to develop an explicit state-space model for control law design from the output of a computational aeroelasticity code. The particular computational aeroelasticity code employed in this paper solves the transonic small disturbance aerodynamic equation using a time-accurate, finite-difference scheme. Linear structural dynamics equations are integrated simultaneously with the computational fluid dynamics equations to determine the time responses of the structure. These structural responses are employed as the input to a modern system identification technique that determines the Markov parameters of an "equivalent linear system". The Eigensystem Realization Algorithm is then employed to develop an explicit state-space model of the equivalent linear system. The Linear Quadratic Gaussian control law design technique is employed to design a control law. The computational aeroelasticity code is modified to accept control laws and perform closed-loop simulations. Flutter control of a rectangular wing model is chosen to demonstrate the methodology. Various cases are used to illustrate the usefulness of the methodology as the nonlinearity of the aeroelastic system is increased through increased angle-of-attack changes.
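    The Eigensystem Realization Algorithm step can be sketched for a SISO system. This minimal version (Hankel matrices built from Markov parameters, a truncated SVD, then realization of A, B, C) is a textbook sketch, not the code used in the paper; the 2-state test system is invented.

```python
import numpy as np

def era(markov, n_states, p=10, q=10):
    """Minimal SISO Eigensystem Realization Algorithm.
    markov[k] = C @ A**k @ B are the system's Markov parameters."""
    H0 = np.array([[markov[i + j] for j in range(q)] for i in range(p)])
    H1 = np.array([[markov[i + j + 1] for j in range(q)] for i in range(p)])
    U, s, Vt = np.linalg.svd(H0)
    U, s, Vt = U[:, :n_states], s[:n_states], Vt[:n_states]   # rank truncation
    s_half, s_half_inv = np.diag(np.sqrt(s)), np.diag(1.0 / np.sqrt(s))
    A = s_half_inv @ U.T @ H1 @ Vt.T @ s_half_inv
    B = (s_half @ Vt)[:, :1]          # first input column (SISO)
    C = (U @ s_half)[:1, :]           # first output row (SISO)
    return A, B, C

# invented 2-state system standing in for the "equivalent linear system"
A_true = np.array([[0.9, 0.1], [0.0, 0.8]])
B_true = np.array([[1.0], [1.0]])
C_true = np.array([[1.0, 0.0]])
markov = [(C_true @ np.linalg.matrix_power(A_true, k) @ B_true).item()
          for k in range(21)]

A_id, B_id, C_id = era(markov, n_states=2)
```

    The realized triple reproduces the Markov parameters and recovers the true eigenvalues (0.8 and 0.9), which is the property the control-law design then relies on.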

  6. Beyond cysteine: recent developments in the area of targeted covalent inhibition.

    PubMed

    Mukherjee, Herschel; Grimster, Neil P

    2018-05-29

    Over the past decade targeted covalent inhibitors have undergone a renaissance due to the clinical validation and regulatory approval of several small molecule therapeutics that are designed to irreversibly modify their target protein. Invariably, these compounds rely on the serendipitous placement of a cysteine residue proximal to the small molecule binding site; while this strategy has afforded numerous successes, it necessarily limits the number of proteins that can be targeted by this approach. This drawback has led several research groups to develop novel methodologies that target non-cysteine residues for covalent modification. Herein, we survey the current literature of warheads that covalently modify non-cysteine amino acids in proteins. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Development of agricultural biotechnology and biosafety regulations used to assess the safety of genetically modified crops in Iran.

    PubMed

    Mousavi, Amir; Malboobi, Mohammad A; Esmailzadeh, Nasrin S

    2007-01-01

    Rapid progress in the application of biotechnological methodologies and development of genetically modified crops in Iran necessitated intensive efforts to establish proper organizations and prepare required rules and regulations at the national level to ensure safe application of biotechnology in all pertinent aspects. Practically, preparation of a national biotechnology strategic plan in the country coincided with development of a national biosafety framework that was the basis for the drafted biosafety law. Although biosafety measures were observed by researchers voluntarily, the establishment of national biosafety organizations since the year 2000 built a great capacity to deal with biosafety issues in the present and future time, particularly with respect to food and agricultural biotechnology.

  8. Problem formulation in the environmental risk assessment for genetically modified plants

    PubMed Central

    Wolt, Jeffrey D.; Keese, Paul; Raybould, Alan; Burachik, Moisés; Gray, Alan; Olin, Stephen S.; Schiemann, Joachim; Sears, Mark; Wu, Felicia

    2009-01-01

    Problem formulation is the first step in environmental risk assessment (ERA) where policy goals, scope, assessment endpoints, and methodology are distilled to an explicitly stated problem and approach for analysis. The consistency and utility of ERAs for genetically modified (GM) plants can be improved through rigorous problem formulation (PF), producing an analysis plan that describes relevant exposure scenarios and the potential consequences of these scenarios. A properly executed PF assures the relevance of ERA outcomes for decision-making. Adopting a harmonized approach to problem formulation should bring about greater uniformity in the ERA process for GM plants among regulatory regimes globally. This paper is the product of an international expert group convened by the International Life Sciences Institute (ILSI) Research Foundation. PMID:19757133

  9. Predicting nurses' use of healthcare technology using the technology acceptance model: an integrative review.

    PubMed

    Strudwick, Gillian

    2015-05-01

    The benefits of healthcare technologies can only be attained if nurses accept and intend to fully use them. One of the most common models utilized to understand user acceptance of technology is the Technology Acceptance Model. This model and modified versions of it have only recently been applied in the healthcare literature among nurse participants. An integrative literature review was conducted on this topic. Ovid/MEDLINE, PubMed, Google Scholar, and CINAHL were searched yielding a total of 982 references. Upon eliminating duplicates and applying the inclusion and exclusion criteria, the review included a total of four dissertations, three symposium proceedings, and 13 peer-reviewed journal articles. These documents were appraised and reviewed. The results show that a modified Technology Acceptance Model with added variables could provide a better explanation of nurses' acceptance of healthcare technology. These added variables to modified versions of the Technology Acceptance Model are discussed, and the studies' methodologies are critiqued. Limitations of the studies included in the integrative review are also examined.

  10. Epidemiology Characteristics, Methodological Assessment and Reporting of Statistical Analysis of Network Meta-Analyses in the Field of Cancer

    PubMed Central

    Ge, Long; Tian, Jin-hui; Li, Xiu-xia; Song, Fujian; Li, Lun; Zhang, Jun; Li, Ge; Pei, Gai-qin; Qiu, Xia; Yang, Ke-hu

    2016-01-01

    Because of the methodological complexity of network meta-analyses (NMAs), NMAs may be more vulnerable to methodological risks than conventional pair-wise meta-analyses. Our study aims to investigate the epidemiological characteristics, conduct of literature searches, methodological quality and reporting of the statistical analysis process in the field of cancer, based on the PRISMA extension statement and a modified AMSTAR checklist. We identified and included 102 NMAs in the field of cancer. 61 NMAs were conducted using a Bayesian framework. Of these, more than half did not report assessment of convergence (60.66%). Inconsistency was assessed in 27.87% of NMAs. Assessment of heterogeneity in traditional meta-analyses was more common (42.62%) than in NMAs (6.56%). Most NMAs did not report assessment of similarity (86.89%) and did not use the GRADE tool to assess quality of evidence (95.08%). 43 NMAs used adjusted indirect comparisons; the methods used were described in 53.49% of these NMAs. Only 4.65% of NMAs described the details of handling of multi-group trials and 6.98% described the methods of similarity assessment. The median total AMSTAR score was 8.00 (IQR: 6.00–8.25). Methodological quality and reporting of statistical analysis did not substantially differ by selected general characteristics. Overall, the quality of NMAs in the field of cancer was generally acceptable. PMID:27848997

  11. A methodology to estimate uncertainty for emission projections through sensitivity analysis.

    PubMed

    Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación

    2015-04-01

    Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, specifically to the 12 highest-emitting sectors for greenhouse gases and air pollutants. Examples of the methodology's application to two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend in a low economic-growth perspective and official figures for 2010, showing very good performance. A solid understanding and quantification of uncertainties related to atmospheric emission inventories and projections provide useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information to derive nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
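    The band construction can be sketched with a single driving factor. The growth rates and base-year emissions below are invented; the method simply re-runs the projection with the driving factor perturbed to its low and high sensitivity values.

```python
def project(base, growth, years):
    """Emission trajectory under a constant annual growth rate of the driving factor."""
    return [base * (1.0 + growth) ** t for t in range(years + 1)]

base_emissions = 100.0        # base-year emissions [kt] (invented)
central, delta = 0.01, 0.02   # +1 %/yr central rate, +/-2 points sensitivity range

central_path = project(base_emissions, central, 10)
low_band = project(base_emissions, central - delta, 10)
high_band = project(base_emissions, central + delta, 10)
```

    With several sectors and several driving factors, the same perturb-and-reproject step is repeated per factor and the resulting envelopes combined into the sector's nonstatistical uncertainty band.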

  12. Predicting the Reliability of Brittle Material Structures Subjected to Transient Proof Test and Service Loading

    NASA Astrophysics Data System (ADS)

    Nemeth, Noel N.; Jadaan, Osama M.; Palfi, Tamas; Baker, Eric H.

    Brittle materials today are being used, or considered, for a wide variety of high-tech applications that operate in harsh environments, including static and rotating turbine parts, thermal protection systems, dental prosthetics, fuel cells, oxygen transport membranes, radomes, and MEMS. Designing brittle material components to sustain repeated load without fracturing while using the minimum amount of material requires the use of a probabilistic design methodology. The NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code provides a general-purpose analysis tool that predicts the probability of failure of a ceramic component as a function of its time in service. This capability includes predicting the time-dependent failure probability of ceramic components against catastrophic rupture when subjected to transient thermomechanical loads (including cyclic loads). The developed methodology allows for changes in material response that can occur with temperature or time (i.e., changing fatigue and Weibull parameters with temperature or time). This article gives an overview of the transient reliability methodology and describes how it is extended to account for proof testing. The CARES/Life code has been modified to interface with commercially available finite element analysis (FEA) codes executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
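    As a rough illustration of the kind of Weibull-based reliability calculation underlying such codes, the sketch below evaluates a two-parameter Weibull failure probability and the textbook attenuation of failure probability after a surviving proof test. The parameters are illustrative only; CARES/Life's actual transient, multiaxial, element-by-element treatment is far more involved than this scalar sketch.

```python
import math

def weibull_failure_probability(stress, sigma0, m):
    """Two-parameter Weibull model: P_f = 1 - exp(-(sigma/sigma0)^m),
    where sigma0 is the scale parameter and m the Weibull modulus."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

def failure_probability_after_proof(service_stress, proof_stress, sigma0, m):
    """Textbook proof-test attenuation: components surviving the proof load
    truncate the strength distribution, so for service stress above the proof
    stress, P_f = 1 - exp(-[(s_svc/sigma0)^m - (s_proof/sigma0)^m])."""
    if service_stress <= proof_stress:
        return 0.0  # idealised: proof survivors cannot fail below the proof load
    risk = (service_stress / sigma0) ** m - (proof_stress / sigma0) ** m
    return 1.0 - math.exp(-risk)
```

    The proof test strictly lowers the predicted service failure probability, which is the effect the extended methodology quantifies for transient load histories.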

  13. Advanced piloted aircraft flight control system design methodology. Volume 2: The FCX flight control design expert system

    NASA Technical Reports Server (NTRS)

    Myers, Thomas T.; Mcruer, Duane T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages, starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports design of only one FCS architecture (yaw damper plus roll damper) and the rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for application of knowledge-based software concepts to flight control design were assembled, and thus FCX represents a prototype which can be tested, critiqued and evolved in an ongoing process of development.

  14. Environmental Quality Standards Research on Wastewaters of Army Ammunition Plants

    DTIC Science & Technology

    1978-06-01

    characterization of nitrocellulose wastewaters. We are grateful to LTC Leroy H. Reuter and LTC Robert P. Carnahan of the US Army Medical Research and...analytical methodology was required to characterize the wastes. The techniques used for fingerprinting (showing that the same compound exists although its...examination of the NC wastewaters has somewhat clarified the problem of characterizing the NC and has caused us to change or modify previous

  15. Metabolic glycoengineering: Sialic acid and beyond

    PubMed Central

    Du, Jian; Meledeo, M Adam; Wang, Zhiyun; Khanna, Hargun S; Paruchuri, Venkata D P; Yarema, Kevin J

    2009-01-01

    This report provides a perspective on metabolic glycoengineering methodology developed over the past two decades that allows natural sialic acids to be replaced with chemical variants in living cells and animals. Examples are given demonstrating how this technology provides the glycoscientist with chemical tools that are beginning to reproduce Mother Nature's control over complex biological systems – such as the human brain – through subtle modifications in sialic acid chemistry. Several metabolic substrates (e.g., ManNAc, Neu5Ac, and CMP-Neu5Ac analogs) can be used to feed flux into the sialic acid biosynthetic pathway, resulting in numerous – and sometimes quite unexpected – biological repercussions upon nonnatural sialoside display in cellular glycans. Once on the cell surface, ketone-, azide-, thiol-, or alkyne-modified glycans can be transformed with numerous ligands via bioorthogonal chemoselective ligation reactions, greatly increasing the versatility and potential application of this technology. Recently, sialic acid glycoengineering methodology has been extended to other pathways, with analog incorporation now possible in surface-displayed GalNAc and fucose residues as well as nucleocytoplasmic O-GlcNAc-modified proteins. Finally, recent efforts to increase the "druggability" of sugar analogs used in metabolic glycoengineering, which have resulted in unanticipated "scaffold-dependent" activities, are summarized. PMID:19675091

  16. Anchorage of iron hydro(oxide) nanoparticles onto activated carbon to remove As(V) from water.

    PubMed

    Nieto-Delgado, Cesar; Rangel-Mendez, Jose Rene

    2012-06-01

    The adsorption of arsenic(V) by granular iron hydro(oxides) has been proven to be a reliable technique. However, due to the poor mechanical properties of this material, it is difficult to apply in full-scale water treatment. Hence, the aim of this research is to develop a methodology to anchor iron hydro(oxide) nanoparticles onto activated carbon, in which the iron hydro(oxide) nanoparticles give the activated carbon an elevated active surface area for arsenic adsorption and also help avoid blockage of the activated carbon pores. Three activated carbons were modified by employing the thermal hydrolysis of iron as the anchorage procedure. The effects of hydrolysis temperature (60-120 °C), hydrolysis time (4-16 h), and FeCl3 concentration (0.4-3 mol Fe/L) were studied by response surface methodology. The iron content of the modified samples ranged from 0.73 to 5.27%, with the higher end of the range pertaining to the carbons with high oxygen content. The materials containing smaller iron hydro(oxide) particles exhibited an enhanced arsenic adsorption capacity. The best adsorbent material showed an arsenic adsorption capacity of 4.56 mg As/g at an equilibrium concentration of 1.5 ppm As and pH 7. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Recognising and referring children exposed to domestic abuse: a multi-professional, proactive systems-based evaluation using a modified Failure Mode and Effects Analysis (FMEA).

    PubMed

    Ashley, Laura; Armitage, Gerry; Taylor, Julie

    2017-03-01

    Failure Modes and Effects Analysis (FMEA) is a prospective quality assurance methodology increasingly used in healthcare, which identifies potential vulnerabilities in complex, high-risk processes and generates remedial actions. We aimed, for the first time, to apply FMEA in a social care context to evaluate the process for recognising and referring children exposed to domestic abuse within one Midlands city safeguarding area in England. A multidisciplinary, multi-agency team of 10 front-line professionals undertook the FMEA, using a modified methodology, over seven group meetings. The FMEA included mapping out the process under evaluation to identify its component steps, identifying failure modes (potential errors) and possible causes for each step and generating corrective actions. In this article, we report the output from the FMEA, including illustrative examples of the failure modes and corrective actions generated. We also present an analysis of feedback from the FMEA team and provide future recommendations for the use of FMEA in appraising social care processes and practice. Although challenging, the FMEA was unequivocally valuable for team members and generated a significant number of corrective actions locally for the safeguarding board to consider in its response to children exposed to domestic abuse. © 2016 John Wiley & Sons Ltd.
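    For readers unfamiliar with FMEA bookkeeping, the sketch below shows the classic quantitative step — ranking failure modes by risk priority number (RPN = severity × occurrence × detectability) so that corrective actions target the highest-risk steps first. The failure modes shown are hypothetical, and the study above used a modified, more qualitative variant rather than this exact scoring.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str
    description: str
    severity: int       # 1 (minor harm) .. 10 (catastrophic)
    occurrence: int     # 1 (rare) .. 10 (frequent)
    detectability: int  # 1 (easy to detect) .. 10 (almost undetectable)

    @property
    def rpn(self):
        """Classic FMEA risk priority number."""
        return self.severity * self.occurrence * self.detectability

def prioritise(modes):
    """Rank failure modes so the highest-RPN ones receive corrective actions first."""
    return sorted(modes, key=lambda m: m.rpn, reverse=True)

modes = [
    FailureMode("referral", "referral form not forwarded to safeguarding team", 8, 4, 6),
    FailureMode("recognition", "exposure noted but not recorded", 7, 3, 3),
]
ranked = prioritise(modes)
```

    In the modified FMEA used in the study, the multidisciplinary team generated failure modes and corrective actions through structured discussion rather than strict numeric ranking, but the prioritisation logic is the same in spirit.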

  18. The Flint Food Store Survey: combining spatial analysis with a modified Nutrition Environment Measures Survey in Stores (NEMS-S) to measure the community and consumer nutrition environments.

    PubMed

    Shaver, Erika R; Sadler, Richard C; Hill, Alex B; Bell, Kendall; Ray, Myah; Choy-Shin, Jennifer; Lerner, Joy; Soldner, Teresa; Jones, Andrew D

    2018-06-01

    The goal of the present study was to use a methodology that accurately and reliably describes the availability, price and quality of healthy foods at both the store and community levels using the Nutrition Environment Measures Survey in Stores (NEMS-S), and to propose a spatial methodology for integrating these store and community data into measures of objective food access. Two hundred and sixty-five retail food stores in and within 2 miles (3.2 km) of Flint, Michigan, USA, were mapped using ArcGIS mapping software. A survey based on the validated NEMS-S was conducted at each retail food store. Scores were assigned to each store based on a modified version of the NEMS-S scoring system and linked to the mapped locations of stores. Neighbourhood characteristics (race and socio-economic distress) were appended to each store. Finally, spatial and kernel density analyses were run on the mapped store scores to obtain healthy food density metrics. Regression analyses revealed that neighbourhoods with higher socio-economic distress had significantly lower dairy sub-scores compared with their lower-distress counterparts (β coefficient = -1.3; P = 0.04). Additionally, supermarkets were present only in neighbourhoods with <60 % African-American population and low socio-economic distress. Two areas in Flint had an overall NEMS-S score of 0. By identifying areas with poor access to healthy foods via a validated metric, this research can be used to help local government and organizations target interventions to high-need areas. Furthermore, the methodology used for the survey and the mapping exercise can be replicated in other cities to provide comparable results.
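    The kernel density step can be sketched as follows: a Gaussian-kernel density surface over store locations, weighted by NEMS-S scores, so that high-scoring stores contribute more "healthy food access" to nearby neighbourhoods. The coordinates, scores and bandwidth below are hypothetical; the study performed this analysis in ArcGIS rather than with hand-rolled code.

```python
import math

def healthy_food_density(point, stores, bandwidth=1.0):
    """Gaussian-kernel density of NEMS-S scores at a location.
    `stores` is a list of ((x, y), nems_score) tuples; the score weights
    each store's kernel contribution."""
    x, y = point
    total = 0.0
    for (sx, sy), score in stores:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        total += score * math.exp(-d2 / (2 * bandwidth ** 2))
    # Normalise by the 2D Gaussian kernel constant and store count.
    return total / (2 * math.pi * bandwidth ** 2 * max(len(stores), 1))

stores = [((0.0, 0.0), 30), ((5.0, 5.0), 5)]  # hypothetical (location, score) pairs
```

    Evaluating this surface on a grid and mapping the result identifies the low-density areas (such as the two zero-score areas in Flint) where interventions are most needed.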

  19. The use of transcranial magnetic stimulation to evaluate cortical excitability of lower limb musculature: Challenges and opportunities.

    PubMed

    Kesar, Trisha M; Stinear, James W; Wolf, Steven L

    2018-05-05

    Neuroplasticity is a fundamental yet relatively unexplored process that can impact rehabilitation of lower extremity (LE) movements. Transcranial magnetic stimulation (TMS) has gained widespread application as a non-invasive brain stimulation technique for evaluating neuroplasticity of the corticospinal pathway. However, a majority of TMS studies have been performed on hand muscles, with a paucity of TMS investigations focused on LE muscles. This perspective review paper proposes that there are unique methodological challenges associated with using TMS to evaluate corticospinal excitability of lower limb muscles. The challenges include: (1) the deeper location of the LE motor homunculus; (2) difficulty with targeting individual LE muscles during TMS; and (3) differences in corticospinal circuitry controlling upper and lower limb muscles. We encourage future investigations that modify traditional methodological approaches to help address these challenges. Systematic TMS investigations are needed to determine the extent of overlap in corticomotor maps for different LE muscles. A simple yet informative methodological solution involves simultaneous recordings from multiple LE muscles, which will provide the added benefit of observing how other relevant muscles co-vary in their responses during targeted TMS assessment directed toward a specific muscle. Furthermore, conventionally used TMS methods (e.g., determination of hot spot location and motor threshold) may need to be modified for TMS studies involving LE muscles. Additional investigations are necessary to determine the influence of testing posture as well as the activation state of adjacent and distant LE muscles on TMS-elicited responses. An understanding of these challenges and solutions specific to LE TMS will improve the ability of neurorehabilitation clinicians to interpret the TMS literature, and forge novel future directions for neuroscience research focused on elucidating neuroplasticity processes underlying locomotion and gait training.

  20. Plasma treatment of polyethersulfone membrane for benzene removal from water by air gap membrane distillation.

    PubMed

    Pedram, Sara; Mortaheb, Hamid Reza; Arefi-Khonsari, Farzaneh

    2018-01-01

    In order to obtain a durable cost-effective membrane for the membrane distillation (MD) process, flat sheet polyethersulfone (PES) membranes were modified by an atmospheric pressure nonequilibrium plasma generated using a dielectric barrier discharge in a mixture of argon and hexamethyldisiloxane as the organosilicon precursor. The surface properties of the plasma-modified membranes were characterized by water contact angle (CA), liquid entry pressure, X-ray photoelectron spectroscopy, scanning electron microscopy, and atomic force microscopy. The water CA of the membrane was increased from 64° to 104° by depositing a Si(CH3)-rich thin layer. While the pristine PES membrane was not applicable in the MD process, the modified PES membrane could be applied for the first time in an air gap membrane distillation setup for the removal of benzene as a volatile organic compound from water. An experimental design using central composite design and response surface methodology was applied to study the effects of feed temperature, concentration, and flow rate as well as their binary interactions on the overall permeate flux and separation factor. The separation factor and permeation flux of the modified PES membrane at optimum conditions were comparable with those of a commercial polytetrafluoroethylene membrane.

  1. A modified approach to controller partitioning

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Veillette, Robert J.

    1993-01-01

    The idea of computing a decentralized control law for the integrated flight/propulsion control of an aircraft by partitioning a given centralized controller is investigated. An existing controller partitioning methodology is described, and a modified approach is proposed with the objective of simplifying the associated controller approximation problem. Under the existing approach, the decentralized control structure is a variable in the partitioning process; by contrast, the modified approach assumes that the structure is fixed a priori. Hence, the centralized controller design may take the decentralized control structure into account. Specifically, the centralized controller may be designed to include all the same inputs and outputs as the decentralized controller; then, the two controllers may be compared directly, simplifying the partitioning process considerably. Following the modified approach, a centralized controller is designed for an example aircraft model. The design includes all the inputs and outputs to be used in a specified decentralized control structure. However, it is shown that the resulting centralized controller is not well suited for approximation by a decentralized controller of the given structure. The results indicate that it is not practical in general to cast the controller partitioning problem as a direct controller approximation problem.

  2. Effects of processing parameters on the caffeine extraction yield during decaffeination of black tea using pilot-scale supercritical carbon dioxide extraction technique.

    PubMed

    Ilgaz, Saziye; Sat, Ihsan Gungor; Polat, Atilla

    2018-04-01

    In this pilot-scale study, a supercritical carbon dioxide (SCCO2) extraction technique was used for decaffeination of black tea. Pressure (250, 375, 500 bar), extraction time (60, 180, 300 min), temperature (55, 62.5, 70 °C), CO2 flow rate (1, 2, 3 L/min) and modifier quantity (0, 2.5, 5 mol%) were selected as extraction parameters. A three-level, five-factor response surface methodology experimental design of the Box-Behnken type was employed to generate 46 different processing conditions. 100% of the caffeine was removed from black tea under two different extraction conditions: one consisted of 375 bar pressure, 62.5 °C temperature, 300 min extraction time, 2 L/min CO2 flow rate and 5 mol% modifier concentration, and the other was composed of the same temperature, pressure and extraction time with a 3 L/min CO2 flow rate and 2.5 mol% modifier concentration. Results showed that extraction time, pressure, CO2 flow rate and modifier quantity had a great impact on decaffeination yield.
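    The Box-Behnken design itself is easy to reproduce: for each pair of factors, take the four (±1, ±1) combinations with all remaining factors at their centre level (coded 0), then add centre points. The sketch below generates such a design in coded units; assuming six centre points (our assumption, not stated in the abstract), five factors yield exactly the 46 processing conditions mentioned above.

```python
from itertools import combinations, product

def box_behnken(n_factors, centre_points=1):
    """Generate a Box-Behnken design in coded levels (-1, 0, +1).
    Each pair of factors takes the four corner combinations while all
    other factors sit at the centre; centre runs are appended at the end."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            run = [0] * n_factors
            run[i], run[j] = a, b
            runs.append(run)
    runs.extend([[0] * n_factors for _ in range(centre_points)])
    return runs

# 4 * C(5,2) = 40 edge runs + 6 centre points = 46 runs.
design = box_behnken(5, centre_points=6)
```

    Coded levels are then mapped to physical units, e.g. -1/0/+1 on pressure corresponding to 250/375/500 bar.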

  3. Ion implantation modified stainless steel as a substrate for hydroxyapatite deposition. Part II. Biomimetic layer growth and characterization.

    PubMed

    Pramatarova, L; Pecheva, E; Krastev, V

    2007-03-01

    The interest in stainless steel as a material widely used in medicine and dentistry has stimulated extensive studies on improving its bone-bonding properties. AISI 316 stainless steel is modified by sequential ion implantation of Ca and P ions (the basic ions of hydroxyapatite), and by Ca and P implantation with subsequent thermal treatment in air (600 °C, 1 h). This paper investigates the ability of the as-modified surfaces to induce hydroxyapatite deposition by using a biomimetic approach, i.e., immersion in a supersaturated aqueous solution resembling human blood plasma (the so-called simulated body fluid). We describe our experimental procedure and results, and discuss the physico-chemical properties of the hydroxyapatite deposited on the modified stainless steel surfaces. It is shown that the implantation of a selected combination of ions, followed by the applied methodology of soaking the samples in the simulated body fluid, yields the growth of hydroxyapatite layers with composition and structure resembling those of bone apatite. The grown layers are found suitable for studying the process of mineral formation in nature (biomineralization).

  4. Towards the concept of disease-modifier in post-stroke or vascular cognitive impairment: a consensus report.

    PubMed

    Bordet, Régis; Ihl, Ralf; Korczyn, Amos D; Lanza, Giuseppe; Jansa, Jelka; Hoerr, Robert; Guekht, Alla

    2017-05-24

    Vascular cognitive impairment (VCI) is a complex spectrum encompassing post-stroke cognitive impairment (PSCI) and small vessel disease-related cognitive impairment. Despite the growing health, social, and economic burden of VCI, no specific treatment is available to date, prompting the introduction of the concept of a disease modifier. Within this clinical spectrum, VCI and PSCI are progressive conditions, as in neurodegenerative diseases, with progression of both vascular and degenerative lesions accounting for cognitive decline. Disease-modifying strategies should integrate both pharmacological and non-pharmacological multimodal approaches, with pleiotropic effects targeting (1) endothelial and blood-brain barrier dysfunction; (2) neuronal death and axonal loss; (3) cerebral plasticity and compensatory mechanisms; and (4) degeneration-related protein misfolding. Moreover, pharmacological and non-pharmacological treatment in PSCI or VCI requires valid study designs that clearly address basic methodological issues, such as the instruments that should be used to measure eventual changes, the biomarker-based stratification of participants to be investigated, and the statistical tests, as well as the inclusion and exclusion criteria that should be applied. A consensus emerged to propose the development of a disease-modifying strategy in VCI and PSCI based on pleiotropic pharmacological and non-pharmacological approaches.

  5. Quality of Reporting Randomized Controlled Trials in Five Leading Neurology Journals in 2008 and 2013 Using the Modified "Risk of Bias" Tool.

    PubMed

    Zhai, Xiao; Cui, Jin; Wang, Yiran; Qu, Zhiquan; Mu, Qingchun; Li, Peiwen; Zhang, Chaochao; Yang, Mingyuan; Chen, Xiao; Chen, Ziqiang; Li, Ming

    2017-03-01

    To examine the risk of bias and methodological quality of reporting of randomized clinical trials (RCTs) in major neurology journals before and after the 2011 update of the Cochrane risk of bias tool. RCTs in 5 leading neurology journals in 2008 and 2013 were searched systematically. Characteristics were extracted based on the list of the modified Cochrane Collaboration's tool. Country, number of patients, type of intervention, and funding source were also examined for further analysis. A total of 138 RCTs were enrolled in this study. The rates of following a trial plan were 61.6% for allocation generation, 52.9% for allocation concealment, 84.8% for blinding of the participants or personnel, 34.8% for blinding of outcome assessment, 78.3% for incomplete outcome data, and 67.4% for selective reporting. Selective reporting was significantly worse in 2013 than in 2008. Multicenter and large-scale trials were significantly more often rated "low risk of bias." Not only were surgical trials (5.8%) far less numerous than drug trials (73.9%), but the reporting quality of surgical trials was also worse (P = 0.008). Finally, only 17.4% of trials met the criterion of "low risk of bias." The modified "risk of bias" tool is an improved version for assessment. The methodological quality of reporting of RCTs in the 5 neurology journals is unsatisfactory, especially for surgical RCTs, and it could be further improved. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Neural network approach in multichannel auditory event-related potential analysis.

    PubMed

    Wu, F Y; Slater, J D; Ramsay, R E

    1994-04-01

    Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, statistical analysis strongly indicates that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment, such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitates computer-aided analysis of neuropsychological functions.

  7. Contemporary Research on Parenting: Conceptual, Methodological, and Translational Issues

    PubMed Central

    Sleddens, Ester F. C.; Berge, Jerica; Connell, Lauren; Govig, Bert; Hennessy, Erin; Liggett, Leanne; Mallan, Kimberley; Santa Maria, Diane; Odoms-Young, Angela; St. George, Sara M.

    2013-01-01

    Researchers over the last decade have documented the association between general parenting style and numerous factors related to childhood obesity (e.g., children's eating behaviors, physical activity, and weight status). Many recent childhood obesity prevention programs are family focused and designed to modify parenting behaviors thought to contribute to childhood obesity risk. This article presents a brief consideration of conceptual, methodological, and translational issues that can inform future research on the role of parenting in childhood obesity. They include: (1) General versus domain specific parenting styles and practices; (2) the role of ethnicity and culture; (3) assessing bidirectional influences; (4) broadening assessments beyond the immediate family; (5) novel approaches to parenting measurement; and (6) designing effective interventions. Numerous directions for future research are offered. PMID:23944927

  8. A method for the design of transonic flexible wings

    NASA Technical Reports Server (NTRS)

    Smith, Leigh Ann; Campbell, Richard L.

    1990-01-01

    Methodology was developed for designing airfoils and wings at transonic speeds which includes a technique that can account for static aeroelastic deflections. This procedure is capable of designing either supercritical or more conventional airfoil sections. Methods for including viscous effects are also illustrated and are shown to give accurate results. The methodology developed is an interactive system containing three major parts. A design module was developed which modifies airfoil sections to achieve a desired pressure distribution. This design module works in conjunction with an aerodynamic analysis module, which for this study is a small perturbation transonic flow code. Additionally, an aeroelastic module is included which determines the wing deformation due to the calculated aerodynamic loads. Because of the modular nature of the method, it can be easily coupled with any aerodynamic analysis code.

  9. Interventions to modify sexual risk behaviours for preventing HIV in homeless youth.

    PubMed

    Naranbhai, Vivek; Abdool Karim, Quarraisha; Meyer-Weitz, Anna

    2011-01-19

    Homeless youth are at high risk for HIV infection as a consequence of risky sexual behaviour. Interventions for homeless youth are challenging. Assessment of the effectiveness of interventions to modify sexual risk behaviours for preventing HIV in homeless youth is needed. To evaluate and summarize the effectiveness of interventions for modifying sexual risk behaviours and preventing transmission of HIV among homeless youth, we searched electronic databases (CENTRAL, MEDLINE, EMBASE, AIDSearch, Gateway, PsycInfo, LILACS), reference lists of eligible articles, international health agency publication lists, and clinical trial registries. The search was updated in January 2010. We contacted authors of published reports and other key role players. We included randomised studies of interventions to modify sexual risk behaviour (biological, self-reported sexual-risk behaviour or health-seeking behaviour) in homeless youth (12-24 years). Data from eligible studies were extracted by two reviewers. We assessed risk of bias per the Cochrane Collaboration's tool. None of the eligible studies reported any primary biological outcomes for this review. Reports of self-reported sexual risk behaviour outcomes varied across studies, precluding calculation of summary measures of effect; we present the outcomes descriptively for each study. We contacted authors for missing or ambiguous data. We identified three eligible studies after screening a total of 255 unique records. All three were performed in the United States of America and recruited substance-abusing male and female adolescents (total N = 615) through homeless shelters into randomised controlled trials of independent and non-overlapping behavioural interventions. The three trials differed in theoretical background, delivery method, dosage (number of sessions), content and outcome assessments. Overall, the variability in delivery and outcomes precluded estimation of summary measures of effect. We assessed the risk of bias to be high for each of the studies. Whilst some effects of the interventions on outcome measures were reported, heterogeneity and lack of robustness in these studies necessitate caution in interpreting their effectiveness. The body of evidence does not permit conclusions on the impact of interventions to modify sexual risk behaviour in homeless youth; more research is required. While the psychosocial and contextual factors that fuel sexual risk behaviours among homeless youth challenge the stringent methodologies of RCTs, novel ways for programme delivery and trial retention need to be developed. Future trials should comply with rigorous methodology in design, delivery, outcome measurement and reporting.

  10. Removal of Cr(VI) by surfactant modified Auricularia auricula spent substrate: biosorption condition and mechanism.

    PubMed

    Dong, Liying; Jin, Yu; Song, Tao; Liang, Jinsong; Bai, Xin; Yu, Sumei; Teng, Chunying; Wang, Xin; Qu, Juanjuan; Huang, Xiaomei

    2017-07-01

    Auricularia auricula spent substrate (AASS) modified by didodecyldimethylammonium bromide (DDAB) was used as an adsorbent to remove Cr(VI) from aqueous solution. Based on a single-factor experiment and response surface methodology, the optimal conditions were an adsorbent dosage of 1.5 g/L, pH of 4.0, initial Cr(VI) concentration of 19 mg/L, temperature of 25 °C, biosorption time of 120 min and rotational speed of 150 r/min, under which the biosorption capacity reached 12.16 mg/g, compared with 6.058 mg/g for unmodified AASS. DDAB modification could enlarge the specific surface area and pore diameter of the adsorbents, and supply hydrophilic and hydrophobic groups capable of adsorbing at the interfaces. In addition, DDAB increased ionic exchange and complex formation, demonstrated by variations of elemental contents; shifts of carboxyl, amine, hydroxyl, alkyl chain and phosphate groups; and the crystal structure of the Cr-O compounds. Variations of peaks and energy in XPS analysis also confirmed the reduction of Cr(VI) to Cr(III). The biosorption behavior of modified AASS was in line with the Langmuir and Freundlich isotherm equations. The final regeneration efficiency was 62.33% after three biosorption-desorption cycles. Apparently, DDAB is an excellent modifier, and DDAB-modified AASS was very efficient for Cr(VI) removal.
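    The isotherm-fitting step can be sketched via the linearised Langmuir form C/q = C/q_max + 1/(K·q_max): plotting C/q against C gives a straight line whose slope and intercept yield q_max and K. The sketch below fits that line by ordinary least squares; the data in the test are synthetic, not the study's measurements.

```python
def fit_langmuir(concentrations, uptakes):
    """Fit the Langmuir isotherm q = q_max*K*C / (1 + K*C) via its
    linearised form C/q = C/q_max + 1/(K*q_max), using least squares.
    Returns (q_max, K)."""
    xs = list(concentrations)
    ys = [c / q for c, q in zip(concentrations, uptakes)]  # C/q values
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    q_max = 1.0 / slope          # slope = 1/q_max
    K = slope / intercept        # intercept = 1/(K*q_max)
    return q_max, K
```

    A Freundlich fit proceeds the same way on log-transformed data (log q vs. log C); comparing the two fits' residuals is how conformity to each isotherm is usually judged.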

  11. Characterization of Combustion Dynamics, Detection, and Prevention of an Unstable Combustion State Based on a Complex-Network Theory

    NASA Astrophysics Data System (ADS)

    Gotoda, Hiroshi; Kinugawa, Hikaru; Tsujimoto, Ryosuke; Domen, Shohei; Okuno, Yuta

    2017-04-01

    Complex-network theory has attracted considerable attention for nearly a decade, and it has deepened our understanding of nonlinear dynamics in complex systems across a wide range of fields, including applied physics and mechanical, chemical, and electrical engineering. We conduct an experimental study using a pragmatic online detection methodology based on complex-network theory to prevent a limiting unstable state, such as blowout, in a confined turbulent combustion system. This study introduces a modified version of the natural visibility algorithm, based on the idea of a visibility limit, to serve as a pragmatic online detector. The average degree of the modified natural visibility graph allows us to detect the onset of blowout, enabling online prevention.
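    The underlying construction can be sketched as follows: each sample of a time series becomes a graph node, two nodes are linked if every intermediate sample lies below the straight line between them (the natural visibility criterion of Lacasa et al.), and the average degree summarises the graph. The `limit` horizon below is our simplified reading of the paper's visibility-limit modification, shown for illustration only.

```python
def visible(series, a, b):
    """Natural visibility criterion: every point strictly between a and b
    must lie below the straight line connecting (a, y_a) and (b, y_b)."""
    ya, yb = series[a], series[b]
    return all(series[c] < yb + (ya - yb) * (b - c) / (b - a)
               for c in range(a + 1, b))

def average_degree(series, limit=None):
    """Average degree of the (visibility-limited) natural visibility graph.
    With `limit` set, only node pairs at most `limit` samples apart are tested,
    which keeps the computation cheap enough for online use."""
    n = len(series)
    degree = [0] * n
    for a in range(n):
        hi = n if limit is None else min(n, a + limit + 1)
        for b in range(a + 1, hi):
            if visible(series, a, b):
                degree[a] += 1
                degree[b] += 1
    return sum(degree) / n
```

    Tracking `average_degree` over a sliding window of a combustion pressure signal is the kind of online statistic the study uses to flag the approach to blowout.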

  12. Disentangling early language development: modeling lexical and grammatical acquisition using an extension of case-study methodology.

    PubMed

    Robinson, B F; Mervis, C B

    1998-03-01

    The early lexical and grammatical development of one male child is examined with growth curves and dynamic-systems modeling procedures. Lexical development followed a pattern of logistic growth (R2 = .98). Lexical and plural development shared the following characteristics: plural growth began only after a threshold was reached in vocabulary size, and lexical growth slowed as plural growth increased. As plural use reached full mastery, lexical growth began to increase again. It was hypothesized that a precursor model (P. van Geert, 1991) would fit these data. Subsequent testing indicated that the precursor model, modified to incorporate brief yet intensive plural growth, provided a suitable fit. The value of the modified precursor model for explicating the processes implicated in language development is discussed.
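    The two modelling ingredients described above — logistic growth of vocabulary and a vocabulary threshold that triggers plural growth — can be sketched as follows. The parameters are synthetic, chosen for illustration; they are not the fitted values from the case study.

```python
import math

def logistic(t, K, r, t0):
    """Logistic growth curve: vocabulary size at time t, with carrying
    capacity K, growth rate r, and inflection point t0."""
    return K / (1 + math.exp(-r * (t - t0)))

def plural_onset(times, vocab_threshold, K, r, t0):
    """Precursor-model idea: plural growth switches on at the first time
    the modelled vocabulary crosses the threshold."""
    for t in times:
        if logistic(t, K, r, t0) >= vocab_threshold:
            return t
    return None
```

    In van Geert's precursor models the threshold crossing also feeds back on lexical growth (the slowdown noted above); the sketch only captures the onset condition.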

  13. ASHMET: A computer code for estimating insolation incident on tilted surfaces

    NASA Technical Reports Server (NTRS)

    Elkin, R. F.; Toelle, R. G.

    1980-01-01

A computer code, ASHMET, was developed by MSFC to estimate the amount of solar insolation incident on the surfaces of solar collectors. Both tracking and fixed-position collectors were included. Climatological data for 248 U.S. locations are built into the code. The basic methodology used by ASHMET is the ASHRAE clear-day insolation relationship, modified by a clearness index derived from SOLMET-measured solar radiation data on a horizontal surface.
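The ASHRAE clear-day relationship with a clearness-index adjustment can be sketched as below. The coefficients A, B, and C vary by month in the ASHRAE tables; the values used here are placeholders, not the data built into ASHMET:

```python
import math

# Illustrative ASHRAE clear-day coefficients (month-dependent in the
# real tables; these numbers are placeholders for a mid-year month).
A = 1230.0   # W/m^2, apparent extraterrestrial irradiance
B = 0.142    # atmospheric extinction coefficient
C = 0.058    # diffuse-to-direct ratio

def clear_day_horizontal(beta_deg, clearness=1.0):
    """Clear-day insolation on a horizontal surface, scaled by a
    location-specific clearness index (the ASHMET-style adjustment).
    beta_deg is the solar altitude angle in degrees."""
    beta = math.radians(beta_deg)
    if beta <= 0:
        return 0.0                                  # sun below horizon
    i_dn = A * math.exp(-B / math.sin(beta))        # direct normal
    return clearness * (i_dn * math.sin(beta) + C * i_dn)
```

A tilted-surface version would replace the sin(beta) projection with the cosine of the incidence angle on the collector plane.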

  14. Transmutation Fuel Performance Code Thermal Model Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    FRAPCON fuel performance code is being modified to be able to model performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort for verification of the FRAPCON thermal model. It was found that, with minor modifications, FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, code input, and calculation results.

  15. Near-wall turbulence alteration through thin streamwise riblets

    NASA Technical Reports Server (NTRS)

    Wilkinson, Stephen P.; Lazos, Barry S.

    1987-01-01

    The possibility of improving the level of drag reduction associated with near-wall riblets is considered. The methodology involves the use of a hot-wire anemometer to study various surface geometries on small, easily constructed models. These models consist of small, adjacent rectangular channels on the wall aligned in the streamwise direction. The VITA technique is modified and applied to thin-element-array and smooth flat-plate data and the results are indicated schematically.

  16. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    DTIC Science & Technology

    2018-01-01

These statistical techniques fall under descriptive statistics, a methodology for condensing collected data in quantitative form (ARL-TR-8270, US Army Research Laboratory, January 2018).

  17. An Analytical Methodology for Predicting Repair Time Distributions of Advanced Technology Aircraft.

    DTIC Science & Technology

    1985-12-01

1984. 3. Barlow, Richard E. "Mathematical Theory of Reliability: A Historical Perspective." IEEE Transactions on Reliability, 33, 16-19 (April 1984) ... Technology (AU), Wright-Patterson AFB OH, March 1971. 11. Coppola, Anthony. "Reliability Engineering of Electronic Equipment," IEEE Transactions on ... 1982. 64. Woodruff, Brian W. et al. "Modified Goodness-of-Fit Tests for Gamma Distributions with Unknown Location and Scale Parameters," IEEE

  18. An optimal baseline selection methodology for data-driven damage detection and temperature compensation in acousto-ultrasonics

    NASA Astrophysics Data System (ADS)

    Torres-Arredondo, M.-A.; Sierra-Pérez, Julián; Cabanes, Guénaël

    2016-05-01

The process of measuring and analysing the data from a sensor network distributed over a structural system in order to quantify its condition is known as structural health monitoring (SHM). For the design of a trustworthy health monitoring system, a vast amount of information regarding the inherent physical characteristics of the sources and their propagation and interaction across the structure is crucial. Moreover, any SHM system that is expected to transition to field operation must take into account the influence of environmental and operational changes, which alter the stiffness and damping of the structure and consequently modify its dynamic behaviour. On that account, special attention is paid in this paper to the development of an efficient SHM methodology in which robust signal processing and pattern recognition techniques are integrated for the correct interpretation of complex ultrasonic waves within the context of damage detection and identification. The methodology is based on an acousto-ultrasonics technique in which the discrete wavelet transform is evaluated for feature extraction and selection, linear principal component analysis is used for data-driven modelling, and self-organising maps provide a two-level clustering under the principle of local density. Finally, the methodology is demonstrated experimentally; the results show that all damage cases were detectable and identifiable.
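Leaving the wavelet and SOM stages aside, the principal-component part of such a pipeline can be sketched as a baseline model plus a Q-type reconstruction residual for novelty detection. This is a generic illustration, not the authors' code:

```python
import numpy as np

def fit_pca(X, n_comp):
    """Fit a baseline PCA model on healthy-state feature vectors (rows of X)."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_comp]              # mean and principal directions

def q_statistic(x, mu, P):
    """Squared residual of x after projection onto the baseline subspace;
    large values flag departures from the healthy (baseline) state."""
    d = x - mu
    r = d - P.T @ (P @ d)
    return float(r @ r)
```

In practice the feature vectors would be wavelet coefficients of the ultrasonic signals, and a threshold on the Q statistic (set from the baseline distribution) would separate damaged from healthy states.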

  19. A semi-quantitative approach to GMO risk-benefit analysis.

    PubMed

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.
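A semi-quantitative scheme of this kind typically reduces to weighted criterion scores on each side of the ledger. A minimal sketch follows, with invented criteria, weights, and scores (not those used in the South African case study):

```python
def weighted_score(scores, weights):
    """Combine 0-10 expert scores for each criterion into one number."""
    assert set(scores) == set(weights)
    total_w = sum(weights.values())
    return sum(scores[c] * weights[c] for c in scores) / total_w

# Hypothetical criteria and scores for a GMO release decision; the
# names and numbers are illustrative only.
benefit = weighted_score({"yield_gain": 7, "pesticide_reduction": 8,
                          "farmer_income": 6},
                         {"yield_gain": 3, "pesticide_reduction": 2,
                          "farmer_income": 2})
risk = weighted_score({"gene_flow": 3, "non_target_effects": 2,
                       "resistance_evolution": 5},
                      {"gene_flow": 2, "non_target_effects": 2,
                       "resistance_evolution": 3})
ratio = benefit / risk   # > 1 suggests benefits outweigh risks
```

The appeal for resource-limited regulators is that the inputs are ordinal expert judgments, yet the output is a single comparable number per application.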

  20. The integrative review: updated methodology.

    PubMed

    Whittemore, Robin; Knafl, Kathleen

    2005-12-01

    The aim of this paper is to distinguish the integrative review method from other review methods and to propose methodological strategies specific to the integrative review method to enhance the rigour of the process. Recent evidence-based practice initiatives have increased the need for and the production of all types of reviews of the literature (integrative reviews, systematic reviews, meta-analyses, and qualitative reviews). The integrative review method is the only approach that allows for the combination of diverse methodologies (for example, experimental and non-experimental research), and has the potential to play a greater role in evidence-based practice for nursing. With respect to the integrative review method, strategies to enhance data collection and extraction have been developed; however, methods of analysis, synthesis, and conclusion drawing remain poorly formulated. A modified framework for research reviews is presented to address issues specific to the integrative review method. Issues related to specifying the review purpose, searching the literature, evaluating data from primary sources, analysing data, and presenting the results are discussed. Data analysis methods of qualitative research are proposed as strategies that enhance the rigour of combining diverse methodologies as well as empirical and theoretical sources in an integrative review. An updated integrative review method has the potential to allow for diverse primary research methods to become a greater part of evidence-based practice initiatives.

  1. MODIFIED PATH METHODOLOGY FOR OBTAINING INTERVAL-SCALED POSTURAL ASSESSMENTS OF FARMWORKERS.

    PubMed

    Garrison, Emma B; Dropkin, Jonathan; Russell, Rebecca; Jenkins, Paul

    2018-01-29

    Agricultural workers perform tasks that frequently require awkward and extreme postures that are associated with musculoskeletal disorders (MSDs). The PATH (Posture, Activity, Tools, Handling) system currently provides a sound methodology for quantifying workers' exposure to these awkward postures on an ordinal scale of measurement, which places restrictions on the choice of analytic methods. This study reports a modification of the PATH methodology that instead captures these postures as degrees of flexion, an interval-scaled measurement. Rather than making live observations in the field, as in PATH, the postural assessments were performed on photographs using ImageJ photo analysis software. Capturing the postures in photographs permitted more careful measurement of the degrees of flexion. The current PATH methodology requires that the observer in the field be trained in the use of PATH, whereas the single photographer used in this modification requires only sufficient training to maintain the proper camera angle. Ultimately, these interval-scale measurements could be combined with other quantitative measures, such as those produced by electromyograms (EMGs), to provide more sophisticated estimates of future risk for MSDs. Further, these data can provide a baseline from which the effects of interventions designed to reduce hazardous postures can be calculated with greater precision. Copyright© by the American Society of Agricultural Engineers.

  2. Methodology for estimating human perception to tremors in high-rise buildings

    NASA Astrophysics Data System (ADS)

    Du, Wenqi; Goh, Key Seng; Pan, Tso-Chien

    2017-07-01

    Human perception to tremors during earthquakes in high-rise buildings is usually associated with psychological discomfort such as fear and anxiety. This paper presents a methodology for estimating the level of perception to tremors for occupants living in high-rise buildings subjected to ground motion excitations. Unlike other approaches based on empirical or historical data, the proposed methodology performs a regression analysis using the analytical results of two generic models of 15 and 30 stories. The recorded ground motions in Singapore are collected and modified for structural response analyses. Simple predictive models are then developed to estimate the perception level to tremors based on a proposed ground motion intensity parameter—the average response spectrum intensity in the period range between 0.1 and 2.0 s. These models can be used to predict the percentage of occupants in high-rise buildings who may perceive the tremors at a given ground motion intensity. Furthermore, the models are validated with two recent tremor events reportedly felt in Singapore. It is found that the estimated results match reasonably well with the reports in the local newspapers and from the authorities. The proposed methodology is applicable to urban regions where people living in high-rise buildings might feel tremors during earthquakes.
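The proposed intensity parameter, the average response spectrum intensity over 0.1 to 2.0 s, can be computed from a precomputed response spectrum as below; the trapezoidal averaging is a generic sketch, not the authors' exact procedure:

```python
import numpy as np

def average_spectrum_intensity(periods, sa, t_lo=0.1, t_hi=2.0):
    """Average an acceleration response spectrum over the period band
    t_lo..t_hi seconds.  `periods` and `sa` are the spectrum's
    abscissae and ordinates (e.g. from a Newmark time-stepping run)."""
    p = np.asarray(periods, dtype=float)
    s = np.asarray(sa, dtype=float)
    m = (p >= t_lo) & (p <= t_hi)
    p, s = p[m], s[m]
    area = np.sum(0.5 * (s[1:] + s[:-1]) * np.diff(p))  # trapezoid rule
    return area / (p[-1] - p[0])
```

The regression models then map this scalar intensity to the percentage of high-rise occupants expected to perceive the tremor.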

  3. An optimized methodology to analyze biopolymer capsules by environmental scanning electron microscopy.

    PubMed

    Conforto, Egle; Joguet, Nicolas; Buisson, Pierre; Vendeville, Jean-Eudes; Chaigneau, Carine; Maugard, Thierry

    2015-02-01

The aim of this paper is to describe an optimized methodology for studying the surface characteristics and internal structure of biopolymer capsules using scanning electron microscopy (SEM) in environmental mode. The main advantage of this methodology is that no preparation is required and, significantly, no metallic coating is deposited on the surface of the specimen, thus preserving the original capsule shape and surface morphology. This avoids preparation artefacts that could modify the capsule surface and mask important features such as porosity or roughness. With this method, gelatin and especially fatty coatings, which are difficult to analyze by standard SEM techniques, unambiguously show fine details of their surface morphology without damage. Furthermore, chemical contrast is preserved in backscattered-electron images of unprepared samples, allowing visualization of the internal organization of the capsule, the quality of the envelope, and so on. This study provides guidance on obtaining optimal conditions for the analysis of biological or otherwise sensitive material, which is not always studied with appropriate techniques. The methodology provides a reliable evaluation of the parameters used in capsule elaboration for research and industrial applications, as well as of capsule functionality, which is essential for technological progress in this domain. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Arthroscopic Transtibial Pullout Repair for Posterior Medial Meniscus Root Tears: A Systematic Review of Clinical, Radiographic, and Second-Look Arthroscopic Results.

    PubMed

    Feucht, Matthias J; Kühle, Jan; Bode, Gerrit; Mehl, Julian; Schmal, Hagen; Südkamp, Norbert P; Niemeyer, Philipp

    2015-09-01

    To systematically review the results of arthroscopic transtibial pullout repair (ATPR) for posterior medial meniscus root tears. A systematic electronic search of the PubMed database and the Cochrane Library was performed in September 2014 to identify studies that reported clinical, radiographic, or second-look arthroscopic outcomes of ATPR for posterior medial meniscus root tears. Included studies were abstracted regarding study characteristics, patient demographic characteristics, surgical technique, rehabilitation, and outcome measures. The methodologic quality of the included studies was assessed with the modified Coleman Methodology Score. Seven studies with a total of 172 patients met the inclusion criteria. The mean patient age was 55.3 years, and 83% of patients were female patients. Preoperative and postoperative Lysholm scores were reported for all patients. After a mean follow-up period of 30.2 months, the Lysholm score increased from 52.4 preoperatively to 85.9 postoperatively. On conventional radiographs, 64 of 76 patients (84%) showed no progression of Kellgren-Lawrence grading. Magnetic resonance imaging showed no progression of cartilage degeneration in 84 of 103 patients (82%) and showed reduced medial meniscal extrusion in 34 of 61 patients (56%). On the basis of second-look arthroscopy and magnetic resonance imaging in 137 patients, the healing status was rated as complete in 62%, partial in 34%, and failed in 3%. Overall, the methodologic quality of the included studies was fair, with a mean modified Coleman Methodology Score of 63. ATPR significantly improves functional outcome scores and seems to prevent the progression of osteoarthritis in most patients, at least during a short-term follow-up. Complete healing of the repaired root and reduction of meniscal extrusion seem to be less predictable, being observed in only about 60% of patients. 
Conclusions about the progression of osteoarthritis and reduction of meniscal extrusion are limited by the small portion of patients undergoing specific evaluation (44% and 35% of the study group, respectively). Level IV, systematic review of Level III and IV studies. Copyright © 2015 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  5. A methodology for generating a tailored implementation blueprint: an exemplar from a youth residential setting.

    PubMed

    Lewis, Cara C; Scott, Kelli; Marriott, Brigid R

    2018-05-16

    Tailored implementation approaches are touted as more likely to support the integration of evidence-based practices. However, to our knowledge, few methodologies for tailoring implementations exist. This manuscript will apply a model-driven, mixed methods approach to a needs assessment to identify the determinants of practice, and pilot a modified conjoint analysis method to generate an implementation blueprint using a case example of a cognitive behavioral therapy (CBT) implementation in a youth residential center. Our proposed methodology contains five steps to address two goals: (1) identify the determinants of practice and (2) select and match implementation strategies to address the identified determinants (focusing on barriers). Participants in the case example included mental health therapists and operations staff in two programs of Wolverine Human Services. For step 1, the needs assessment, they completed surveys (clinician N = 10; operations staff N = 58; other N = 7) and participated in focus groups (clinician N = 15; operations staff N = 38) guided by the domains of the Framework for Diffusion [1]. For step 2, the research team conducted mixed methods analyses following the QUAN + QUAL structure for the purpose of convergence and expansion in a connecting process, revealing 76 unique barriers. Step 3 consisted of a modified conjoint analysis. For step 3a, agency administrators prioritized the identified barriers according to feasibility and importance. For step 3b, strategies were selected from a published compilation and rated for feasibility and likelihood of impacting CBT fidelity. For step 4, sociometric surveys informed implementation team member selection and a meeting was held to identify officers and clarify goals and responsibilities. For step 5, blueprints for each of pre-implementation, implementation, and sustainment phases were generated. 
Forty-five unique strategies were prioritized across the 5 years and three phases representing all nine categories. Our novel methodology offers a relatively low burden collaborative approach to generating a plan for implementation that leverages advances in implementation science including measurement, models, strategy compilations, and methods from other fields.

  6. Modified versus standard intention-to-treat reporting: are there differences in methodological quality, sponsorship, and findings in randomized trials? A cross-sectional study.

    PubMed

    Montedori, Alessandro; Bonacini, Maria Isabella; Casazza, Giovanni; Luchetta, Maria Laura; Duca, Piergiorgio; Cozzolino, Francesco; Abraha, Iosief

    2011-02-28

Randomized controlled trials (RCTs) that use the modified intention-to-treat (mITT) approach are increasingly being published. Such trials have a preponderance of post-randomization exclusions, industry sponsorship, and favourable findings, and little is known about whether mITT trials differ in these respects from trials that report a standard intention-to-treat analysis. To determine differences in the methodological quality, sponsorship, authors' conflicts of interest, and findings among trials with different "types" of intention-to-treat, we undertook a cross-sectional study of RCTs published in 2006 in three general medical journals (the Journal of the American Medical Association, the New England Journal of Medicine and the Lancet) and three specialty journals (Antimicrobial Agents and Chemotherapy, the American Heart Journal and the Journal of Clinical Oncology). Trials were categorized based on the "type" of intention-to-treat reporting as follows: ITT, trials reporting the use of a standard ITT approach; mITT, trials reporting the use of a "modified intention-to-treat" approach; and "no ITT", trials not reporting the use of any intention-to-treat approach. Two pairs of reviewers independently extracted the data in duplicate. The strength of the associations between the "type" of intention-to-treat reporting and the quality of reporting (sample size calculation, flow-chart, lost to follow-up), the methodological quality of the trials (sequence generation, allocation concealment, and blinding), the funding source, and the findings was determined. Odds ratios (OR) were calculated with 95% confidence intervals (CI). Of the 367 RCTs included, 197 were classified as ITT, 56 as mITT, and 114 as "no ITT" trials.
The quality of reporting and the methodological quality of the mITT trials were similar to those of the ITT trials; however, the mITT trials were more likely to report post-randomization exclusions (adjusted OR 3.43 [95%CI, 1.70 to 6.95]; P < 0.001). We found a strong association between trials classified as mITT and for-profit agency sponsorship (adjusted OR 7.41 [95%CI, 3.14 to 17.48]; P < .001) as well as the presence of authors' conflicts of interest (adjusted OR 5.14 [95%CI, 2.12 to 12.48]; P < .001). There was no association between mITT reporting and favourable results; in general, however, trials with for-profit agency sponsorship were significantly associated with favourable results (adjusted OR 2.30; [95%CI, 1.28 to 4.16]; P = 0.006). We found that the mITT trials were significantly more likely to perform post-randomization exclusions and were strongly associated with industry funding and authors' conflicts of interest.

  7. An urban approach to planetary boundaries.

    PubMed

    Hoornweg, Daniel; Hosseini, Mehdi; Kennedy, Christopher; Behdadi, Azin

    2016-09-01

The achievement of global sustainable development goals subject to planetary boundaries will mostly be determined by cities as they drive cultures, economies, material use, and waste generation. Locally relevant, applied and quantitative methodologies are critical to capture the complexity of urban infrastructure systems and global inter-connections, and to monitor local and global progress toward sustainability. An urban monitoring (and communications) tool is presented here illustrating that a city-based approach to sustainable development is possible. Following efforts to define and quantify safe planetary boundaries in areas such as climate change, biosphere integrity, and freshwater use, this paper modifies the methodology to propose boundaries from a city's perspective. Socio-economic boundaries, or targets, largely derived from the Sustainable Development Goals are added to bio-physical boundaries. Issues such as data availability, city priorities, and ease of implementation are considered. The framework is trialed for Toronto, Shanghai, Sao Paulo, Mumbai, and Dakar, as well as aggregated for the world's larger cities. The methodology provides an important tool for cities to play a fuller and more active role in global sustainable development.

  8. A guided tour of current research in synovial joints with reference to wavelet methodology

    NASA Astrophysics Data System (ADS)

    Agarwal, Ruchi; Salimath, C. S.; Alam, Khursheed

    2017-10-01

The main aim of this article is to provide a comprehensive overview of the biomechanical aspects of the synovial joints of the human body, as part of a continuing line of research carried out by various authors over time. Almost everyone suffers from joint disease at some point in life, which has triggered intensive investigation into the biomechanics of synovial joints and has contributed to an increase in arthroplasty, with the introduction of various clinical trials. Over the last few decades, improvements and new technologies have been introduced to decrease the incidence of joint problems. This paper presents a literature survey of recent advances in the detection of wear and tear in human joints. Wavelet methods in computational fluid dynamics (CFD) are a relatively new research field, and this review also provides a glimpse of wavelet methodology in CFD. Wavelet methods have played a vital role in solving the governing equation of synovial fluid flow in synovial joints, represented by the Reynolds equation and its modified versions.

  9. Creep Life Prediction of Ceramic Components Using the Finite Element Based Integrated Design Program (CARES/Creep)

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.; Powers, Lynn M.; Gyekenyesi, John P.

    1997-01-01

    The desirable properties of ceramics at high temperatures have generated interest in their use for structural applications such as in advanced turbine systems. Design lives for such systems can exceed 10,000 hours. Such long life requirements necessitate subjecting the components to relatively low stresses. The combination of high temperatures and low stresses typically places failure for monolithic ceramics in the creep regime. The objective of this work is to present a design methodology for predicting the lifetimes of structural components subjected to multiaxial creep loading. This methodology utilizes commercially available finite element packages and takes into account the time varying creep stress distributions (stress relaxation). In this methodology, the creep life of a component is divided into short time steps, during which, the stress and strain distributions are assumed constant. The damage, D, is calculated for each time step based on a modified Monkman-Grant creep rupture criterion. For components subjected to predominantly tensile loading, failure is assumed to occur when the normalized accumulated damage at any point in the component is greater than or equal to unity.
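The damage-accumulation loop described above can be sketched generically. The Norton-law creep rate and all material constants below are placeholder assumptions, not values from the CARES/Creep program:

```python
def creep_damage(stress_steps, dt, A=1e-12, n=5.0, C=0.05, m=1.0):
    """Accumulate creep damage over time steps of length dt, with the
    stress held constant within each step (as in the methodology).

    Uses a Norton law for the steady creep rate, eps_dot = A * sigma**n,
    and a Monkman-Grant-type rupture criterion, t_f = C / eps_dot**m,
    so each step contributes dt / t_f to the damage D.  Failure is
    predicted when D reaches unity."""
    D = 0.0
    for sigma in stress_steps:
        eps_dot = A * sigma ** n        # steady creep rate at this stress
        t_f = C / eps_dot ** m          # rupture time at this stress
        D += dt / t_f                   # life fraction consumed this step
        if D >= 1.0:
            break                       # failure criterion met
    return D
```

In the finite-element setting, the stress per step comes from the relaxation solution, and D is tracked at every integration point; failure is flagged wherever D first reaches unity.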

  10. Pressure-based high-order TVD methodology for dynamic stall control

    NASA Astrophysics Data System (ADS)

    Yang, H. Q.; Przekwas, A. J.

    1992-01-01

The quantitative prediction of the dynamics of separating unsteady flows, such as dynamic stall, is of crucial importance. This six-month SBIR Phase 1 study has developed several new pressure-based methodologies for solving the 3D Navier-Stokes equations in both stationary and moving (body-conforming) coordinates. The present pressure-based algorithm is equally efficient for low-speed incompressible flows and high-speed compressible flows. The discretization of convective terms by the presently developed high-order TVD schemes requires no artificial dissipation and can properly resolve the concentrated vortices in the wing-body flow field with minimal numerical diffusion. It is demonstrated that the proposed Newton's iteration technique not only increases the convergence rate but also strongly couples the iteration between pressure and velocities. The proposed hyperbolization of the pressure correction equation is shown to increase the solver's efficiency. The above methodologies were implemented in an existing CFD code, REFLEQS. The modified code was used to simulate both static and dynamic stalls on two- and three-dimensional wing-body configurations. Three-dimensional effects and flow physics are discussed.
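As a minimal illustration of the high-order TVD discretization idea (not the pressure-based Navier-Stokes solver itself), here is one periodic step of linear advection with a MUSCL/minmod slope-limited reconstruction:

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter: zero at extrema, the smaller slope otherwise."""
    return np.where(a * b > 0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def tvd_advect(u, c):
    """One step of linear advection at Courant number c in (0, 1] on a
    periodic grid, second-order in smooth regions, TVD near gradients."""
    du_m = u - np.roll(u, 1)            # backward differences
    du_p = np.roll(u, -1) - u           # forward differences
    slope = minmod(du_m, du_p)          # limited cell slopes
    # second-order face values, upwinded for positive velocity
    face = u + 0.5 * (1.0 - c) * slope
    return u - c * (face - np.roll(face, 1))
```

The TVD property is what the abstract means by resolving concentrated vortices without artificial dissipation: the limiter suppresses spurious oscillations near sharp gradients while retaining second-order accuracy in smooth regions.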

  11. Development of Methodologies for the Estimation of Thermal Properties Associated with Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1996-01-01

    A thermal stress analysis is an important aspect in the design of aerospace structures and vehicles such as the High Speed Civil Transport (HSCT) at the National Aeronautics and Space Administration Langley Research Center (NASA-LaRC). These structures are complex and are often composed of numerous components fabricated from a variety of different materials. The thermal loads on these structures induce temperature variations within the structure, which in turn result in the development of thermal stresses. Therefore, a thermal stress analysis requires knowledge of the temperature distributions within the structures which consequently necessitates the need for accurate knowledge of the thermal properties, boundary conditions and thermal interface conditions associated with the structural materials. The goal of this proposed multi-year research effort was to develop estimation methodologies for the determination of the thermal properties and interface conditions associated with aerospace vehicles. Specific objectives focused on the development and implementation of optimal experimental design strategies and methodologies for the estimation of thermal properties associated with simple composite and honeycomb structures. The strategy used in this multi-year research effort was to first develop methodologies for relatively simple systems and then systematically modify these methodologies to analyze complex structures. This can be thought of as a building block approach. This strategy was intended to promote maximum usability of the resulting estimation procedure by NASA-LARC researchers through the design of in-house experimentation procedures and through the use of an existing general purpose finite element software.

  12. Aeroacoustic Codes for Rotor Harmonic and BVI Noise. CAMRAD.Mod1/HIRES: Methodology and Users' Manual

    NASA Technical Reports Server (NTRS)

    Boyd, D. Douglas, Jr.; Brooks, Thomas F.; Burley, Casey L.; Jolly, J. Ralph, Jr.

    1998-01-01

This document details the methodology and use of the CAMRAD.Mod1/HIRES codes, which were developed at NASA Langley Research Center for the prediction of helicopter harmonic and Blade-Vortex Interaction (BVI) noise. CAMRAD.Mod1 is a substantially modified version of the performance/trim/wake code CAMRAD. High resolution blade loading is determined in post-processing by HIRES and an associated indicial aerodynamics code. Extensive capabilities of importance to noise prediction accuracy are documented, including a new multi-core tip vortex roll-up wake model, higher harmonic and individual blade control, tunnel and fuselage correction input, diagnostic blade motion input, and interfaces for acoustic and CFD aerodynamics codes. Modifications and new code capabilities are documented with examples. A users' job preparation guide and listings of variables and namelists are given.

  13. New methodology for adjusting rotating shadowband irradiometer measurements

    NASA Astrophysics Data System (ADS)

    Vignola, Frank; Peterson, Josh; Wilbert, Stefan; Blanc, Philippe; Geuder, Norbert; Kern, Chris

    2017-06-01

A new method is developed for correcting systematic errors found in rotating shadowband irradiometer (RSI) measurements. Since the responsivity of the photodiode-based pyranometers typically used in RSI sensors depends on the wavelength of the incident radiation, and the spectral distribution of the incident radiation differs between the Direct Normal Irradiance and the Diffuse Horizontal Irradiance, spectral effects must be considered. These cause the most problematic errors when currently available correction functions are applied to RSI measurements. Hence, the direct normal and diffuse contributions are analyzed and modeled separately. An additional advantage of this methodology is that it provides a prescription for adapting the adjustment algorithms to locations whose atmospheric characteristics differ from those of the location where the calibration and adjustment algorithms were developed. A summary of results and areas for future effort are then discussed.
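Treating the two components separately amounts to applying distinct correction factors before recombining them into global horizontal irradiance. A minimal sketch, where the correction factors are assumed to come from site-specific calibration and the numbers are purely illustrative:

```python
import math

def adjust_ghi(dni, dhi, zenith_deg, f_direct=1.0, f_diffuse=1.0):
    """Recombine separately corrected direct and diffuse components
    into global horizontal irradiance (W/m^2).  f_direct and f_diffuse
    are the per-component spectral/responsivity correction factors."""
    cz = max(0.0, math.cos(math.radians(zenith_deg)))  # clamp below horizon
    return f_direct * dni * cz + f_diffuse * dhi
```

For example, adjust_ghi(800.0, 100.0, 60.0, 1.02, 0.97) scales the direct beam up by 2% and the diffuse part down by 3% before summing.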

  14. Transonic Flow Field Analysis for Wing-Fuselage Configurations

    NASA Technical Reports Server (NTRS)

    Boppe, C. W.

    1980-01-01

    A computational method for simulating the aerodynamics of wing-fuselage configurations at transonic speeds is developed. The finite difference scheme is characterized by a multiple embedded mesh system coupled with a modified or extended small disturbance flow equation. This approach permits a high degree of computational resolution in addition to coordinate system flexibility for treating complex realistic aircraft shapes. To augment the analysis method and permit applications to a wide range of practical engineering design problems, an arbitrary fuselage geometry modeling system is incorporated as well as methodology for computing wing viscous effects. Configuration drag is broken down into its friction, wave, and lift induced components. Typical computed results for isolated bodies, isolated wings, and wing-body combinations are presented. The results are correlated with experimental data. A computer code which employs this methodology is described.

  15. Logic flowgraph methodology - A tool for modeling embedded systems

    NASA Technical Reports Server (NTRS)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures that allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  16. Methodological considerations for researchers and practitioners using pedometers to measure physical (ambulatory) activity.

    PubMed

    Tudor-Locke, C E; Myers, A M

    2001-03-01

    Researchers and practitioners require guidelines for using electronic pedometers to objectively quantify physical activity (specifically ambulatory activity) for research and surveillance as well as clinical and program applications. Methodological considerations include choice of metric and length of monitoring frame as well as different data recording and collection procedures. A systematic review of 32 empirical studies suggests we can expect 12,000-16,000 steps/day for 8-10-year-old children (lower for girls than boys); 7,000-13,000 steps/day for relatively healthy, younger adults (lower for women than men); 6,000-8,500 steps/day for healthy older adults; and 3,500-5,500 steps/day for individuals living with disabilities and chronic illnesses. These preliminary recommendations should be modified and refined as evidence and experience using pedometers accumulate.
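    The expected ranges reported in the review lend themselves to a simple lookup. A minimal sketch (the group keys and function name below are hypothetical, chosen only to encode the four ranges quoted above):

    ```python
    # Expected steps/day ranges from the review (group labels are assumptions).
    EXPECTED_STEPS_PER_DAY = {
        "children_8_10": (12000, 16000),
        "younger_adults": (7000, 13000),
        "older_adults": (6000, 8500),
        "chronic_illness_disability": (3500, 5500),
    }

    def within_expected_range(group: str, steps: int) -> bool:
        """Return True if a measured steps/day value falls in the review's range."""
        low, high = EXPECTED_STEPS_PER_DAY[group]
        return low <= steps <= high
    ```
    
    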

  17. Performance evaluation methodology for historical document image binarization.

    PubMed

    Ntirogiannis, Konstantinos; Gatos, Basilis; Pratikakis, Ioannis

    2013-02-01

    Document image binarization is of great importance in the document image analysis and recognition pipeline since it affects further stages of the recognition process. The evaluation of a binarization method aids in studying its algorithmic behavior, as well as verifying its effectiveness, by providing qualitative and quantitative indication of its performance. This paper addresses a pixel-based binarization evaluation methodology for historical handwritten/machine-printed document images. In the proposed evaluation scheme, the recall and precision evaluation measures are properly modified using a weighting scheme that diminishes any potential evaluation bias. Additional performance metrics of the proposed evaluation scheme consist of the percentage rates of broken and missed text, false alarms, background noise, character enlargement, and merging. Several experiments conducted in comparison with other pixel-based evaluation measures demonstrate the validity of the proposed evaluation scheme.
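    A minimal sketch of pixel-based recall and precision for binary document images, assuming 1 = text and 0 = background. Uniform weights reduce it to the plain pixel-level measures; the paper's scheme supplies a non-uniform per-pixel weight map to diminish evaluation bias, which is not reproduced here:

    ```python
    import numpy as np

    def weighted_pixel_scores(gt, result, weights=None):
        """Pixel-based recall/precision for binary images (1 = text, 0 = background).

        `weights` is a per-pixel weight map; with uniform weights this is the
        plain pixel-level measure that the paper's weighting scheme refines.
        """
        gt = np.asarray(gt, dtype=bool)
        result = np.asarray(result, dtype=bool)
        if weights is None:
            weights = np.ones(gt.shape)
        tp = np.sum(weights * (gt & result))      # weighted true positives
        recall = tp / np.sum(weights * gt)        # fraction of text recovered
        precision = tp / np.sum(weights * result) # fraction of output that is text
        return recall, precision
    ```
    
    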

  18. Application of Risk-Based Inspection method for gas compressor station

    NASA Astrophysics Data System (ADS)

    Zhang, Meng; Liang, Wei; Qiu, Zeyang; Lin, Yang

    2017-05-01

    Because of its complex processes and large amount of equipment, a gas compressor station carries operational risks. At present, research on the integrity management of gas compressor stations is insufficient. In this paper, the basic principle of Risk-Based Inspection (RBI) and the RBI methodology are studied, and the RBI process for the gas compressor station is developed. The corrosion loops and logistics loops of the gas compressor station are determined through study of its corrosion mechanisms and processes. The probability of failure is calculated using modified coefficients, and the consequence of failure is calculated by a quantitative method. In particular, we address the application of the RBI methodology to a gas compressor station. The resulting risk ranking helps identify the best preventive inspection plan in the case study.

  19. Groundwater vulnerability and risk mapping using GIS, modeling and a fuzzy logic tool.

    PubMed

    Nobre, R C M; Rotunno Filho, O C; Mansur, W J; Nobre, M M M; Cosenza, C A N

    2007-12-07

    A groundwater vulnerability and risk mapping assessment, based on a source-pathway-receptor approach, is presented for an urban coastal aquifer in northeastern Brazil. A modified version of the DRASTIC methodology was used to map the intrinsic and specific groundwater vulnerability of a 292 km(2) study area. A fuzzy hierarchy methodology was adopted to evaluate the potential contaminant source index, including diffuse and point sources. Numerical modeling was performed for delineation of well capture zones, using MODFLOW and MODPATH. The integration of these elements provided the mechanism to assess groundwater pollution risks and identify areas that must be prioritized in terms of groundwater monitoring and restriction on use. A groundwater quality index based on nitrate and chloride concentrations was calculated, which had a positive correlation with the specific vulnerability index.
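    The DRASTIC index underlying such vulnerability maps is a weighted sum of seven hydrogeologic parameter ratings. A sketch using the standard parameter weights (the paper's modified version adjusts ratings and weights for the study area, which is not reproduced here):

    ```python
    # Standard DRASTIC parameter weights; the modified scheme in the paper
    # adapts these for the specific study area.
    DRASTIC_WEIGHTS = {
        "D": 5,  # Depth to water
        "R": 4,  # net Recharge
        "A": 3,  # Aquifer media
        "S": 2,  # Soil media
        "T": 1,  # Topography
        "I": 5,  # Impact of the vadose zone
        "C": 3,  # hydraulic Conductivity
    }

    def drastic_index(ratings: dict) -> int:
        """Weighted sum of the seven parameter ratings (each typically 1-10)."""
        return sum(DRASTIC_WEIGHTS[k] * ratings[k] for k in DRASTIC_WEIGHTS)
    ```
    
    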

  20. Transient and steady state viscoelastic rolling contact

    NASA Technical Reports Server (NTRS)

    Padovan, J.; Paramadilok, O.

    1985-01-01

    Based on moving total Lagrangian coordinates, a so-called traveling Hughes-type contact strategy is developed. Employing the modified contact scheme in conjunction with a traveling finite element strategy, an overall solution methodology is developed to handle transient and steady viscoelastic rolling contact. To verify the scheme, the results of both experimental and analytical benchmarking are presented. The experimental benchmarking includes the handling of rolling tires up to their upper-bound behavior, namely the standing wave response.

  1. In-situ determination of amine/epoxy and carboxylic/epoxy exothermic heat of reaction on surface of modified carbon nanotubes and structural verification of covalent bond formation

    NASA Astrophysics Data System (ADS)

    Neves, Juliana C.; de Castro, Vinícius G.; Assis, Ana L. S.; Veiga, Amanda G.; Rocco, Maria Luiza M.; Silva, Glaura G.

    2018-04-01

    An effective nanofiller-matrix interaction is considered crucial to producing enhanced nanocomposites. Nevertheless, there is a lack of experiments focused on the direct measurement of possible filler-matrix covalent linkage, which was the main goal of this work for a carbon nanotube (CNT)/epoxy system. CNTs were functionalized with oxygenated (ox) functions and further with triethylenetetramine (TETA). An in-situ methodology for determining the epoxy-CNT heat of reaction was developed by Differential Scanning Calorimetry (DSC). Values of -(8.7 ± 0.4) and -(6.0 ± 0.6) J/g were observed for epoxy with CNT-ox and CNT-TETA, respectively. These results confirm the occurrence of covalent bonds for both functionalized CNTs, an important finding because the literature generally disregards this possibility for oxygenated functions. The higher value obtained for CNT-ox can be attributed to incomplete amidation and to steric hindrance in the CNT-TETA structure. The modified CNTs produced in the DSC experiments were then characterized by X-Ray Photoelectron Spectroscopy, Transmission Electron Microscopy and Thermogravimetry, which confirmed the covalent linkage. This characterization methodology can be used to verify the occurrence of covalent bonds in various nanocomposites with a quantitative evaluation, providing data for a better understanding of the role of CNT functional groups and for tailoring their interface with polymers.

  2. Use of Carabids for the Post-Market Environmental Monitoring of Genetically Modified Crops

    PubMed Central

    Skoková Habuštová, Oxana; Svobodová, Zdeňka; Cagáň, Ľudovít; Sehnal, František

    2017-01-01

    Post-market environmental monitoring (PMEM) of genetically modified (GM) crops is required by EU legislation and has been a subject of debate for many years; however, no consensus on the methodology to be used has been reached. We explored the suitability of carabid beetles as surrogates for the detection of unintended effects of GM crops in general PMEM surveillance. Our study combines data on carabid communities from five maize field trials in Central Europe. Altogether, 86 species and 58,304 individuals were collected. Modeling based on the gradual elimination of the least abundant species, or of the fewest categories of functional traits, showed that a trait-based analysis of the most common species may be suitable for PMEM. Species represented by fewer than 230 individuals (all localities combined) should be excluded and species with an abundance higher than 600 should be preserved for statistical analyses. Sixteen species, representing 15 categories of functional traits fulfill these criteria, are typical dominant inhabitants of agroecocoenoses in Central Europe, are easy to determine, and their functional classification is well known. The effect of sampling year is negligible when at least four samples are collected during maize development beginning from 1 April. The recommended methodology fulfills PMEM requirements, including applicability to large-scale use. However, suggested thresholds of carabid comparability should be verified before definitive conclusions are drawn. PMID:28353663

  3. A systematic review of the therapeutic effects of Reiki.

    PubMed

    vanderVaart, Sondra; Gijsen, Violette M G J; de Wildt, Saskia N; Koren, Gideon

    2009-11-01

    Reiki is an ancient form of Japanese healing. While this healing method is widely used for a variety of psychologic and physical symptoms, evidence of its effectiveness is scarce and conflicting. The purpose of this systematic review was to try to evaluate whether Reiki produces a significant treatment effect. Studies were identified using an electronic search of Medline, EMBASE, Cochrane Library, and Google Scholar. Quality of reporting was evaluated using a modified CONSORT Criteria for Herbal Interventions, while methodological quality was assessed using the Jadad Quality score. Two (2) researchers selected articles based on the following features: placebo or other adequate control, clinical investigation on humans, intervention using a Reiki practitioner, and published in English. They independently extracted data on study design, inclusion criteria, type of control, sample size, result, and nature of outcome measures. The modified CONSORT Criteria indicated that all 12 trials meeting the inclusion criteria were lacking in at least one of the three key areas of randomization, blinding, and accountability of all patients, indicating a low quality of reporting. Nine (9) of the 12 trials detected a significant therapeutic effect of the Reiki intervention; however, using the Jadad Quality score, 11 of the 12 studies ranked "poor." The serious methodological and reporting limitations of limited existing Reiki studies preclude a definitive conclusion on its effectiveness. High-quality randomized controlled trials are needed to address the effectiveness of Reiki over placebo.

  4. DNA recovery from microhymenoptera using six non-destructive methodologies with considerations for subsequent preparation of museum slides.

    PubMed

    Guzmán-Larralde, Adriana J; Suaste-Dzul, Alba P; Gallou, Adrien; Peña-Carrillo, Kenzy I

    2017-01-01

    Because of the tiny size of microhymenoptera, successful morphological identification typically requires specific mounting protocols that require time, skills, and experience. Molecular taxonomic identification is an alternative, but many DNA extraction protocols call for maceration of the whole specimen, which is not compatible with preserving museum vouchers. Thus, non-destructive DNA isolation methods are attractive alternatives for obtaining DNA without damaging sample individuals. However, their performance needs to be assessed in microhymenopterans. We evaluated six non-destructive methods: (A) DNeasy® Blood & Tissue Kit; (B) DNeasy® Blood & Tissue Kit, modified; (C) Protocol with CaCl2 buffer; (D) Protocol with CaCl2 buffer, modified; (E) HotSHOT; and (F) Direct PCR. The performance of each DNA extraction method was tested across several microhymenopteran species by attempting to amplify the mitochondrial gene COI from insect specimens of varying ages: 1 day, 4 months, 3 years, 12 years, and 23 years. Methods B and D allowed COI amplification in all insects, while methods A, C, and E were successful in DNA amplification from insects up to 12 years old. Method F, the fastest, was useful for insects up to 4 months old. Finally, we adapted permanent slide preparation in Canada balsam for every technique. The results reported allow for combining morphological and molecular methodologies for taxonomic studies.

  5. Designed polar cosolvent-modified supercritical CO2 removing caffeine from and retaining catechins in green tea powder using response surface methodology.

    PubMed

    Huang, Kuo-Jong; Wu, Jia-Jiuan; Chiu, Yung-Ho; Lai, Cheng-Yung; Chang, Chieh-Ming J

    2007-10-31

    This study examines cosolvent-modified supercritical carbon dioxide (SC-CO2) to remove caffeine from and to retain catechins in green tea powder. The response surface method was adopted to determine the optimal operation conditions in terms of the extraction efficiencies and concentration factors of caffeine and catechins during the extractions. When SC-CO2 was used at 333 K and 300 bar, 91.5% of the caffeine was removed and 80.8% of catechins were retained in the tea: 3600 g of carbon dioxide was used in the extraction of 4 g of tea soaked with 1 g of water. Under the same extraction conditions, 10 g of water was added to <800 g of carbon dioxide in an extraction that completely removed caffeine (that is, the caffeine extraction efficiency was 100%). The optimal result as predicted by three-factor response surface methodology and supported by experimental data was that in 1.5 h of extraction, 640 g of carbon dioxide at 323 K and 275 bar with the addition of 6 g of water extracted 71.9% of the caffeine while leaving 67.8% of the catechins in 8 g of tea. Experimental data indicated that supercritical carbon dioxide decaffeination increased the concentrations of caffeine in the SC-CO2 extracts at 353 K.

  6. A modified Poisson-Boltzmann equation applied to protein adsorption.

    PubMed

    Gama, Marlon de Souza; Santos, Mirella Simões; Lima, Eduardo Rocha de Almeida; Tavares, Frederico Wanderley; Barreto, Amaro Gomes Barreto

    2018-01-05

    Ion-exchange chromatography has been widely used as a standard process in the purification and analysis of proteins, based on the electrostatic interaction between the protein and the stationary phase. Over the years, several approaches have been used to improve the thermodynamic description of colloidal particle-surface interaction systems; however, significant gaps remain, particularly in describing the behavior of protein adsorption. Here, we present an improved methodology for predicting the adsorption equilibrium constant by solving the modified Poisson-Boltzmann (PB) equation in bispherical coordinates. By including dispersion interactions between ions and protein, and between ions and surface, the modified PB equation can describe the Hofmeister effects. We solve the modified Poisson-Boltzmann equation to calculate the protein-surface potential of mean force, treating the system as a spherical colloid-plate system, as a function of process variables. From the potential of mean force, the Henry constants of adsorption, for different proteins and surfaces, are calculated as a function of pH, salt concentration, salt type, and temperature. The obtained Henry constants are compared with experimental data for several isotherms, showing excellent agreement. We have also performed a sensitivity analysis to examine the behavior of different kinds of salts and the Hofmeister effects. Copyright © 2017 Elsevier B.V. All rights reserved.
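    The last step, obtaining a Henry adsorption constant from a potential of mean force W(h), is commonly written as K_H = ∫ (exp(−W(h)/kT) − 1) dh over the surface separation h. A minimal numerical sketch under that assumption, with a toy attractive well standing in for the paper's modified-PB result:

    ```python
    import math

    def henry_constant(pmf, h_max, n=10000, kT=1.0):
        """Henry adsorption constant from a potential of mean force W(h) via
        K_H = ∫ (exp(-W(h)/kT) - 1) dh, evaluated by the trapezoidal rule.

        `pmf` is any callable returning W at separation h; the toy well below
        is illustrative only, not the paper's modified-PB potential.
        """
        hs = [i * h_max / n for i in range(n + 1)]
        ys = [math.exp(-pmf(h) / kT) - 1.0 for h in hs]
        dh = h_max / n
        return dh * (sum(ys) - 0.5 * (ys[0] + ys[-1]))

    # Toy attractive well, W(h) = -exp(-h) in units of kT (h dimensionless).
    K = henry_constant(lambda h: -math.exp(-h), h_max=20.0)
    ```

    An attractive potential gives K_H > 0 (net adsorption); a purely repulsive one gives K_H < 0, which is how salt type and pH shift the predicted Henry constants.
    
    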

  7. Distribution network connection pricing framework and methodology: identification of areas of improvement for Sarawak Energy Berhad Connection Charges Guidelines through modified delphi method

    NASA Astrophysics Data System (ADS)

    Tan, J. K.; Abas, N.

    2017-07-01

    Complaints about issues related to connection charges are very common for electricity supply utilities around the world, including Sarawak Energy Berhad. In order to identify the areas that can be improved, a mixed-method exploratory research design involving qualitative and quantitative methods was undertaken rather than a single survey method. This ensures a more comprehensive and detailed understanding of the issues across various target groups. The study is designed in three phases: phase 1 employs the Modified Delphi Technique through a series of stakeholder engagements; phase 2 uses online and offline survey questionnaires completed by internal wiring contractors; and in phase 3, case studies are carried out on the issues identified in phases 1 and 2. This paper presents the findings from the Modified Delphi Technique. The findings reveal areas of improvement for the Sarawak Energy Berhad connection guidelines in terms of the differentiation of dedicated and shared assets, which leads to unfairness to connecting customers and to inconsistent and non-transparent charging. The findings of the Modified Delphi Technique will be used in the implementation of phases 2 and 3 of the study.

  8. Evaluation of intratympanic formulations for inner ear delivery: methodology and sustained release formulation testing

    PubMed Central

    Liu, Hongzhuo; Feng, Liang; Tolia, Gaurav; Liddell, Mark R.; Hao, Jinsong; Li, S. Kevin

    2013-01-01

    A convenient and efficient in vitro diffusion cell method to evaluate formulations for inner ear delivery via the intratympanic route is currently not available. The existing in vitro diffusion cell systems commonly used to evaluate drug formulations do not resemble the physical dimensions of the middle ear and round window membrane. The objectives of this study were to examine a modified in vitro diffusion cell system of a small diffusion area for studying sustained release formulations in inner ear drug delivery and to identify a formulation for sustained drug delivery to the inner ear. Four formulations and a control were examined in this study using cidofovir as the model drug. Drug release from the formulations in the modified diffusion cell system was slower than that in the conventional diffusion cell system due to the decrease in the diffusion surface area of the modified diffusion cell system. The modified diffusion cell system was able to show different drug release behaviors among the formulations and allowed formulation evaluation better than the conventional diffusion cell system. Among the formulations investigated, poly(lactic-co-glycolic acid)–poly(ethylene glycol)–poly(lactic-co-glycolic acid) triblock copolymer systems provided the longest sustained drug delivery, probably due to their rigid gel structures and/or polymer-to-cidofovir interactions. PMID:23631539

  9. Simultaneous quantification of vitamin E, γ-oryzanols and xanthophylls from rice bran essences extracted by supercritical CO2.

    PubMed

    Sookwong, Phumon; Suttiarporn, Panawan; Boontakham, Pittayaporn; Seekhow, Pattawat; Wangtueai, Sutee; Mahatheeranont, Sugunya

    2016-11-15

    Since the nutritional value of rice is diminished during rice processing, technology that can preserve and sustain its functional compounds is necessary. In this study, supercritical carbon dioxide (SC-CO2) extraction was optimized with respect to operational conditions (time, temperature, pressure and modifier) to extract vitamin E, γ-oryzanols and xanthophylls from rice bran. Simultaneous quantification of the compounds was developed using high-performance liquid chromatography with diode array and fluorescence detectors. Central composite design and response surface methodology were applied to achieve optimum extraction conditions. The optimized conditions were 60 min, 43 °C and 5420 psi with 10% ethanol as a modifier. Pigmented rice bran extracts contained greater amounts of functional phytochemicals than non-pigmented rice bran extracts (0.68, 1410, and non-detectable μg/g compared with 16.65, 2480, and 0.10 μg/g of vitamin E, γ-oryzanols and xanthophylls in non-pigmented and pigmented ones, respectively). SC-CO2 extraction with a modifier would be promising for the preparation of phytochemical essences for therapeutic purposes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. An improved genetic algorithm for designing optimal temporal patterns of neural stimulation

    NASA Astrophysics Data System (ADS)

    Cassar, Isaac R.; Titus, Nathan D.; Grill, Warren M.

    2017-12-01

    Objective. Electrical neuromodulation therapies typically apply constant-frequency stimulation, but non-regular temporal patterns of stimulation may be more effective and more efficient. However, the design space for temporal patterns is exceedingly large, and model-based optimization is required for pattern design. We designed and implemented a modified genetic algorithm (GA) intended to design optimal temporal patterns of electrical neuromodulation. Approach. We tested and modified standard GA methods for application to designing temporal patterns of neural stimulation. We evaluated each modification individually, and all modifications collectively, by comparing performance to the standard GA across three test functions and two biophysically-based models of neural stimulation. Main results. The proposed modifications of the GA significantly improved performance across the test functions and performed best when all were used collectively. The standard GA found patterns that outperformed fixed-frequency, clinically-standard patterns in biophysically-based models of neural stimulation, but the modified GA, in many fewer iterations, consistently converged to higher-scoring, non-regular patterns of stimulation. Significance. The proposed improvements to standard GA methodology reduced the number of iterations required for convergence and identified superior solutions.
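    A minimal sketch of a standard GA on binary stimulation patterns, the baseline that the paper's modifications improve upon. The fitness function here is a toy stand-in (rewarding an efficacy proxy while penalizing total energy), not the biophysical models used in the paper:

    ```python
    import random

    random.seed(0)

    def toy_fitness(pattern):
        """Toy objective: reward pulses early in the pattern, penalize total
        energy -- a hypothetical stand-in for a biophysical stimulation model."""
        half = len(pattern) // 2
        return sum(pattern[:half]) - 0.5 * sum(pattern)

    def evolve(bits=20, pop_size=30, generations=40, mut_rate=0.05):
        pop = [[random.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=toy_fitness, reverse=True)
            elite = pop[: pop_size // 5]                 # elitist selection
            children = []
            while len(elite) + len(children) < pop_size:
                a, b = random.sample(elite, 2)
                cut = random.randrange(1, bits)          # one-point crossover
                child = a[:cut] + b[cut:]
                child = [1 - g if random.random() < mut_rate else g
                         for g in child]                 # bit-flip mutation
                children.append(child)
            pop = elite + children
        return max(pop, key=toy_fitness)

    best = evolve()
    ```
    
    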

  11. Simultaneous trace multielement determination by ICP-OES after solid phase extraction with modified octadecyl silica gel.

    PubMed

    Karbasi, Mohamad-Hadi; Jahanparast, Babak; Shamsipur, Mojtaba; Hassan, Jalal

    2009-10-15

    Multielement simultaneous determination of 35 trace elements in environmental samples was carried out by inductively coupled plasma optical emission spectrometry (ICP-OES) after preconcentration on octadecyl silica gel modified with aurintricarboxylic acid (Aluminon). Optimal experimental conditions, including the pH of the sample solution, sample volume, sample and eluent flow rates, the type, concentration and volume of eluent, and the effect of foreign ions, were investigated and established. Trace element ions in aqueous solution were quantitatively adsorbed onto the modified octadecyl silica gel at pH 8.0 with a flow rate of 11.0 mL min(-1). The adsorbed ions were eluted with 3-5 mL of 0.5 mol L(-1) HNO(3) at a flow rate of 10.0 mL min(-1) and analyzed simultaneously by ICP-OES. The proposed method achieves a preconcentration factor of at least 100 in water samples, enabling highly sensitive determination at trace and ultra-trace levels. The present methodology gave recoveries better than 70% and RSDs less than 16%.

  12. Treatment of swine wastewater using chemically modified zeolite and bioflocculant from activated sludge.

    PubMed

    Guo, Junyuan; Yang, Chunping; Zeng, Guangming

    2013-09-01

    Sterilization, alkaline-thermal and acid-thermal treatments were applied to activated sludge, and the pre-treated sludge was used as a raw material for Rhodococcus R3 to produce polymeric substances. After 60 h of fermentation, bioflocculant yields of 2.7 and 4.2 g L(-1) were obtained from sterilized and alkaline-thermal treated sludge, respectively, compared with 0.9 g L(-1) from acid-thermal treated sludge. Response surface methodology (RSM) was employed to optimize the treatment of swine wastewater using a composite of the bioflocculant and zeolite modified by calcining with MgO. The optimal flocculating conditions were bioflocculant at 24 mg L(-1), modified zeolite at 12 g L(-1), CaCl2 at 16 mg L(-1), pH 8.3 and a contact time of 55 min, under which the removal rates of COD, ammonium and turbidity were 87.9%, 86.9%, and 94.8%, respectively. The use of the RSM-optimized composite provides a feasible way to improve pollutant removal efficiencies and recover high levels of ammonium from wastewater. Copyright © 2013 Elsevier Ltd. All rights reserved.
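    RSM optimizations like this one fit a second-order polynomial response surface to designed experiments and locate its optimum. A minimal two-factor sketch of the model-fitting step via least squares (the factors and coefficients below are illustrative, not the paper's fitted model):

    ```python
    import numpy as np

    def fit_quadratic_surface(X, y):
        """Least-squares fit of a two-factor second-order response surface
        y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2,
        the model form underlying RSM optimization."""
        x1, x2 = X[:, 0], X[:, 1]
        A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coef

    # Recover a known synthetic surface: y = 1 + 2*x1 - x2 + 0.5*x1^2
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(30, 2))
    y = 1 + 2 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] ** 2
    coef = fit_quadratic_surface(X, y)
    ```

    In practice the fitted surface is then maximized (analytically or numerically) over the coded factor ranges to obtain optimal conditions such as those reported above.
    
    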

  13. The electrochemical performance of graphene modified electrodes: an analytical perspective.

    PubMed

    Brownson, Dale A C; Foster, Christopher W; Banks, Craig E

    2012-04-21

    We explore the use of graphene modified electrodes towards the electroanalytical sensing of various analytes, namely dopamine hydrochloride, uric acid, acetaminophen and p-benzoquinone via cyclic voltammetry. In line with literature methodologies, and to investigate the full implications of employing graphene in this electrochemical context, we modify electrode substrates that exhibit either fast or slow electron transfer kinetics (edge- or basal-plane pyrolytic graphite electrodes, respectively) with well-characterised, commercially available graphene that has not been chemically treated, is free from surfactants and, as a result of its fabrication, has an extremely low oxygen content, allowing the true electroanalytical applicability of graphene to be properly de-convoluted and determined. In comparison to the unmodified underlying electrode substrates (constructed from graphite), we find that graphene exhibits a reduced analytical performance in terms of sensitivity, linearity and observed detection limits towards each of the various analytes studied herein. Owing to graphene's structural composition, low proportion of edge plane sites and consequent slow heterogeneous electron transfer rates, there appear to be no advantages, for the analytes studied here, of employing graphene in this electroanalytical context.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bekar, Kursat B; Miller, Thomas Martin; Patton, Bruce W

    The characteristic X-rays produced by the interactions of the electron beam with the sample in a scanning electron microscope (SEM) are usually captured with a variable-energy detector, a process termed energy dispersive spectrometry (EDS). The purpose of this work is to exploit inverse simulations of SEM-EDS spectra to enable rapid determination of sample properties, particularly elemental composition. This is accomplished using penORNL, a modified version of PENELOPE, and a modified version of the traditional Levenberg-Marquardt nonlinear optimization algorithm, which together are referred to as MOZAIK-SEM. The overall conclusion of this work is that MOZAIK-SEM is a promising method for performing inverse analysis of X-ray spectra generated within a SEM. As this methodology exists now, MOZAIK-SEM has been shown to calculate the elemental composition of an unknown sample within a few percent of the actual composition.

  15. Preparation of transition metal nanoparticles and surfaces modified with (CO)polymers synthesized by RAFT

    DOEpatents

    McCormick, III., Charles L.; Lowe, Andrew B.; Sumerlin, Brent S.

    2006-11-21

    A new, facile, general one-phase method of generating thiol-functionalized transition metal nanoparticles and surfaces modified by (co)polymers synthesized by the RAFT method is described. The method includes the steps of forming a (co)polymer in aqueous solution using the RAFT methodology; forming a colloidal transition metal precursor solution from an appropriate transition metal; adding the metal precursor solution or surface to the (co)polymer solution; adding a reducing agent to the solution to reduce the metal colloid in situ to produce the stabilized nanoparticles or surface; and isolating the stabilized nanoparticles or surface in a manner such that aggregation is minimized. The functionalized surfaces generated using these methods can further undergo planar surface modifications, such as functionalization with a variety of different chemical groups, expanding their utility and application.

  16. Preparation of transition metal nanoparticles and surfaces modified with (co)polymers synthesized by RAFT

    DOEpatents

    McCormick, III, Charles L.; Lowe, Andrew B. [Hattiesburg, MS]; Sumerlin, Brent S. [Pittsburgh, PA]

    2011-12-27

    A new, facile, general one-phase method of generating thiol-functionalized transition metal nanoparticles and surfaces modified by (co)polymers synthesized by the RAFT method is described. The method includes the steps of forming a (co)polymer in aqueous solution using the RAFT methodology, forming a colloidal transition metal precursor solution from an appropriate transition metal; adding the metal precursor solution or surface to the (co)polymer solution, adding a reducing agent into the solution to reduce the metal colloid in situ to produce the stabilized nanoparticles or surface, and isolating the stabilized nanoparticles or surface in a manner such that aggregation is minimized. The functionalized surfaces generated using these methods can further undergo planar surface modifications, such as functionalization with a variety of different chemical groups, expanding their utility and application.

  17. Semi-Infinite Geology Modeling Algorithm (SIGMA): a Modular Approach to 3D Gravity

    NASA Astrophysics Data System (ADS)

    Chang, J. C.; Crain, K.

    2015-12-01

    Conventional 3D gravity computations can take days, weeks, or even months, depending on the size and resolution of the data being modeled. Additional modeling runs, due to technical malfunctions or data modifications, compound computation times even further. We propose a new modeling algorithm that utilizes vertical line elements to approximate mass, together with non-gridded (point) gravity observations. This algorithm is (1) orders of magnitude faster than conventional methods, (2) accurate to less than 0.1% error, and (3) modular. The modularity of this methodology means that researchers can modify their geology/terrain or gravity data, and only the modified component needs to be re-run. Additionally, land-, sea-, and air-based platforms can be modeled at their observation points, without having to filter the data into a synthesized grid.
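    The vertical line element has a closed-form vertical attraction, which is what makes such an approach fast. A sketch of this building block (not SIGMA itself; the geometry and units are assumptions for illustration):

    ```python
    import math

    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

    def vline_gz(lam, r, z1, z2):
        """Vertical gravity at the origin from a vertical line element of
        linear density `lam` (kg/m) at horizontal offset `r` (m), spanning
        depths z1..z2 (m, positive downward). Closed form of the integral
        ∫ G*lam*z / (r^2 + z^2)^(3/2) dz over z in [z1, z2]."""
        return G * lam * (1.0 / math.hypot(r, z1) - 1.0 / math.hypot(r, z2))
    ```

    Summing this expression over one line element per terrain/geology column, evaluated directly at each (possibly scattered) observation point, is what allows re-running only the modified columns after a data change.
    
    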

  18. Optimization of the Synthesis of Structured Phosphatidylcholine with Medium Chain Fatty Acid.

    PubMed

    Ochoa-Flores, Angélica A; Hernández-Becerra, Josafat A; Cavazos-Garduño, Adriana; Vernon-Carter, Eduardo J; García, Hugo S

    2017-11-01

    Structured phosphatidylcholine was successfully produced by acidolysis between phosphatidylcholine and free medium-chain fatty acid, using phospholipase A1 immobilized on Duolite A568. Response surface methodology was applied to optimize the reaction system using three process parameters: the molar ratio of substrates (phosphatidylcholine to free medium-chain fatty acid), enzyme loading, and reaction temperature. All parameters showed significant linear and quadratic effects on the production of modified phosphatidylcholine; the molar ratio of substrates contributed positively, whereas temperature contributed negatively. Increased enzyme loading also increased the production of modified phosphatidylcholine, but only during the first 9 hours of the acidolysis reaction. The optimal conditions obtained from the model were a phosphatidylcholine to free medium-chain fatty acid ratio of 1:15, an enzyme loading of 12%, and a temperature of 45°C. Under these conditions, a modified phosphatidylcholine yield of 52.98% was obtained after 24 h of reaction. The prediction was confirmed by verification experiments: the production of modified phosphatidylcholine was 53.02%, the total yield of phosphatidylcholine was 64.28%, and the molar incorporation of medium-chain fatty acid was 42.31%. The acidolysis reaction was scaled up in a batch reactor with similar production of modified phosphatidylcholine, total yield of phosphatidylcholine and molar incorporation of medium-chain fatty acid. Purification of the structured phosphatidylcholine by column chromatography yielded 62.53% of phosphatidylcholine enriched with 42.52% of medium-chain fatty acid.

  19. Hamstring autograft versus soft-tissue allograft in anterior cruciate ligament reconstruction: a systematic review and meta-analysis of randomized controlled trials.

    PubMed

    Cvetanovich, Gregory L; Mascarenhas, Randy; Saccomanno, Maristella F; Verma, Nikhil N; Cole, Brian J; Bush-Joseph, Charles A; Bach, Bernard R

    2014-12-01

    To compare outcomes of anterior cruciate ligament (ACL) reconstruction with hamstring autograft versus soft-tissue allograft by systematic review and meta-analysis. A systematic review of randomized controlled studies comparing hamstring autograft with soft-tissue allograft in ACL reconstruction was performed. Studies were identified by strict inclusion and exclusion criteria. Descriptive statistics were reported. Where possible, the data were pooled and a meta-analysis was performed using RevMan software (The Nordic Cochrane Centre, The Cochrane Collaboration, Copenhagen, Denmark). Dichotomous data were reported as risk ratios, whereas continuous data were reported as standardized mean differences and 95% confidence intervals. Heterogeneity was assessed by use of I² for each meta-analysis. Study methodologic quality was analyzed with the Modified Coleman Methodology Score and Jadad scale. Five studies with 504 combined patients (251 autograft and 253 allograft; 374 male and 130 female patients) with a mean age of 29.9 ± 2.2 years were included. The allografts used were fresh-frozen hamstring, irradiated hamstring, mixture of fresh-frozen and cryopreserved hamstring, fresh-frozen tibialis anterior, and fresh-frozen Achilles tendon grafts without bone blocks. The mean follow-up period was 47.4 ± 26.9 months, with a mean follow-up rate of 83.3% ± 8.6%. Two studies found a longer operative time with autograft than with allograft (77.1 ± 2.0 minutes v 59.9 ± 0.9 minutes, P = .008). Meta-analysis showed no statistically significant differences between autografts and allografts for any outcome measures (P > .05 for all tests). One study found significantly greater laxity for irradiated allograft than for autograft. The methodologic quality of the 5 studies was poor, with a mean Modified Coleman Methodology Score of 54.4 ± 6.9 and mean Jadad score of 1.6 ± 1.5.
On the basis of this systematic review and meta-analysis of 5 randomized controlled trials, there is no statistically significant difference in outcome between patients undergoing ACL reconstruction with hamstring autograft and those undergoing ACL reconstruction with soft-tissue allograft. These results may not extrapolate to younger patient populations. The methodology of the available randomized controlled trials comparing hamstring autograft and soft-tissue allograft is poor. Level II, systematic review of Level I and II studies. Copyright © 2014 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
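
    The risk ratios and I² heterogeneity statistic named in this abstract follow standard formulas; a minimal sketch using invented event counts, not the trial data:

```python
import math

# Hedged sketch of the two summary statistics the abstract names, computed
# outside RevMan. The event counts below are invented for illustration only.
def risk_ratio(e1, n1, e2, n2):
    """Risk ratio with a 95% CI from the usual log-normal approximation."""
    rr = (e1 / n1) / (e2 / n2)
    se = math.sqrt(1 / e1 - 1 / n1 + 1 / e2 - 1 / n2)  # SE of log(RR)
    lo, hi = rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)
    return rr, lo, hi

def i_squared(q, df):
    """Higgins' I^2 (%) from Cochran's Q and its degrees of freedom."""
    return max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

# e.g. hypothetical graft-failure counts in the two arms
rr, lo, hi = risk_ratio(8, 251, 10, 253)
print(round(rr, 2), round(i_squared(6.2, 4), 1))
```

    A CI for the risk ratio that spans 1.0 corresponds to the "no statistically significant difference" finding reported above.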

  20. 1-Propanol probing methodology: two-dimensional characterization of the effect of solute on H2O.

    PubMed

    Koga, Yoshikata

    2013-09-21

    The wording "hydrophobicity/hydrophilicity" has been used in a loose manner based on human experience. We have devised a more quantitative way to redefine "hydrophobes" and "hydrophiles" in terms of the mole fraction dependence pattern of one of the third derivative quantities, the enthalpic interaction between solute molecules. We then devised a thermodynamic methodology to characterize the effect of a solute on H2O in terms of its hydrophobicity and/or hydrophilicity. We use a thermodynamic signature, the enthalpic interaction of 1-propanol, H, to monitor how the test solute modifies H2O. By this method, characterization is facilitated by two indices, one pertaining to hydrophobicity and the other to hydrophilicity. Hence differences among amphiphiles are quantified in a two-dimensional manner. Furthermore, an individual ion can be characterized independent of a counter ion. By using this methodology, we have studied the effects on H2O of a number of solutes, and gained some important new insights. For example, such commonly used examples of hydrophobes in the literature as tetramethyl urea, trimethylamine-N-oxide, and tetramethylammonium salts are in fact surprisingly hydrophilic. Hence conclusions about "hydrophobes" drawn using these samples ought to be interpreted with caution. The effects of anions on H2O found by this methodology follow the same sequence as the Hofmeister ranking, which will no doubt aid further investigation into this enigma in biochemistry. Thus, it is likely that this methodology could play an important role in the characterization of the effects of solutes in H2O, and a perspective view may be useful. Here, we describe the basis on which the methodology was developed, and the methodology itself, in more detail than given in individual papers. We then summarize the results in two-dimensional hydrophobicity/hydrophilicity maps.

  1. Radical probing of spliceosome assembly.

    PubMed

    Grewal, Charnpal S; Kent, Oliver A; MacMillan, Andrew M

    2017-08-01

    Here we describe the synthesis and use of a directed hydroxyl radical probe, tethered to a pre-mRNA substrate, to map the structure of this substrate during the spliceosome assembly process. These studies indicate an early organization and proximation of conserved pre-mRNA sequences during spliceosome assembly. This methodology may be adapted to the synthesis of a wide variety of modified RNAs for use as probes of RNA structure and RNA-protein interaction. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. J-Resistance Curves of Aluminum Specimens Using Moire Interferometry

    DTIC Science & Technology

    1989-04-01

    elastic-plastic fracture mechanics (EPFM) methodologies are based on the J-integral or the crack opening displacement (COD) approach. The J-resistance curve...in the HRR field [13,14]. In this paper, we present further application of the approximate J-evaluation procedure in large 2024-O and 5052-H32 aluminum...Davis, J. A. Joyce, and R. A. Hays, "Application of the J-Integral and the Modified J-Integral to Cases of Large Crack Extension and High Toughness

  3. Application of a substructuring technique to the problem of crack extension and closure

    NASA Technical Reports Server (NTRS)

    Armen, H., Jr.

    1974-01-01

    A substructuring technique, originally developed for the efficient reanalysis of structures, is incorporated into the methodology associated with the plastic analysis of structures. An existing finite-element computer program that accounts for elastic-plastic material behavior under cyclic loading was modified to account for changing kinematic constraint conditions - crack growth and intermittent contact of crack surfaces in two-dimensional regions. Application of the analysis is presented for a center-crack panel problem to demonstrate the efficiency and accuracy of the technique.

  4. The Effects of Soldier Gear Encumbrance on Restraints in a Frontal Crash Environment

    DTIC Science & Technology

    2015-08-31

    their gear poses a challenge in restraint system design that is not typical in the automotive world. •The weight of the gear encumbrance may have a...Distribution Statement A. Approved for public release. TEST METHODOLOGY •A modified rigid steel seat similar to the type used for ECE R16 compliance testing...structure were non-deformable.

  5. Automobile Industry Retail Price Equivalent and Indirect Cost ...

    EPA Pesticide Factsheets

    This report develops a modified multiplier, referred to as an indirect cost (IC) multiplier, which specifically evaluates the components of indirect costs that are likely to be affected by vehicle modifications associated with environmental regulation. A range of IC multipliers is developed that 1) accounts for differences in the technical complexity of required vehicle modifications and 2) adjusts over time as new technologies become assimilated into the automotive production process. The goal is to develop an improved methodology for estimating the indirect costs of new environmental regulations on automobile manufacturers.

  6. Fast and Efficient Drosophila melanogaster Gene Knock-Ins Using MiMIC Transposons

    PubMed Central

    Vilain, Sven; Vanhauwaert, Roeland; Maes, Ine; Schoovaerts, Nils; Zhou, Lujia; Soukup, Sandra; da Cunha, Raquel; Lauwers, Elsa; Fiers, Mark; Verstreken, Patrik

    2014-01-01

    Modern molecular genetics studies necessitate the manipulation of genes in their endogenous locus, but most of the current methodologies require an inefficient donor-dependent homologous recombination step to locally modify the genome. Here we describe a methodology to efficiently generate Drosophila knock-in alleles by capitalizing on the availability of numerous genomic MiMIC transposon insertions carrying recombinogenic attP sites. Our methodology entails the efficient PhiC31-mediated integration of a recombination cassette flanked by unique I-SceI and/or I-CreI restriction enzyme sites into an attP-site. These restriction enzyme sites allow for double-strand break-mediated removal of unwanted flanking transposon sequences, while leaving the desired genomic modifications or recombination cassettes. As a proof-of-principle, we mutated LRRK, tau, and sky by using different MiMIC elements. We replaced 6 kb of genomic DNA encompassing the tau locus and 35 kb encompassing the sky locus with a recombination cassette that permits easy integration of DNA at these loci, and we also generated a functional LRRK(HA) knock-in allele. Given that ~92% of the Drosophila genes are located within the vicinity (<35 kb) of a MiMIC element, our methodology enables the efficient manipulation of nearly every locus in the fruit fly genome without the need for inefficient donor-dependent homologous recombination events. PMID:25298537

  7. Fast and efficient Drosophila melanogaster gene knock-ins using MiMIC transposons.

    PubMed

    Vilain, Sven; Vanhauwaert, Roeland; Maes, Ine; Schoovaerts, Nils; Zhou, Lujia; Soukup, Sandra; da Cunha, Raquel; Lauwers, Elsa; Fiers, Mark; Verstreken, Patrik

    2014-10-08

    Modern molecular genetics studies necessitate the manipulation of genes in their endogenous locus, but most of the current methodologies require an inefficient donor-dependent homologous recombination step to locally modify the genome. Here we describe a methodology to efficiently generate Drosophila knock-in alleles by capitalizing on the availability of numerous genomic MiMIC transposon insertions carrying recombinogenic attP sites. Our methodology entails the efficient PhiC31-mediated integration of a recombination cassette flanked by unique I-SceI and/or I-CreI restriction enzyme sites into an attP-site. These restriction enzyme sites allow for double-strand break-mediated removal of unwanted flanking transposon sequences, while leaving the desired genomic modifications or recombination cassettes. As a proof-of-principle, we mutated LRRK, tau, and sky by using different MiMIC elements. We replaced 6 kb of genomic DNA encompassing the tau locus and 35 kb encompassing the sky locus with a recombination cassette that permits easy integration of DNA at these loci, and we also generated a functional LRRK(HA) knock-in allele. Given that ~92% of the Drosophila genes are located within the vicinity (<35 kb) of a MiMIC element, our methodology enables the efficient manipulation of nearly every locus in the fruit fly genome without the need for inefficient donor-dependent homologous recombination events. Copyright © 2014 Vilain et al.

  8. Solving a methodological challenge in work stress evaluation with the Stress Assessment and Research Toolkit (StART): a study protocol.

    PubMed

    Guglielmi, Dina; Simbula, Silvia; Vignoli, Michela; Bruni, Ilaria; Depolo, Marco; Bonfiglioli, Roberta; Tabanelli, Maria Carla; Violante, Francesco Saverio

    2013-06-22

    Stress evaluation is a field of strong interest, and one that is challenging due to several methodological aspects of the evaluation process. The aim of this study is to propose a study protocol to test a new method (i.e., the Stress Assessment and Research Toolkit) to assess psychosocial risk factors at work. This method addresses several methodological issues (e.g., subjective vs. objective, qualitative vs. quantitative data) by assessing work-related stressors using different kinds of data: i) organisational archival data (organisational indicators sheet); ii) qualitative data (focus group); iii) worker perception (questionnaire); and iv) observational data (observational checklist), using mixed methods research. In addition, it allows positive and negative aspects of work to be considered conjointly, using an approach that considers job demands and job resources at the same time. The integration of these sources of data can reduce the theoretical and methodological bias related to stress research in the work setting, allows researchers and professionals to obtain a reliable description of workers' stress, providing a more articulate vision of psychosocial risks, and allows a large amount of data to be collected. Finally, the implementation of the method ensures in the long term a primary prevention for psychosocial risk management, in that it aims to reduce or modify the intensity, frequency or duration of organisational demands.

  9. Solving a methodological challenge in work stress evaluation with the Stress Assessment and Research Toolkit (StART): a study protocol

    PubMed Central

    2013-01-01

    Background Stress evaluation is a field of strong interest, and one that is challenging due to several methodological aspects of the evaluation process. The aim of this study is to propose a study protocol to test a new method (i.e., the Stress Assessment and Research Toolkit) to assess psychosocial risk factors at work. Design This method addresses several methodological issues (e.g., subjective vs. objective, qualitative vs. quantitative data) by assessing work-related stressors using different kinds of data: i) organisational archival data (organisational indicators sheet); ii) qualitative data (focus group); iii) worker perception (questionnaire); and iv) observational data (observational checklist), using mixed methods research. In addition, it allows positive and negative aspects of work to be considered conjointly, using an approach that considers job demands and job resources at the same time. Discussion The integration of these sources of data can reduce the theoretical and methodological bias related to stress research in the work setting, allows researchers and professionals to obtain a reliable description of workers’ stress, providing a more articulate vision of psychosocial risks, and allows a large amount of data to be collected. Finally, the implementation of the method ensures in the long term a primary prevention for psychosocial risk management, in that it aims to reduce or modify the intensity, frequency or duration of organisational demands. PMID:23799950

  10. Human perception testing methodology for evaluating EO/IR imaging systems

    NASA Astrophysics Data System (ADS)

    Graybeal, John J.; Monfort, Samuel S.; Du Bosq, Todd W.; Familoni, Babajide O.

    2018-04-01

    The U.S. Army's RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) Perception Lab is tasked with supporting the development of sensor systems for the U.S. Army by evaluating human performance of emerging technologies. Typical research questions involve detection, recognition and identification as a function of range, blur, noise, spectral band, image processing techniques, image characteristics, and human factors. NVESD's Perception Lab provides an essential bridge between the physics of the imaging systems and the performance of the human operator. In addition to quantifying sensor performance, perception test results can also be used to generate models of human performance and to drive future sensor requirements. The Perception Lab seeks to develop and employ scientifically valid and efficient perception testing procedures within the practical constraints of Army research, including rapid development timelines for critical technologies, unique guidelines for ethical testing of Army personnel, and limited resources. The purpose of this paper is to describe NVESD Perception Lab capabilities and recent methodological improvements designed to align our methodology more closely with scientific best practice, and to discuss goals for future improvements and expanded capabilities. Specifically, we discuss modifying our methodology to improve training, to account for human fatigue, to improve assessments of human performance, and to increase experimental design consultation provided by research psychologists. Ultimately, this paper outlines a template for assessing human perception and overall system performance related to EO/IR imaging systems.

  11. Effect of the solvent on the size of clay nanoparticles in solution as determined using an ultraviolet-visible (UV-Vis) spectroscopy methodology.

    PubMed

    Alin, Jonas; Rubino, Maria; Auras, Rafael

    2015-06-01

    An ultraviolet-visible (UV-Vis) spectroscopy methodology was developed and utilized for the in situ nanoscale measurement of the size of mineral clay agglomerates in various liquid suspensions. The clays studied were organomodified and unmodified montmorillonite clays (I.44p, Cloisite 93a, and PGN). The methodology was compared and validated against dynamic light scattering (DLS) analysis. The method was able to measure clay agglomerates in solvents in situations where DLS analysis was unsuccessful due to the shapes, polydispersity, and high aspect ratios of the clay particles and the complexity of the aggregates or dispersion medium. The measured clay agglomerates in suspension were found to be in the nanometer range in the more compatible solvents, and their sizes correlated with the Hansen solubility parameter space distance between the clay modifiers and the solvents. Mass detection limits for size determination were in the range from 1 to 9 mg/L. The methodology thus provides simple, rapid, and inexpensive characterization of clays or particles in the nano- or microsize range in low concentrations in various liquid media, including complex mixtures or highly viscous fluids that are difficult to analyze with DLS. In addition, by combining UV-Vis spectroscopy with DLS it was possible to discern flocculation behavior in liquids, which otherwise could result in false size measurements by DLS alone.
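
    The Hansen solubility parameter space distance invoked above is a standard quantity; a minimal sketch with illustrative parameter values (the clay-modifier values are invented, not those used in the study):

```python
import math

# Hedged sketch of the Hansen solubility parameter "distance" Ra that the
# abstract correlates with agglomerate size. The factor of 4 on the
# dispersion term is part of the standard definition.
def hansen_distance(dD1, dP1, dH1, dD2, dP2, dH2):
    """Ra (MPa^0.5) between two materials in Hansen space
    (dispersion, polar, hydrogen-bonding components)."""
    return math.sqrt(4 * (dD1 - dD2) ** 2 + (dP1 - dP2) ** 2 + (dH1 - dH2) ** 2)

# e.g. an assumed clay-modifier point compared against two solvents
# (THF and water parameters are typical literature values)
ra_thf = hansen_distance(16.5, 3.1, 5.0, 16.8, 5.7, 8.0)
ra_water = hansen_distance(16.5, 3.1, 5.0, 15.5, 16.0, 42.3)
print(ra_thf, ra_water)
```

    A smaller Ra indicates better modifier-solvent compatibility, which is the direction of the size correlation reported above.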

  12. Mapping human long bone compartmentalisation during ontogeny: a new methodological approach.

    PubMed

    Cambra-Moo, Oscar; Nacarino Meneses, Carmen; Rodríguez Barbero, Miguel Ángel; García Gil, Orosia; Rascón Pérez, Josefina; Rello-Varona, Santiago; Campo Martín, Manuel; González Martín, Armando

    2012-06-01

    Throughout ontogeny, human bones undergo differentiation in terms of shape, size and tissue type; this is a complex scenario in which the variations in the tissue compartmentalisation of the cortical bone are still poorly understood. Currently, compartmentalisation is studied using methodologies that oversimplify the bone tissue complexity. Here, we present a new methodological approach that integrates a histological description and a mineral content analysis to study the compartmentalisation of the whole mineralised and non-mineralised tissues (i.e., their spatial distribution in long bone sections). This new methodology, based on Geographical Information System (GIS) software, allows us to draw areas of interest (i.e., to trace quantifiable vectorial shapes) in raw images extracted from the microscope and to compare them spatially in a semi-automatic and quantitative fashion. As an example of our methodology, we have studied the tibiae of individuals with different ages at death (infant, juvenile and adult). The tibia's cortical bone presents well-formed fibrolamellar bone, in which remodelling is clearly evidenced from early ontogeny, and we discuss the existence of "lines of arrested growth". Concurrent with the histological variation, Raman and FT-IR spectroscopy analyses corroborate that the mineral content in the cortical bone changes differentially. The anterior portion of the tibia remains highly pierced and is less crystalline than the rest of the cortex during growth, which is evidence of more active and continuous remodelling. Finally, while porosity and other "non-mineralised cavities" are largely modified, the mineralised portion and the marrow cavity size persist proportionally during ontogeny. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. Enhanced methodology for porting ion chromatography retention data.

    PubMed

    Park, Soo Hyun; Shellie, Robert A; Dicinoski, Greg W; Schuster, Georg; Talebi, Mohammad; Haddad, Paul R; Szucs, Roman; Dolan, John W; Pohl, Christopher A

    2016-03-04

    Porting is a powerful methodology to recalibrate an existing database of ion chromatography (IC) retention times by reflecting the changes of column behavior resulting from either batch-to-batch variability in the production of the column or the manufacture of new versions of a column. This approach has been employed to update extensive databases of retention data of inorganic and organic anions forming part of the "Virtual Column" software marketed by Thermo Fisher Scientific, which is the only available commercial optimization tool for IC separation. The current porting process is accomplished by performing three isocratic separations with two representative analyte ions in order to derive a porting equation which expresses the relationship between old and new data. Although the accuracy of retention prediction is generally enhanced on new columns, errors were observed on some columns. In this work, the porting methodology was modified in order to address this issue, where the porting equation is now derived by using six representative analyte ions (chloride, bromide, iodide, perchlorate, sulfate, and thiosulfate). Additionally, the updated porting methodology has been applied on three Thermo Fisher Scientific columns (AS20, AS19, and AS11HC). The proposed approach showed that the new porting methodology can provide more accurate and robust retention prediction on a wide range of columns, where average errors in retention times for ten test anions under three eluent conditions were less than 1.5%. Moreover, the retention prediction using this new approach provided an acceptable level of accuracy on a used column exhibiting changes in ion-exchange capacity. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
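
    The abstract does not state the functional form of the porting equation; a minimal sketch assuming a simple linear map between old- and new-column retention times, fit from the six representative anions (all retention times below are invented for illustration):

```python
import numpy as np

# Hedged sketch: recalibrating a retention-time database from a handful of
# representative analytes. The linear form and all values are assumptions,
# not the Virtual Column software's actual porting equation.
ions = ["chloride", "bromide", "iodide", "perchlorate", "sulfate", "thiosulfate"]
t_old = np.array([3.1, 4.8, 9.5, 14.2, 6.0, 11.3])   # min, database (old column)
t_new = np.array([3.0, 4.6, 9.0, 13.5, 5.8, 10.8])   # min, measured (new column)

# fit t_new = slope * t_old + intercept from the six calibration ions
slope, intercept = np.polyfit(t_old, t_new, 1)

def port(t):
    """Predict a retention time on the new column from a database value t."""
    return slope * t + intercept

print(round(port(7.0), 2))
```

    Once the two fitted constants are known, every entry in the existing database can be ported without re-measuring it, which is the point of the approach described above.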

  14. Interventions to modify sexual risk behaviours for preventing HIV in homeless youth

    PubMed Central

    Naranbhai, Vivek; Karim, Quarraisha Abdool; Meyer-Weitz, Anna

    2013-01-01

    Background Homeless youth are at high risk for HIV infection as a consequence of risky sexual behavior. Interventions in homeless youth are challenging. Assessment of the effectiveness of interventions to modify sexual risk behaviours for preventing HIV in homeless youth is needed. Objectives To evaluate and summarize the effectiveness of interventions for modifying sexual risk behaviours and preventing transmission of HIV among homeless youth. Search methods We searched electronic databases (CENTRAL, Medline, EMBASE, AIDSearch, Gateway, PsycInfo, LILACS), reference lists of eligible articles, international health agency publication lists, and clinical trial registries. The search was updated in January 2010. We contacted authors of published reports and other key role players. Selection criteria Randomized studies of interventions to modify sexual risk behavior (biological, self-report sexual-risk behavior or health seeking behavior) in homeless youth (12–24 years). Data collection and analysis Data from eligible studies were extracted by two reviewers. We assessed risk of bias per the Cochrane Collaboration's tool. None of the eligible studies reported any primary biological outcomes for this review, and the reporting of self-report sexual risk behavior outcomes was highly variable across studies, precluding calculation of summary measures of effect; we present the outcomes descriptively for each study. We contacted authors for missing or ambiguous data. Results We identified three eligible studies after screening a total of 255 unique records. All three were performed in the United States of America and recruited substance-abusing male and female adolescents (total N=615) through homeless shelters into randomised controlled trials of independent and non-overlapping behavioural interventions. The three trials differed in theoretical background, delivery method, dosage (number of sessions), content, and outcome assessments.
Overall, the variability in delivery and outcomes precluded estimation of summary measures of effect. We assessed the risk of bias to be high for each of the studies. Whilst some effects of the interventions on outcome measures were reported, heterogeneity and lack of robustness in these studies necessitate caution in interpreting the effectiveness of these interventions. Authors’ conclusions The body of evidence does not permit conclusions on the impact of interventions to modify sexual risk behaviour in homeless youth. More research is required. While the psychosocial and contextual factors that fuel sexual risk behaviours among homeless youth challenge the stringent methodologies of RCTs, novel ways for program delivery and trial retention need to be developed. Future trials should endeavour to comply with rigorous methodology in design, delivery, outcome measurement and reporting. PMID:21249691

  15. Cloning cattle: the methods in the madness.

    PubMed

    Oback, Björn; Wells, David N

    2007-01-01

    Somatic cell nuclear transfer (SCNT) is much more widely and efficiently practiced in cattle than in any other species, making this arguably the most important mammal cloned to date. While the initial objective behind cattle cloning was commercially driven--in particular to multiply genetically superior animals with desired phenotypic traits and to produce genetically modified animals--researchers have now started to use bovine SCNT as a tool to address diverse questions in developmental and cell biology. In this paper, we review current cattle cloning methodologies and their potential technical or biological pitfalls at any step of the procedure. In doing so, we focus on one methodological parameter, namely donor cell selection. We emphasize the impact of epigenetic and genetic differences between embryonic, germ, and somatic donor cell types on cloning efficiency. Lastly, we discuss adult phenotypes and fitness of cloned cattle and their offspring and illustrate some of the more imminent commercial cattle cloning applications.

  16. An Overview of Ophthalmologic Survey Methodology in the 2008-2015 Korean National Health and Nutrition Examination Surveys.

    PubMed

    Yoon, Kyung Chul; Choi, Won; Lee, Hyo Seok; Kim, Sang-Duck; Kim, Seung-Hyun; Kim, Chan Yun; Park, Ki Ho; Park, Young Jeung; Baek, Seung-Hee; Song, Su Jeong; Shin, Jae Pil; Yang, Suk-Woo; Yu, Seung-Young; Lee, Jong Soo; Lim, Key Hwan; Oh, Kyung Won; Kang, Se Woong

    2015-12-01

    The Korea National Health and Nutrition Examination Survey (KNHANES) is a national program designed to assess the health and nutritional status of the noninstitutionalized population of South Korea. The KNHANES was initiated in 1998 and has been conducted annually since 2007. Starting in the latter half of 2008, ophthalmologic examinations were included in the survey in order to investigate the prevalence and risk factors of common eye diseases such as visual impairment, refractive errors, strabismus, blepharoptosis, cataract, pterygium, diabetic retinopathy, age-related macular degeneration, glaucoma, dry eye disease, and color vision deficiency. The measurements included in the ophthalmic questionnaire and examination methods were modified in the KNHANES IV, V, and VI. In this article, we provide detailed information about the methodology of the ophthalmic examinations in KNHANES in order to aid in further investigations related to major eye diseases in South Korea.

  17. Computer-Aided Methodology for Syndromic Strabismus Diagnosis.

    PubMed

    Sousa de Almeida, João Dallyson; Silva, Aristófanes Corrêa; Teixeira, Jorge Antonio Meireles; Paiva, Anselmo Cardoso; Gattass, Marcelo

    2015-08-01

    Strabismus is a pathology that affects approximately 4% of the population, causing aesthetic problems reversible at any age and irreversible sensory alterations that modify the vision mechanism. The Hirschberg test is one type of examination for detecting this pathology. Computer-aided detection/diagnosis is being used with relative success to aid health professionals. Nevertheless, the routine use of high-tech devices for aiding ophthalmological diagnosis and therapy is not a reality within the subspecialty of strabismus. Thus, this work presents a methodology to aid in the diagnosis of syndromic strabismus through digital imaging. Two hundred images belonging to 40 patients previously diagnosed by a specialist were tested. The method was demonstrated to be 88% accurate in identifying esotropias (ET), 100% for exotropias (XT), 80.33% for hypertropias (HT), and 83.33% for hypotropias (HoT). The overall average error was 5.6Δ for horizontal deviations and 3.83Δ for vertical deviations, against the measures presented by the specialist.
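
    The Hirschberg test quantifies deviation from the decentration of the corneal light reflex; a minimal sketch of that conversion step only (not the paper's image-processing pipeline; the 22 Δ/mm ratio and the sign convention are assumptions for illustration, and reported Hirschberg ratios vary):

```python
# Hedged sketch: converting a measured corneal light-reflex decentration to
# an ocular deviation in prism diopters. The ratio and sign convention are
# illustrative assumptions, not values from the paper.
HIRSCHBERG_RATIO_PD_PER_MM = 22.0  # assumed; literature values vary

def deviation_pd(decentration_mm):
    """Ocular deviation (prism diopters) from reflex decentration (mm)."""
    return decentration_mm * HIRSCHBERG_RATIO_PD_PER_MM

def classify(horizontal_pd):
    """Assumed sign convention: positive (nasal reflex shift) -> exotropia
    (XT); negative (temporal shift) -> esotropia (ET)."""
    if horizontal_pd > 0:
        return "XT"
    if horizontal_pd < 0:
        return "ET"
    return "ortho"

print(deviation_pd(0.5), classify(deviation_pd(0.5)))
```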

  18. Assessing avian richness in remnant wetlands: Towards an improved methodology

    USGS Publications Warehouse

    Krzys, Greg; Waite, Thomas A.; Stapanian, Martin; Vucetich, John A.

    2002-01-01

    Because the North American Breeding Bird Survey provides inadequate coverage of wetland habitat, the Wetland Breeding Bird Survey was recently established in Ohio, USA. This program relies on volunteers to conduct 3 counts at each monitored wetland. Currently, all counts are conducted during the morning. Under the premise that volunteer participation could be increased by allowing evening counts, we evaluated the potential for modifying the methodology. We evaluated the sampling efficiency of all 3-count combinations of morning and evening counts using data collected at 14 wetlands. Estimates of overall species richness decreased with increasing numbers of evening counts. However, this pattern did not hold when analyses were restricted to wetland-dependent species or those of conservation concern. Our findings suggest that it would be reasonable to permit evening counts, particularly if the data are to be used to monitor wetland-dependent species and those of concern.

  19. Case Study: Applying OpenEHR Archetypes to a Clinical Data Repository in a Chinese Hospital.

    PubMed

    Min, Lingtong; Wang, Li; Lu, Xudong; Duan, Huilong

    2015-01-01

    openEHR is a flexible and scalable modeling methodology for clinical information and has been widely adopted in Europe and Australia. Due to differences in clinical process and management, there are few research projects involving openEHR in China. To investigate the feasibility of the openEHR methodology for clinical information modeling in China, this paper carries out a case study applying openEHR archetypes to a Clinical Data Repository (CDR) in a Chinese hospital. The results show that a set of 26 archetypes covers all the concepts used in the CDR. Of these, 9 (34.6%) are reused without change, 10 are modified and/or extended, and 7 are newly defined. The reasons for modification, extension and new definition are discussed, including the granularity of archetypes, metadata-level versus data-level modeling, and the representation of relationships between archetypes.

  20. Modified Dynamic Inversion to Control Large Flexible Aircraft: What's Going On?

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.

    1999-01-01

    High performance aircraft of the future will be designed lighter, more maneuverable, and to operate over an ever-expanding flight envelope. From the flight control perspective, one of the largest differences between current and future advanced aircraft is elasticity. Over the last decade, the dynamic inversion methodology has gained considerable popularity in application to highly maneuverable fighter aircraft, which were treated as rigid vehicles. This paper explores the application of dynamic inversion to an advanced, highly flexible aircraft. An initial application has been made to a large flexible supersonic aircraft. In the course of controller design for this advanced vehicle, modifications were made to the standard dynamic inversion methodology. The results of this application were deemed promising. An analytical study has been undertaken to better understand the nature of these modifications and to determine their general applicability. This paper presents the results of this initial analytical look at the modifications to dynamic inversion for the control of large flexible aircraft.
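
    The core idea of standard dynamic inversion can be illustrated on a scalar toy plant; the model, gains, and reference used below are invented for illustration and are unrelated to the paper's flexible-aircraft dynamics. The control cancels the known nonlinearity and imposes desired linear error dynamics.

```python
import math

# Toy plant: x_dot = f(x) + g(x) * u   (illustrative assumption, not the
# aircraft model from the paper)
def f(x):
    return math.sin(x)                 # unwanted nonlinear dynamics

def g(x):
    return 1.0 + 0.5 * math.cos(x)     # control effectiveness (never zero)

def dynamic_inversion(x, x_ref, k=2.0):
    """Choose u so the closed loop becomes x_dot = -k (x - x_ref)."""
    v = -k * (x - x_ref)               # desired linear error dynamics
    return (v - f(x)) / g(x)           # invert the plant

# Forward-Euler simulation of the closed loop
x, x_ref, dt = 0.0, 1.0, 0.01
for _ in range(2000):
    u = dynamic_inversion(x, x_ref)
    x += dt * (f(x) + g(x) * u)
# x has converged to x_ref: exact inversion leaves purely linear dynamics
```

    The modifications studied in the paper address what happens when this inversion is only approximate, as with unmodeled flexible modes.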

  1. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps and incidents are attributed to human error. As part of Safety within space exploration ground processing operations, the underlying contributors to and causes of human error must be identified and classified in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factors Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, assess their impact on human error events, and predict the Human Error Probabilities (HEPs) of future occurrences. The methodology was applied retrospectively to six NASA ground processing operations scenarios and thirty years of launch-vehicle-related mishap data. This modifiable framework can be followed by other space and similarly complex operations.

  3. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps and incidents are attributed to human error. As part of Quality within space exploration ground processing operations, the underlying contributors to and causes of human error must be identified and classified in order to manage human error. This presentation will provide a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factors Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, assess their impact on human error events, and predict the Human Error Probabilities (HEPs) of future occurrences. The methodology was applied retrospectively to six NASA ground processing operations scenarios and thirty years of launch-vehicle-related mishap data. This modifiable framework can be followed by other space and similarly complex operations.

  4. 2dFLenS and KiDS: determining source redshift distributions with cross-correlations

    NASA Astrophysics Data System (ADS)

    Johnson, Andrew; Blake, Chris; Amon, Alexandra; Erben, Thomas; Glazebrook, Karl; Harnois-Deraps, Joachim; Heymans, Catherine; Hildebrandt, Hendrik; Joudaki, Shahab; Klaes, Dominik; Kuijken, Konrad; Lidman, Chris; Marin, Felipe A.; McFarland, John; Morrison, Christopher B.; Parkinson, David; Poole, Gregory B.; Radovich, Mario; Wolf, Christian

    2017-03-01

    We develop a statistical estimator to infer the redshift probability distribution of a photometric sample of galaxies from its angular cross-correlation in redshift bins with an overlapping spectroscopic sample. This estimator is a minimum-variance weighted quadratic function of the data: a quadratic estimator. This extends and modifies the methodology presented by McQuinn & White. The derived source redshift distribution is degenerate with the source galaxy bias, which must be constrained via additional assumptions. We apply this estimator to constrain source galaxy redshift distributions in the Kilo-Degree imaging survey through cross-correlation with the spectroscopic 2-degree Field Lensing Survey, presenting results first as a binned step-wise distribution in the range z < 0.8, and then building a continuous distribution using a Gaussian process model. We demonstrate the robustness of our methodology using mock catalogues constructed from N-body simulations, and comparisons with other techniques for inferring the redshift distribution.

  5. The fuzzy cube and causal efficacy: representation of concomitant mechanisms in stroke.

    PubMed

    Jobe, Thomas H.; Helgason, Cathy M.

    1998-04-01

    Twentieth-century medical science has embraced nineteenth-century Boolean probability theory based upon two-valued Aristotelian logic. With the later addition of bit-based, von Neumann-structured computational architectures, an epistemology based on randomness has led to a bivalent epidemiological methodology that dominates medical decision making. In contrast, fuzzy logic, based on twentieth-century multi-valued logic and computational structures that are content-addressed and adaptively modified, has advanced a new scientific paradigm for the twenty-first century. Diseases such as stroke involve multiple concomitant causal factors that are difficult to represent using conventional statistical methods. We tested which paradigm best represented this complex, multi-causal clinical phenomenon: stroke. We show that the fuzzy logic paradigm better represented clinical complexity in cerebrovascular disease than the current methodology based on probability theory. We believe this finding is generalizable to all of clinical science, since multiple concomitant causal factors are involved in nearly all known pathological processes.
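
    The contrast between the bivalent and fuzzy views can be sketched in a few lines; the membership degrees below are invented for illustration, not the paper's data. A crisp (two-valued) classifier must declare exactly one stroke mechanism "true", while fuzzy membership lets several concomitant mechanisms hold to a degree, combined with the standard min/max operators.

```python
# Degrees (on [0, 1]) to which each mechanism contributes to one
# hypothetical patient's stroke -- illustrative values only.
patient = {"large_vessel": 0.7, "cardioembolic": 0.6, "small_vessel": 0.2}

def crisp(mechanisms):
    """Bivalent view: exactly one cause is 'true', the rest are 'false'."""
    winner = max(mechanisms, key=mechanisms.get)
    return {m: (1.0 if m == winner else 0.0) for m in mechanisms}

def fuzzy_and(a, b):   # standard fuzzy intersection
    return min(a, b)

def fuzzy_or(a, b):    # standard fuzzy union
    return max(a, b)

crisp_view = crisp(patient)
# Degree to which the stroke is *both* large-vessel and cardioembolic:
both = fuzzy_and(patient["large_vessel"], patient["cardioembolic"])
either = fuzzy_or(patient["large_vessel"], patient["cardioembolic"])
```

    The crisp view discards the cardioembolic contribution entirely, whereas the fuzzy view retains both concomitant mechanisms with graded strength.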

  6. Transmutation of singularities and zeros in graded index optical instruments: a methodology for designing practical devices.

    PubMed

    Hooper, I R; Philbin, T G

    2013-12-30

    We describe a design methodology for modifying the refractive index profile of graded-index optical instruments that incorporate singularities or zeros in their refractive index. The process maintains the device performance whilst resulting in graded profiles that are all-dielectric, do not require materials with unrealistic values, and are impedance matched to the bounding medium. This is achieved by transmuting the singularities (or zeros) using the formalism of transformation optics, but with an additional boundary condition requiring that the gradient of the coordinate transformation be continuous. This additional boundary condition ensures that the device is impedance matched to the bounding medium when the spatially varying permittivity and permeability profiles are scaled to realizable values. We demonstrate the method in some detail for an Eaton lens, before describing the profiles for an "invisible disc" and "multipole" lenses.

  7. [The root of the deep and fast ongoing evolution of both structure and methodology of clinical research].

    PubMed

    Tavazzi, Luigi

    2016-03-01

    The growing scientific knowledge and technological development are leading to radical changes in biological and medical research. The prevalent lines of development include a pragmatic evolution of controlled clinical trials; a massive diffusion of observational research, which is progressively being incorporated into clinical practice; new models and designs for clinical research; the systematic use of information technology to build vast networks of medical centers producing huge amounts of shared data to be managed with big-data methodology; personalized as well as precision medicine; and a reshaped physician-patient relationship based on a co-working principle. All this is leading to profound changes in public health governance, a renewal of clinical epidemiology and prevention, and a modified structure of several specific sectors of medical care, hopefully guided by scientific evidence. A few aspects of this evolving picture are discussed in this article.

  8. Tracking and Control of Gas Turbine Engine Component Damage/Life

    NASA Technical Reports Server (NTRS)

    Jaw, Link C.; Wu, Dong N.; Bryg, David J.

    2003-01-01

    This paper describes damage mechanisms and methods of controlling damage to extend the on-wing life of critical gas turbine engine components. In particular, two types of damage mechanisms are discussed: creep/rupture and thermo-mechanical fatigue. To control this damage and extend the life of engine hot-section components, we have investigated two methodologies to be implemented as additional control logic for the on-board electronic control unit. This new logic, the life-extending control (LEC), interacts with the engine control and monitoring unit and modifies the fuel flow to reduce component damage over a flight mission. The LEC methodologies were demonstrated in a real-time, hardware-in-the-loop simulation. The results show that LEC is not only a new paradigm for engine control design, but also a promising technology for extending the service life of engine components, hence reducing the life-cycle cost of the engine.

  9. Simulating Colour Vision Deficiency from a Spectral Image.

    PubMed

    Shrestha, Raju

    2016-01-01

    People with colour vision deficiency (CVD) have difficulty seeing full colour contrast and can miss some of the features in a scene. As part of universal design, researchers have been working on how to modify and enhance the colours of images so that such viewers see the scene with good contrast. For this, it is important to know how the original colour image is seen by individuals with different types of CVD. This paper proposes a methodology to simulate accurate colour-deficient images from a spectral image using the cone sensitivities of different cases of deficiency. As the method enables the generation of accurate colour-deficient images, it is believed to foster a better understanding of the limitations imposed by colour vision deficiency, which in turn can lead to the design and development of more effective imaging technologies for better and wider accessibility in the context of universal design.
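
    The core computation, integrating a spectral pixel against cone sensitivity curves, can be sketched as follows. The Gaussian-shaped sensitivities, the peak wavelengths, the peak-shift model of anomalous trichromacy, and the sample spectrum are all simplifying assumptions for illustration, not measured data or the paper's method in detail.

```python
import math

WAVELENGTHS = list(range(400, 701, 10))   # nm, 10 nm sampling

def gaussian(peak, width):
    """Toy cone sensitivity curve (illustrative, not measured CMFs)."""
    return [math.exp(-((w - peak) / width) ** 2) for w in WAVELENGTHS]

# Approximate peak sensitivities: S ~ 440 nm, M ~ 545 nm, L ~ 565 nm
CONES = {"L": gaussian(565, 50), "M": gaussian(545, 45), "S": gaussian(440, 30)}

def cone_response(spectrum, sensitivity):
    """Integrate spectral power against a cone sensitivity curve."""
    return sum(p * s for p, s in zip(spectrum, sensitivity))

def lms(spectrum, shift_L=0):
    """LMS triplet; shift_L > 0 moves the L-cone peak toward the M cone,
    a crude model of protanomaly."""
    l_sens = gaussian(565 - shift_L, 50) if shift_L else CONES["L"]
    return (cone_response(spectrum, l_sens),
            cone_response(spectrum, CONES["M"]),
            cone_response(spectrum, CONES["S"]))

reddish = [1.0 if w >= 600 else 0.1 for w in WAVELENGTHS]  # red-heavy pixel
normal = lms(reddish)
protanomalous = lms(reddish, shift_L=20)  # L peak shifted toward M
```

    For the red-heavy pixel, the shifted L cone responds less strongly while M and S are unchanged, which is the kind of per-pixel difference a full simulation would render across the spectral image.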

  10. Response surface methodology for the determination of the design space of enantiomeric separations on cinchona-based zwitterionic chiral stationary phases by high performance liquid chromatography.

    PubMed

    Hanafi, Rasha Sayed; Lämmerhofer, Michael

    2018-01-26

    A Quality-by-Design approach to enantioselective HPLC method development surpasses Quality-by-Testing in offering the optimal separation conditions with the fewest experiments and in its ability to describe the method's design space visually, which helps to determine enantiorecognition to a significant extent. Although some schemes exist for enantiomeric separations on Cinchona-based zwitterionic stationary phases, the exact design space and the weights by which each of the chromatographic parameters influences the separation have not yet been studied statistically. In the current work, a screening design followed by a Response Surface Methodology optimization design was adopted to optimize the enantioseparation of three model drugs, namely the acidic Fmoc-leucine, the amphoteric tryptophan and the basic salbutamol. The screening design proved that the acid/base additives are of utmost importance for the three chiral drugs, and that among three different pairs of acids and bases, acetic acid and diethylamine is the couple able to provide acceptable resolution under variable conditions. Visualization of the response surfaces of the retention factor, separation factor and resolution helped describe accurately the magnitude by which each chromatographic factor (% MeOH, concentration and ratio of acid/base modifiers) affects the separation while interacting with the other parameters. The global optima compromising the highest enantioresolution with the shortest run time for the three chiral model drugs varied widely: it was best to set a low % methanol with an equal ratio of acid/base modifiers for the acidic drug, a very high % methanol and a 10-fold higher concentration of the acid for the amphoteric drug, while a 20-fold excess of the base modifier with moderate % methanol was needed for the basic drug. Considering the selected drugs as models for many series of structurally related compounds, the design space defined and the optimum conditions computed are the key to method development on Cinchona-based chiral stationary phases. Copyright © 2017 Elsevier B.V. All rights reserved.
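
    The Response Surface Methodology step amounts to fitting a second-order polynomial to a designed set of experiments and locating its stationary point. The sketch below uses a made-up quadratic "resolution" response on a 3x3 coded grid in place of real chromatographic measurements; the factor names are only loosely inspired by the abstract.

```python
# Fit y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 and
# find the optimum of the fitted surface (pure-Python least squares).

def features(x1, x2):
    return [1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2]

def solve(A, rhs):
    """Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_quadratic(xs, ys):
    """Normal-equation least squares for the 6 quadratic coefficients."""
    F = [features(*x) for x in xs]
    n = 6
    AtA = [[sum(row[i] * row[j] for row in F) for j in range(n)] for i in range(n)]
    Aty = [sum(F[r][i] * ys[r] for r in range(len(F))) for i in range(n)]
    return solve(AtA, Aty)

# Illustrative response: resolution peaks at coded %MeOH = 0.2,
# coded acid/base modifier ratio = -0.4 (invented values).
grid = [(x1, x2) for x1 in (-1, 0, 1) for x2 in (-1, 0, 1)]
resolution = [2.0 - 1.5 * (x1 - 0.2) ** 2 - 0.8 * (x2 + 0.4) ** 2
              for x1, x2 in grid]

b = fit_quadratic(grid, resolution)
# Stationary point; the generated data has no interaction term (b12 ~ 0),
# so the two factors decouple.
opt = (-b[1] / (2 * b[3]), -b[2] / (2 * b[4]))
```

    With real data the fitted surface would also carry an interaction term, and the stationary point would be found by solving the 2x2 gradient system instead of the decoupled formulas used here.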

  11. Functionalization of silicon oxide using supercritical fluid deposition of 3,4-epoxybutyltrimethoxysilane for the immobilization of amino-modified oligonucleotide

    NASA Astrophysics Data System (ADS)

    Rull, Jordi; Nonglaton, Guillaume; Costa, Guillaume; Fontelaye, Caroline; Marchi-Delapierre, Caroline; Ménage, Stéphane; Marchand, Gilles

    2015-11-01

    The functionalization of silicon oxide based substrates using silanes is generally performed through liquid-phase methodologies. These processes involve large quantities of potentially toxic solvents and present important disadvantages for the functionalization of microdevices or porous materials, such as low diffusion. To overcome this drawback, solvent-free methodologies such as molecular vapor deposition (MVD) and supercritical fluid deposition (SFD) have been developed. In this paper, the deposition of 3,4-epoxybutyltrimethoxysilane (EBTMOS) on silicon oxide using supercritical carbon dioxide (scCO2) as a solvent is studied for the first time. The oxirane ring of epoxy silanes readily reacts with amine groups and is of particular interest for the grafting of amino-modified oligonucleotides or antibodies for diagnostic applications. The ability of this specific EBTMOS layer to react with amine functions has been evaluated through the immobilization of amino-modified oligonucleotide probes. The presence of the probes is revealed by fluorescence after hybridization with a fluorescent target oligonucleotide. The performance of SFD of EBTMOS has been optimized and then compared with the dip-coating and molecular vapor deposition methods, evidencing better grafting efficiency and homogeneity and a shorter reaction time, in addition to the eco-friendly properties of supercritical carbon dioxide. The epoxysilane layers have been characterized by the surface-enhanced ellipsometric contrast optical technique, atomic force microscopy, multiple internal reflection infrared spectroscopy and X-ray photoelectron spectroscopy. The shelf life of the 3,4-epoxybutyltrimethoxysilane coating layer has also been studied. Finally, two different strategies for NH2-oligonucleotide grafting on the EBTMOS coating layer have been compared, i.e. reductive amination and nucleophilic substitution (SN2). This EBTMOS-based coating layer can be used for a wide range of applications, such as the preparation of new supported and recoverable catalysts and new integrated silicon microdevices for healthcare purposes.

  12. Engineering single-molecule, nanoscale, and microscale bio-functional materials via click chemistry

    NASA Astrophysics Data System (ADS)

    Daniele, Michael Angelo-Anthony

    To expand the design envelope and supplement the materials library available to biomaterials scientists, the copper(I)-catalyzed azide-alkyne cycloaddition (CuCAAC) was explored as a route to design, synthesize and characterize bio-functional small-molecules, nanoparticles, and microfibers. In each engineered system, the use of click chemistry provided facile, bio-orthogonal control for materials synthesis; moreover, the results provided a methodology and more complete, fundamental understanding of the use of click chemistry as a tool for the synergy of biotechnology, polymer and materials science. Fluorophores with well-defined photophysical characteristics (ranging from UV to NIR fluorescence) were used as building blocks for small-molecule, fluorescent biosensors. Fluorophores were paired to exhibit fluorescence resonant energy transfer (FRET) and used to probe the metabolic activity of carbazole 1,9a-dioxygenase (CARDO). The FRET pair exhibited a significant variation in PL response with exposure to the lysate of Pseudomonas resinovorans CA10, an organism which can degrade variants of both the donor and acceptor fluorophores. Nanoparticle systems were modified via CuCAAC chemistry to carry affinity tags for CARDO and were subsequently utilized for affinity based bioseparation of CARDO from crude cell lysate. The enzymes were baited with an azide-modified carbazolyl-moiety attached to a poly(propargyl acrylate) nanoparticle. Magnetic nanocluster systems were also modified via CuCAAC chemistry to carry fluorescent imaging tags. The iron-oxide nanoclusters were coated with poly(acrylic acid-co-propargyl acrylate) to provide a clickable surface. Ultimately, alternate Cu-free click chemistries were utilized to produce biohybrid microfibers. The biohybrid microfibers were synthesized under benign photopolymerization conditions inside a microchannel, allowing the encapsulation of viable bacteria. 
By adjusting pre-polymer solutions and laminar flow rates within the microchannel, the morphology, hydration, and thermal properties of the fibers were easily tuned. The methodology produced hydrogel fibers that sustained viable cells as demonstrated by the encapsulation and subsequent proliferation of Bacillus cereus and Escherichia coli communities.

  13. Restoring Natural Streamflow Variability by Modifying Multi-purpose Reservoir Operation

    NASA Astrophysics Data System (ADS)

    Shiau, J.

    2010-12-01

    Multi-purpose reservoirs typically provide the benefits of water supply, hydroelectric power, and flood mitigation. Hydroelectric power generation generally does not consume water; however, the temporal distribution of downstream flows is greatly changed by hydro-peaking effects. Together with offstream diversion of water supplies for municipal, industrial, and agricultural requirements, the natural streamflow characteristics of magnitude, duration, frequency, timing, and rate of change are significantly altered by multi-purpose reservoir operation. The natural flow regime has long been recognized as a master factor for ecosystem health and biodiversity. Restoration of the flow regime altered by multi-purpose reservoir operation is the main objective of this study. This study presents an optimization framework that modifies reservoir operation to seek a balance between human and environmental needs. The methodology is applied to the Feitsui Reservoir, located in northern Taiwan, whose main purpose is to provide a stable water supply, with the auxiliary purposes of electricity generation and flood-peak attenuation. Reservoir releases are governed by two decision variables: the duration of water releases each day and the percentage of the daily required release made within that duration. Under the current policy, the Feitsui Reservoir releases water for water-supply and hydropower purposes between 8:00 and 16:00 each day, with no environmental flow releases. Although greater power generation is obtained when 100% of releases are distributed within the 8-hour period, severe temporal alteration of streamflow is observed downstream of the reservoir. The modified operation relaxes these two decision variables and reserves a certain fraction of streamflow as environmental flow to maintain downstream natural variability. The optimal release policy is sought using a multi-criterion decision-making technique that considers reservoir performance in terms of shortage ratio and power generation, and downstream hydrologic alteration in terms of ecologically relevant indicators. The results show that the proposed methodology can mitigate hydro-peaking effects on natural variability while maintaining efficient reservoir operation.

  14. Methodological Reflections on the Contribution of Qualitative Research to the Evaluation of Clinical Ethics Support Services.

    PubMed

    Wäscher, Sebastian; Salloch, Sabine; Ritter, Peter; Vollmann, Jochen; Schildmann, Jan

    2017-05-01

    This article describes a process of developing, implementing and evaluating a clinical ethics support service intervention with the goal of building up a minimal, context-sensitive clinical ethics structure in an oncology department without a prior clinical ethics structure. Scholars from different disciplines have called for an improvement in the evaluation of clinical ethics support services (CESS) for different reasons over several decades. However, while much has been said about the concepts and methodological challenges of evaluating CESS, relatively few empirical studies have been carried out. The aim of this article is twofold. On the one hand, it describes a process of developing, modifying and evaluating a CESS intervention as part of the ETHICO research project, using the approach of qualitative-formative evaluation. On the other hand, it provides a methodological analysis which specifies the contribution of qualitative empirical methods to the (formative) evaluation of CESS. We conclude with a consideration of the strengths and limitations of qualitative evaluation research with regard to the evaluation and development of context-sensitive CESS. We further discuss our own approach in contrast to more traditional consult or committee models. © 2017 John Wiley & Sons Ltd.

  15. Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures

    NASA Technical Reports Server (NTRS)

    Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.

    1994-01-01

    A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the superstructure and the local structural response near the crack tips is accounted for by employing a hierarchical modeling strategy. The structural response for each cracked configuration is obtained using a geometrically nonlinear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology and its applicability to performing practical analyses of realistic structures is demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.

  16. Entropy-Based Performance Analysis of Jet Engines; Methodology and Application to a Generic Single-Spool Turbojet

    NASA Astrophysics Data System (ADS)

    Abbas, Mohammad

    A recently developed methodology that provides a direct assessment of the traditional thrust-based performance of aerospace vehicles in terms of entropy generation (i.e., exergy destruction) is modified for stand-alone jet engines. This methodology is applied to a specific single-spool turbojet engine configuration. A generic compressor performance map, along with modeled engine component performance characterizations, is utilized to provide comprehensive traditional engine performance results (engine thrust, mass capture, and RPM) for on- and off-design engine operation. Details of exergy losses in engine components, across the entire engine, and in the engine wake are provided, and the engine performance penalties associated with these losses are discussed. Results are provided across the engine operating envelope as defined by operational ranges of flight Mach number, altitude, and fuel throttle setting. The exergy destruction that occurs in the engine wake is shown to be dominant with respect to other losses, including all exergy losses that occur inside the engine. Specifically, the ratio of the exergy destruction rate in the wake to the exergy destruction rate inside the engine itself ranges from 1 to 2.5 across the operational envelope of the modeled engine.
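
    The entropy-based accounting described above rests on the Gouy-Stodola relation, which converts an entropy generation rate into a rate of lost work (exergy destruction) referenced to the ambient temperature T0. The decomposition notation below is ours, added to make the wake-to-engine ratio in the abstract explicit:

```latex
% Gouy-Stodola: exergy destroyed per unit time by irreversibility
\dot{X}_{\mathrm{dest}} = T_0\,\dot{S}_{\mathrm{gen}},
\qquad
\dot{X}_{\mathrm{dest}}^{\mathrm{total}}
  = \dot{X}_{\mathrm{dest}}^{\mathrm{engine}} + \dot{X}_{\mathrm{dest}}^{\mathrm{wake}},
\qquad
1 \;\lesssim\;
\frac{\dot{X}_{\mathrm{dest}}^{\mathrm{wake}}}{\dot{X}_{\mathrm{dest}}^{\mathrm{engine}}}
\;\lesssim\; 2.5 .
```

    The reported dominance of wake losses means the second term on the right is at least as large as the first everywhere in the operating envelope.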

  17. Detecting dryland degradation through the use of Time Series Segmentation and Residual Trend analysis (TSS-RESTREND)

    NASA Astrophysics Data System (ADS)

    Burrell, A. L.; Evans, J. P.; Liu, Y.

    2017-12-01

    Dryland degradation is an issue of international significance, as dryland regions play a substantial role in global food production. Remotely sensed data provide the only long-term, large-scale record of changes within dryland ecosystems. The Residual Trend (RESTREND) method is applied to satellite observations to detect dryland degradation. Whilst effective in most cases, the RESTREND method can fail to identify degraded pixels if the relationship between vegetation and precipitation has broken down as a result of severe or rapid degradation. This study presents an extended version of the RESTREND methodology that incorporates the Breaks For Additive Seasonal and Trend (BFAST) method to identify step changes in the time series that are related to significant structural changes in the ecosystem, e.g. land use changes. When applied to Australia, this new methodology, termed Time Series Segmentation and Residual Trend analysis (TSS-RESTREND), was able to detect degradation in 5.25% of pixels, compared to only 2.0% for RESTREND alone. The modified methodology was then assessed in two regions with known histories of degradation, where it was found to accurately capture both the timing and directionality of ecosystem change.
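
    The core RESTREND step can be sketched in a few lines: regress a vegetation index on precipitation, then test the residuals for a trend in time. The synthetic series below are invented for illustration (no noise, fabricated rainfall), and the full TSS-RESTREND method adds BFAST-style breakpoint detection on top of this step.

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def restrend(precip, vi):
    """Regress vegetation index on precipitation; return the trend
    (slope over time) of the residuals."""
    slope = ols_slope(precip, vi)
    intercept = sum(vi) / len(vi) - slope * (sum(precip) / len(precip))
    residuals = [v - (intercept + slope * p) for v, p in zip(vi, precip)]
    return ols_slope(list(range(len(vi))), residuals)

# 20 years of fabricated rainfall with no long-term trend
years = range(20)
precip = [320, 280, 350, 300, 270, 340, 310, 290, 330, 260] * 2
vi_stable   = [0.2 + 0.001 * p for p in precip]                    # healthy pixel
vi_degraded = [0.2 + 0.001 * p - 0.005 * t                         # degrading pixel
               for t, p in zip(years, precip)]
```

    The stable pixel yields a residual trend near zero, while the degrading pixel yields a clearly negative one, which is the signal RESTREND attributes to non-climatic change.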

  18. Dissemination of Periodontal Pathogens in the Bloodstream after Periodontal Procedures: A Systematic Review

    PubMed Central

    Horliana, Anna Carolina Ratto Tempestini; Chambrone, Leandro; Foz, Adriana Moura; Artese, Hilana Paula Carillo; Rabelo, Mariana de Sousa; Pannuti, Cláudio Mendes; Romito, Giuseppe Alexandre

    2014-01-01

    Background: To date, there is no compilation of evidence-based information associating bacteremia and periodontal procedures. This systematic review aims to assess the magnitude, duration, prevalence and nature of bacteremia caused by periodontal procedures. Study Design: Systematic review. Types of Studies Reviewed: The MEDLINE, EMBASE and LILACS databases were searched in duplicate through August 2013 without language restriction. Observational studies were included if blood samples were collected before, during or after periodontal procedures in patients with periodontitis. Methodological quality was assessed in duplicate using the modified Newcastle-Ottawa scale (NOS). Results: The search strategy identified 509 potentially eligible articles, of which nine were included. Only four studies demonstrated high methodological quality, whereas five were of medium or low methodological quality. The study characteristics were considered too heterogeneous to conduct a meta-analysis. Among 219 analyzed patients, 106 (49.4%) had positive bacteremia. The most frequent bacteria were S. viridans, A. actinomycetemcomitans, P. gingivalis, M. micros and species of Streptococcus and Actinomyces, although the identification methods of the microbiologic assays differed among studies. Clinical Implications: Although half of the patients presented positive bacteremia after periodontal procedures, accurate results regarding the magnitude, duration and nature of bacteremia could not be confidently assessed. PMID:24870125

  19. BETA: Behavioral testability analyzer and its application to high-level test generation and synthesis for testability. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chen, Chung-Hsing

    1992-01-01

    In this thesis, a behavioral-level testability analysis approach is presented. This approach is based on analyzing the circuit's behavioral description (similar to a C program) to estimate its testability by identifying controllable and observable circuit nodes. This information can be used by a test generator to gain better access to internal circuit nodes and to reduce its search space. The results of the testability analyzer can also be used to select test points or partial-scan flip-flops in the early design phase. Based on these selection criteria, a novel synthesis-for-testability approach called Test Statement Insertion (TSI) is proposed, which modifies the circuit's behavioral description directly. Test Statement Insertion can also be used to modify the circuit's structural description to improve its testability. As a result, the synthesis-for-testability methodology can be combined with an existing behavioral synthesis tool to produce more testable circuits.

  20. Computational analysis of Variable Thrust Engine (VTE) performance

    NASA Technical Reports Server (NTRS)

    Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.

    1993-01-01

    The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.

  1. Clustering approaches to feature change detection

    NASA Astrophysics Data System (ADS)

    G-Michael, Tesfaye; Gunzburger, Max; Peterson, Janet

    2018-05-01

    The automated detection of changes occurring between multi-temporal images is of significant importance in a wide range of medical, environmental, safety, and other settings. The use of k-means clustering is explored as a means of detecting objects added to a scene. The silhouette score for the clustering is used to define the optimal number of clusters. For simple images having a limited number of colors, new objects can be detected by examining the change in the optimal number of clusters between the original and modified images. For more complex images, new objects may need to be identified by examining the relative areas covered by corresponding clusters in the original and modified images. Which method is preferable depends on the composition and range of colors present in the images. In addition to describing the clustering and change detection methodology of our proposed approach, we provide some simple illustrations of its application.
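The cluster-count selection step described above can be sketched with scikit-learn; the toy "pixel" data and the `optimal_k` helper below are illustrative assumptions, not the authors' code.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def optimal_k(pixels, k_range=range(2, 8), seed=0):
    """Pick the number of clusters that maximizes the silhouette score."""
    best_k, best_score = None, -1.0
    for k in k_range:
        labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(pixels)
        score = silhouette_score(pixels, labels)
        if score > best_score:
            best_k, best_score = k, score
    return best_k

# Toy "pixels": two well-separated colors; the modified scene adds a third.
rng = np.random.default_rng(0)
original = np.vstack([rng.normal(0.0, 0.05, (50, 3)), rng.normal(1.0, 0.05, (50, 3))])
modified = np.vstack([original, rng.normal(0.5, 0.05, (50, 3))])
print(optimal_k(original), optimal_k(modified))  # 2 3
```

A rise in the optimal cluster count between the original and modified images then signals a newly added object, matching the simple-image case described in the abstract.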

  2. Recent Advances in Subunit Vaccine Carriers

    PubMed Central

    Vartak, Abhishek; Sucheck, Steven J.

    2016-01-01

    The lower immunogenicity of synthetic subunit antigens, compared to live attenuated vaccines, is being addressed with improved vaccine carriers. Recent reports indicate that the physicochemical properties of these carriers can be altered to achieve optimal antigen presentation, endosomal escape, particle bio-distribution, and cellular trafficking. The carriers can be modified with various antigens and with ligands for dendritic cell targeting. They can also be modified with adjuvants, either covalently or entrapped in the matrix, to improve cellular and humoral immune responses against the antigen. As a result, these multi-functional carrier systems are being explored for use in active immunotherapy against cancer and infectious diseases. Advancing technology, improved analytical methods, and the use of computational methodology have also contributed to the development of subunit vaccine carriers. This review details recent breakthroughs in the design of nano-particulate vaccine carriers, including liposomes, polymeric nanoparticles, and inorganic nanoparticles. PMID:27104575

  3. The forced-choice paradigm and the perception of facial expressions of emotion.

    PubMed

    Frank, M G; Stennett, J

    2001-01-01

    The view that certain facial expressions of emotion are universally agreed on has been challenged by studies showing that the forced-choice paradigm may have artificially forced agreement. This article addressed this methodological criticism by offering participants the opportunity to select a "none of these terms are correct" option from a list of emotion labels in a modified forced-choice paradigm. The results show that agreement on the emotion label for particular facial expressions is still greater than chance, that artifactual agreement on incorrect emotion labels is obviated, that participants select the "none" option when asked to judge a novel expression, and that adding 4 more emotion labels does not change the pattern of agreement reported in universality studies. Although the original forced-choice format may have been prone to artifactual agreement, the modified forced-choice format appears to remedy that problem.

  4. [Surgeons training: today as always?].

    PubMed

    Jesus, Lisieux Eyer de

    2009-12-01

    This paper discusses training methodologies for young surgeons in light of modern needs, their expectations, and the current reality of the surgical job market. Scientific and technological novelties, the huge amount of information imposed daily, managerial interventions and cost issues have radically modified the activities of surgeons, especially when compared with classical conceptions. Recent re-readings of the classical ethical postulates demand new behavior from doctors toward patients and society itself. Contemporary social culture brings with it individual expectations concerning quality of life and professional prospects. It is therefore necessary to modify surgical training methods to make them adequate to the need for continuous learning and adaptation to new technological instruments. Trainees should also adapt to social interactions with patients and other health professionals that fit present-day expectations. These structural adaptations are fundamental to maintaining the interest of new professionals in the field of surgery.

  5. Phenotype-Based Screening of Small Molecules to Modify Plant Cell Walls Using BY-2 Cells.

    PubMed

    Okubo-Kurihara, Emiko; Matsui, Minami

    2018-01-01

    The plant cell wall is an important and abundant biomass with great potential for use as a modern recyclable resource. For effective utilization of this cellulosic biomass, the ability to degrade it efficiently is a key point. With the aim of modifying the cell wall to allow easy decomposition, we used chemical biology technology to alter its structure. As a first step toward evaluating the effects of chemicals on the cell wall, we employed a phenotype-based high-throughput screening approach. As the plant cell wall is essential in determining cell morphology, phenotype-based screening is particularly effective in identifying compounds that bring about alterations in the cell wall. For rapid and reproducible screening, the tobacco BY-2 cell line is an excellent system in which to observe cell morphology. In this chapter, we provide a detailed chemical biology methodology for studying cell morphology using tobacco BY-2 cells.

  6. Optimized Reaction Conditions for Amide Bond Formation in DNA-Encoded Combinatorial Libraries.

    PubMed

    Li, Yizhou; Gabriele, Elena; Samain, Florent; Favalli, Nicholas; Sladojevich, Filippo; Scheuermann, Jörg; Neri, Dario

    2016-08-08

    DNA-encoded combinatorial libraries are increasingly being used as tools for the discovery of small organic binding molecules to proteins of biological or pharmaceutical interest. In the majority of cases, synthetic procedures for the formation of DNA-encoded combinatorial libraries incorporate at least one step of amide bond formation between amino-modified DNA and a carboxylic acid. We investigated reaction conditions and established a methodology by using 1-ethyl-3-(3-(dimethylamino)propyl)carbodiimide, 1-hydroxy-7-azabenzotriazole and N,N'-diisopropylethylamine (EDC/HOAt/DIPEA) in combination, which provided conversions greater than 75% for 423/543 (78%) of the carboxylic acids tested. These reaction conditions were efficient with a variety of primary and secondary amines, as well as with various types of amino-modified oligonucleotides. The reaction conditions, which also worked efficiently over a broad range of DNA concentrations and reaction scales, should facilitate the synthesis of novel DNA-encoded combinatorial libraries.

  7. DTREEv2, a computer-based support system for the risk assessment of genetically modified plants.

    PubMed

    Pertry, Ine; Nothegger, Clemens; Sweet, Jeremy; Kuiper, Harry; Davies, Howard; Iserentant, Dirk; Hull, Roger; Mezzetti, Bruno; Messens, Kathy; De Loose, Marc; de Oliveira, Dulce; Burssens, Sylvia; Gheysen, Godelieve; Tzotzos, George

    2014-03-25

    Risk assessment of genetically modified organisms (GMOs) remains a contentious area and a major factor influencing the adoption of agricultural biotech. Methodologically, in many countries, risk assessment is conducted by expert committees with little or no recourse to databases and expert systems that can facilitate the risk assessment process. In this paper we describe DTREEv2, a computer-based decision support system for the identification of hazards related to the introduction of GM-crops into the environment. DTREEv2 structures hazard identification and evaluation by means of an Event-Tree type of analysis. The system produces an output flagging identified hazards and potential risks. It is intended to be used for the preparation and evaluation of biosafety dossiers and, as such, its usefulness extends to researchers, risk assessors and regulators in government and industry. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. 'Mendelian randomization': an approach for exploring causal relations in epidemiology.

    PubMed

    Gupta, V; Walia, G K; Sachdeva, M P

    2017-04-01

    To assess the current status of the Mendelian randomization (MR) approach in effectively influencing observational epidemiology for examining causal relationships. Narrative review of studies related to the principle, strengths, limitations, and achievements of the MR approach. Observational epidemiological studies have repeatedly produced beneficial associations that were later discarded when tested in standard randomized controlled trials (RCTs). MR is a technique that is more feasible than RCTs, conceptually analogous to them, and has the potential to establish causal relationships between modifiable exposures and disease outcomes. The technique uses genetic variants related to modifiable traits/exposures as instruments for detecting causal and directional associations with outcomes. In the last decade, the MR approach has developed methodologically and progressed to a stage of high acceptance among epidemiologists, and it is gradually expanding the landscape of causal relationships in non-communicable chronic diseases. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
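The instrumental-variable logic behind MR can be illustrated with the single-instrument Wald ratio: the SNP-outcome effect divided by the SNP-exposure effect, with a first-order delta-method standard error. The summary statistics below are hypothetical, not taken from the review.

```python
import numpy as np

def wald_ratio(beta_gx, beta_gy, se_gx, se_gy):
    """Single-instrument Wald ratio estimate of the causal effect of an
    exposure X on an outcome Y, using a genetic variant G as instrument.
    beta_gx/beta_gy are the G->X and G->Y effects; SE via the delta method."""
    estimate = beta_gy / beta_gx
    se = np.sqrt(se_gy**2 / beta_gx**2
                 + (beta_gy**2 * se_gx**2) / beta_gx**4)
    return estimate, se

# Hypothetical GWAS summary statistics for one SNP
est, se = wald_ratio(beta_gx=0.30, beta_gy=0.06, se_gx=0.02, se_gy=0.01)
print(f"causal estimate={est:.3f}, se={se:.3f}")
```

Because the genotype is fixed at conception, the estimate is (under the usual instrument-validity assumptions) protected from the confounding and reverse causation that afflict ordinary observational associations.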

  9. Pencil graphite electrodes for improved electrochemical detection of oleuropein by the combination of Natural Deep Eutectic Solvents and graphene oxide.

    PubMed

    Gomez, Federico J V; Spisso, Adrian; Fernanda Silva, María

    2017-11-01

    A novel methodology is presented for the enhanced electrochemical detection of oleuropein in complex plant matrices by a Graphene Oxide Pencil Graphite Electrode (GOPGE) in combination with a buffer modified with a Natural Deep Eutectic Solvent containing 10% (v/v) of lactic acid, glucose and H2O (LGH). The electrochemical behavior of oleuropein in the modified working buffer was examined using differential pulse voltammetry. The combination of both modifications, NADES-modified buffer and nanomaterial-modified electrode (LGH-GOPGE), resulted in a signal enhancement 5.3 times that of the bare electrode with unmodified buffer. A calibration curve of oleuropein was constructed from 0.10 to 37 μM and good linearity was obtained, with a correlation coefficient of 0.989. Detection and quantification limits of the method were 30 and 102 nM, respectively. In addition, precision studies indicated that the voltammetric method was sufficiently repeatable, with %RSD of 0.01 and 3.16 (n = 5) for potential and intensity, respectively. Finally, the proposed electrochemical sensor was successfully applied to the determination of oleuropein in an olive leaf extract prepared by ultrasound-assisted extraction. The results obtained with the proposed electrochemical sensor were compared with Capillary Zone Electrophoresis analysis, with satisfactory results. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
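Figures of merit like those above (linearity, detection and quantification limits) are commonly derived from a calibration line using the 3.3σ/S and 10σ/S conventions. A minimal sketch with hypothetical calibration data, not the paper's measurements:

```python
import numpy as np

def calibration_limits(conc, signal):
    """Fit signal = slope*conc + intercept and derive LOD/LOQ from the
    residual standard deviation (3.3*sigma/S and 10*sigma/S conventions)."""
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = signal - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)          # sd about the regression line
    r = np.corrcoef(conc, signal)[0, 1]    # linearity check
    return slope, r, 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical calibration points (concentration in uM, peak current in nA)
conc = np.array([0.10, 1.0, 5.0, 10.0, 20.0, 37.0])
signal = np.array([0.9, 8.2, 41.5, 80.1, 162.0, 299.0])
slope, r, lod, loq = calibration_limits(conc, signal)
print(f"slope={slope:.2f} nA/uM, r={r:.4f}, LOD={lod:.3f} uM, LOQ={loq:.3f} uM")
```

By construction LOQ is roughly three times LOD, which is consistent with the 30 nM / 102 nM pair reported in the abstract.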

  10. Development of a nationwide consensus syllabus of palliative medicine for undergraduate medical education in Japan: a modified Delphi method.

    PubMed

    Kizawa, Yoshiyuki; Tsuneto, Satoru; Tamba, Kaichiro; Takamiya, Yusuke; Morita, Tatsuya; Bito, Seiji; Otaki, Junji

    2012-07-01

    There is currently no consensus syllabus of palliative medicine for undergraduate medical education in Japan, although the Cancer Control Act proposed in 2007 covers the dissemination of palliative care. To develop a nationwide consensus syllabus of palliative medicine for undergraduate medical education in Japan using a modified Delphi method. We adopted the following three-step method: (1) a workshop to produce the draft syllabus; (2) a survey-based provisional syllabus; (3) Delphi rounds and a panel meeting (modified Delphi method) to produce the working syllabus. Educators in charge of palliative medicine from 63% of the medical schools in Japan collaborated to develop a survey-based provisional syllabus before the Delphi rounds. A panel of 32 people was then formed for the modified Delphi rounds comprising 28 educators and experts in palliative medicine, one cancer survivor, one bereaved family member, and two medical students. The final consensus syllabus consists of 115 learning objectives across seven sections as follows: basic principles; disease process and comprehensive assessment; symptom management; psychosocial care; cultural, religious, and spiritual issues; ethical issues; and legal frameworks. Learning objectives were categorized as essential or desirable (essential: 66; desirable: 49). A consensus syllabus of palliative medicine for undergraduate medical education was developed using a clear and innovative methodology. The final consensus syllabus will be made available for further dissemination of palliative care education throughout the country.

  11. Analysis of total polyphenols in wines by FIA with highly stable amperometric detection using carbon nanotube-modified electrodes.

    PubMed

    Arribas, Alberto Sánchez; Martínez-Fernández, Marta; Moreno, Mónica; Bermejo, Esperanza; Zapardiel, Antonio; Chicharro, Manuel

    2013-02-15

    The use of glassy carbon electrodes (GCEs) modified with multi-walled carbon nanotube (CNT) films for the continuous monitoring of polyphenols in flow systems has been examined. The performance of these modified electrodes was evaluated and compared to bare GCE by cyclic voltammetry experiments and by flow injection analysis (FIA) with amperometric detection monitoring the response of gallic, caffeic, ferulic and p-coumaric acids in 0.050 M acetate buffer pH 4.5 containing 100 mM NaCl. The GCE modified with CNT dispersions in polyethyleneimine (PEI) provided lower overpotentials, higher sensitivity and much higher signal stability under a dynamic regime than bare GCEs. These properties allowed the estimation of the total polyphenol content in red and white wines with a remarkable long-term stability in the measurements despite the presence of potential fouling substances in the wine matrix. In addition, the versatility of the electrochemical methodology allowed the selective estimation of the easily oxidisable polyphenol fraction as well as the total polyphenol content just by tuning the detection potential at +0.30 or 0.70 V, respectively. The significance of the electrochemical results was demonstrated through correlation studies with the results obtained with conventional spectrophotometric assays for polyphenols (Folin-Ciocalteu, absorbance at 280 nm index and colour intensity index). Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Carbon nanostructured films modified by metal nanoparticles supported on filtering membranes for electroanalysis.

    PubMed

    Paramo, Erica; Palmero, Susana; Heras, Aranzazu; Colina, Alvaro

    2018-02-01

    A novel methodology to prepare sensors based on carbon nanostructure electrodes modified with metal nanoparticles is proposed. As a proof of concept, a novel bismuth nanoparticle/carbon nanofiber (Bi-NPs/CNF) electrode and a single-walled carbon nanotube/gold nanoparticle (Au-NPs/SWCNT) electrode have been developed. Bi-NPs/CNF films were prepared by (1) filtering a dispersion of CNFs on a polytetrafluoroethylene (PTFE) filter, and (2) filtering a dispersion of chemically synthesized Bi-NPs through this CNF/PTFE film. The electrode is then prepared by sticking the Bi-NPs/CNF/PTFE film onto a PET substrate. In this work, the Bi-NPs/CNF ratio was optimized using a Cd2+ solution as a probe sample. The Cd anodic stripping peak intensity, registered by differential pulse anodic stripping voltammetry (DPASV), was selected as the target signal. The voltammograms registered for Cd stripping with this Bi-NPs/CNF/PTFE electrode showed well-defined and highly reproducible electrochemical responses. The optimized Bi-NPs/CNF electrode exhibits a Cd2+ detection limit of 53.57 ppb. To demonstrate the utility and versatility of this methodology, single-walled carbon nanotubes (SWCNTs) and gold nanoparticles (Au-NPs) were selected to prepare a completely different electrode. Thus, the new Au-NPs/SWCNT/PTFE electrode was tested with a multiresponse technique: UV/Vis absorption spectroelectrochemistry experiments were carried out on dopamine, demonstrating the good performance of the Au-NPs/SWCNT electrode. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Drainage after Modified Radical Mastectomy – A Methodological Mini-Review

    PubMed Central

    Tsocheva, Dragostina; Marinova, Katerina; Dobrev, Emil; Nenkov, Rumen

    2017-01-01

    Breast cancer is a socially relevant group of malignant conditions of the mammary gland, affecting both males and females. Most commonly the surgical approach of choice is a modified radical mastectomy (MRM), because it allows both the removal of the main tumor mass and adjacent glandular tissue, which are suspected of infiltration and multifocality of the process, and a sentinel axillary lymph node removal. The most common post-surgical complications following MRM are the formation of a hematoma, infection of the surgical wound and the formation of a seroma. These post-surgical complications can, at least in part, be attributed to the drainage of the surgical wound. However, the lack of modern official guidelines provides ample scope for innovation, but also leads to a need for randomized comparison of the results. We compared different approaches to wound drainage after MRM, reviewed in terms of the armamentarium, number of drains, location, type of drainage system, timing of drain removal and no-drainage alternatives. Currently, based on the general results and on scientific and comparative discussions, the most affordable methodology with the best patient outcomes, with regard to hospital stay and post-operative complications, appears to be the placement of one medial-to-lateral (pectoro-axillary) drain with low negative pressure. Ideally, the drain should be removed on the second or third postoperative day, or when the amount of fluid drained in the preceding 24 hours falls below 50 milliliters. PMID:28929038

  14. Detection limits of the strip test and PCR for genetically modified corn in Brazil.

    PubMed

    Nascimento, V E; Von Pinho, É V R; Von Pinho, R G; do Nascimento, A D

    2012-08-16

    Brazilian legislation establishes a labeling limit for products that contain more than 1% material from genetically modified organisms (GMOs). We assessed the sensitivity of the lateral flow strip test in detection of the GMO corn varieties Bt11 and MON810 and the specificity and sensitivity of PCR techniques for their detection. For the strip test, the GMO seeds were mixed with conventional seeds at levels of 0.2, 0.4 and 0.8% for Bt11, and 0.4, 0.8 and 1.6% for MON810. Three different methodologies were assessed and whole seeds, their endosperm and embryonic axis were used. For the PCR technique, the GMO seeds of each of the two varieties were mixed with conventional seeds at levels of 20, 10, 5, 2, 1, and 0.5%. The seeds were ground and the DNA extracted. For detection of the GMO material, specific primers were used for MON810 and Bt11 and maize zein as an endogenous control. The sensitivity of the strip test varied for both maize varieties and methodologies. The test was positive for Bt11 only at 0.8%, in contrast with the detection limit of 0.4% indicated by the manufacturer. In the multiplex PCR, the primers proved to be specific for the different varieties. These varieties were detected in samples with one GMO seed in 100. Thus, this technique proved to be efficient in detecting contaminations equal to or greater than 1%.

  15. A new proposal for randomized start design to investigate disease-modifying therapies for Alzheimer disease.

    PubMed

    Zhang, Richard Y; Leon, Andrew C; Chuang-Stein, Christy; Romano, Steven J

    2011-02-01

    The increasing prevalence of Alzheimer disease (AD) and the lack of effective agents to attenuate progression have accelerated research and development of disease-modifying (DM) therapies. The traditional parallel group design and single time point analysis used in support of past AD drug approvals address symptomatic benefit over relatively short treatment durations. More recent trials investigating disease modification are by necessity longer in duration and require larger sample sizes. Nevertheless, trial design and analysis remain mostly unchanged and may not be adequate to meet the objective of demonstrating disease modification. Randomized start design (RSD) has been proposed as an option to study DM effects, but its application in AD trials may have been hampered by certain methodological challenges. To address the methodological issues that have impeded more extensive use of RSD in AD trials, and to encourage other researchers to develop novel design and analysis methodologies to better ascertain DM effects for the next generation of AD therapies, we propose a stepwise testing procedure to evaluate potential DM effects of novel AD therapies. The Alzheimer Disease Assessment Scale-Cognitive Subscale (ADAS-cog) is used for illustration. We propose to test three hypotheses in a stepwise sequence. The three tests pertain to the treatment difference at two separate time points and a difference in the rate of change. Estimation is facilitated by the Mixed-effects Model for Repeated Measures approach. The required sample size is estimated using Monte Carlo simulations and by modeling ADAS-cog data from prior longitudinal AD studies. The greatest advantage of the RSD proposed in this article is its ability to critically address the question of a DM effect. The AD trial using the new approach would be longer (12-month placebo period plus 12-month delayed-start period; total 24-month duration) and require more subjects (about 1000 subjects per arm for the non-inferiority margin chosen in the illustration). It would also require additional evaluations to estimate the rate of ADAS-cog change toward the end of the trial. A regulatory claim of disease modification for any compound will likely require additional verification of a drug's effect on a validated biomarker of Alzheimer's pathology. Incorporation of the RSD in AD trials is feasible. With proper trial setup and statistical procedures, this design could support the detection of a disease-modifying effect. In our opinion, a two-phase RSD with a stepwise hypothesis testing procedure could be a reasonable option for future studies.
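The Monte Carlo sample-size reasoning can be illustrated with a much-simplified power simulation for a difference in rates of change: a two-sample z-type test on subject-level slopes rather than the paper's MMRM analysis. All numeric values (slope difference, slope SD) are assumed for illustration.

```python
import numpy as np

def power_rate_difference(n_per_arm, slope_trt, slope_pbo, sd_slope,
                          n_sim=2000, seed=1):
    """Monte Carlo power for detecting a treatment difference in the rate
    of ADAS-cog change, via a two-sample z-type test on subject slopes
    (a simplified stand-in for an MMRM-based analysis)."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sim):
        trt = rng.normal(slope_trt, sd_slope, n_per_arm)   # treated arm slopes
        pbo = rng.normal(slope_pbo, sd_slope, n_per_arm)   # placebo/delayed-start arm
        se = np.sqrt(trt.var(ddof=1) / n_per_arm + pbo.var(ddof=1) / n_per_arm)
        if abs(trt.mean() - pbo.mean()) / se > 1.96:       # two-sided alpha = 0.05
            rejections += 1
    return rejections / n_sim

# Assumed: 0.5 ADAS-cog points/year slower decline, between-subject slope SD of 4
power = power_rate_difference(n_per_arm=1000, slope_trt=2.5, slope_pbo=3.0, sd_slope=4.0)
print(f"estimated power ≈ {power:.2f}")  # around 0.8 for these assumed values
```

Even this crude sketch reproduces the abstract's order of magnitude: detecting a modest slope difference against large between-subject variability pushes the design toward roughly 1000 subjects per arm.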

  16. Evaluation of a new modified QuEChERS method for the monitoring of carbamate residues in high-fat cheeses by using UHPLC-MS/MS.

    PubMed

    Hamed, Ahmed M; Moreno-González, David; Gámiz-Gracia, Laura; García-Campaña, Ana M

    2017-01-01

    A simple and efficient method for the determination of 28 carbamates in high-fat cheeses is proposed. The methodology is based on a modified quick, easy, cheap, effective, rugged, and safe procedure as sample treatment using a new sorbent (Z-Sep+) followed by ultra-high performance liquid chromatography with tandem mass spectrometry determination. The method has been validated in different kinds of cheese (Gorgonzola, Roquefort, and Camembert), achieving recoveries of 70-115%, relative standard deviations lower than 13% and limits of quantification lower than 5.4 μg/kg, below the maximum residue levels tolerated for these compounds by the European legislation. The matrix effect was lower than ±30% for all the studied pesticides. The combination of ultra-high performance liquid chromatography and tandem mass spectrometry with this modified quick, easy, cheap, effective, rugged, and safe procedure using Z-Sep+ allowed a high sample throughput and an efficient cleaning of extracts for the control of these residues in cheeses with a high fat content. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
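Recovery and relative standard deviation figures like those in the validation above are straightforward to compute from spiked replicate measurements. A sketch with hypothetical replicate data, not the paper's:

```python
import numpy as np

def recovery_and_rsd(measured, spiked):
    """Percent recovery and relative standard deviation (RSD, %)
    for replicate determinations of a spiked sample."""
    measured = np.asarray(measured, dtype=float)
    recovery = 100.0 * measured.mean() / spiked
    rsd = 100.0 * measured.std(ddof=1) / measured.mean()
    return recovery, rsd

# Hypothetical replicate results for a 10 ug/kg carbamate spike in cheese
rec, rsd = recovery_and_rsd([8.9, 9.4, 9.1, 9.7, 9.0], spiked=10.0)
print(f"recovery={rec:.1f}%, RSD={rsd:.1f}%")
```

These hypothetical replicates would pass the acceptance windows quoted in the abstract (recovery 70-115%, RSD below 13%).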

  17. Tripodal penta(p-phenylene) for the biofunctionalization of alkynyl-modified silicon surfaces

    NASA Astrophysics Data System (ADS)

    Sánchez-Molina, María; Díaz, Amelia; Valpuesta, María; Contreras-Cáceres, Rafael; López-Romero, J. Manuel; López-Ramírez, M. Rosa

    2018-07-01

    Here we report the optimization of the covalent grafting methodology of a tripod-shaped penta(p-phenylene), 1, on alkynyl-terminated silicon surfaces, and the incorporation of an active theophylline derivative, 2, for the specific immobilization of proteins. The tripodal molecule presents azide-terminal groups to be attached onto a silicon surface containing an alkynyl monolayer. Initially, compound 1 was covalently incorporated on alkynyl-terminated Si wafers by the copper-catalyzed alkyne-azide 1,3-dipolar cycloaddition (CuAAC, a click reaction). The tripod density on the silicon surface is tuned by performing the CuAAC reaction at different concentrations of 1, as well as under different experimental conditions (temperature, base, copper source, shaking). The tripod 1-modified surface was then biofunctionalized with 2. The effective preparation of this modified silicon surface allowed us to study streptavidin immobilization on the surface. Characterization of the different surfaces was carried out by X-ray Photoelectron Spectroscopy (XPS), Atomic Force Microscopy (AFM) and Bright-Field Optical Transmission (Confocal) Microscopy techniques. We also include density functional theory (DFT) analysis of the organic structures to confirm the height profile and the tripod-surface relative configuration extracted from AFM images.

  18. Genetic Variation as a Modifier of Association between Therapeutic Exposure and Subsequent Malignant Neoplasms in Cancer Survivors

    PubMed Central

    Bhatia, Smita

    2014-01-01

    Subsequent malignant neoplasms (SMNs) are associated with significant morbidity and are a major cause of premature mortality among cancer survivors. Several large studies have demonstrated a strong association between the radiation and/or chemotherapy used to treat the primary cancer and the risk of developing SMNs. However, for any given therapeutic exposure, the risk of developing an SMN varies between individuals. Genomic variation can potentially modify the association between therapeutic exposures and SMN risk, and can possibly explain the observed inter-individual variability. This article provides a brief overview of the current knowledge regarding the role of genomic variation in the development of therapy-related SMNs. This article also discusses the methodological challenges in undertaking an endeavor to develop a deeper understanding of the molecular underpinnings of therapy-related SMNs, such as, an appropriate study design, identification of an adequately sized study population together with a reliable plan for collecting and maintaining high quality DNA, clinical validation of the phenotype, and selection of an appropriate approach or platform for genotyping. Understanding the modifiers of risk of treatment-related SMNs is critical to developing targeted intervention strategies and optimizing risk-based health care of cancer survivors. PMID:25355167

  19. The Use of the Ex Vivo Chandler Loop Apparatus to Assess the Biocompatibility of Modified Polymeric Blood Conduits

    PubMed Central

    Slee, Joshua B.; Alferiev, Ivan S.; Levy, Robert J.; Stachelek, Stanley J.

    2014-01-01

    The foreign body reaction occurs when a synthetic surface is introduced to the body. It is characterized by adsorption of blood proteins and the subsequent attachment and activation of platelets, monocyte/macrophage adhesion, and inflammatory cell signaling events, leading to post-procedural complications. The Chandler Loop Apparatus is an experimental system that allows researchers to study the molecular and cellular interactions that occur when large volumes of blood are perfused over polymeric conduits. To that end, this apparatus has been used as an ex vivo model allowing the assessment of the anti-inflammatory properties of various polymer surface modifications. Our laboratory has shown that blood conduits, covalently modified via photoactivation chemistry with recombinant CD47, can confer biocompatibility to polymeric surfaces. Appending CD47 to polymeric surfaces could be an effective means to promote the efficacy of polymeric blood conduits. Herein is the methodology detailing the photoactivation chemistry used to append recombinant CD47 to clinically relevant polymeric blood conduits and the use of the Chandler Loop as an ex vivo experimental model to examine blood interactions with the CD47 modified and control conduits. PMID:25178087

  20. Modifiable Midlife Risk Factors for Late-Life Cognitive Impairment and Dementia

    PubMed Central

    Hughes, Tiffany F.; Ganguli, Mary

    2009-01-01

    The baby boom generation is approaching the age of greatest risk for cognitive impairment and dementia. There is growing interest in strategies to modify the environment in midlife to increase the probability of maintaining cognitive health in late life. Several potentially modifiable risk factors have been studied in relation to cognitive impairment and dementia in late life, but methodological limitations of observational research have resulted in some inconsistencies across studies. The most promising strategies are maintaining cardiovascular health, engagement in mental, physical, and social activities, using alcohol in moderation, abstaining from tobacco use, and following a heart-healthy diet. Other factors that may influence cognitive health are occupational attainment, depression, personality, exposure to general anesthesia, head injury, postmenopausal hormone therapy, non-steroidal anti-inflammatory medications, and nutritional supplements such as antioxidants. Some long-term observational studies initiated in midlife or earlier, and some randomized controlled trials, have examined the effects of specific cognitive health promotion behaviors in midlife on the risk of cognitive impairment in late life. Overall, these studies provide limited support for risk reduction at this time. Recommendations and challenges for developing effective strategies to reduce the burden of cognitive impairment and dementia in the future are discussed. PMID:19946443

  1. Biodiesel production using lipase immobilized on epoxychloropropane-modified Fe3O4 sub-microspheres.

    PubMed

    Zhang, Qian; Zheng, Zhong; Liu, Changxia; Liu, Chunqiao; Tan, Tianwei

    2016-04-01

    Superparamagnetic Fe3O4 sub-microspheres with diameters of approximately 200 nm were prepared via a solvothermal method, and then modified with epoxychloropropane. Lipase was immobilized on the modified sub-microspheres. The immobilized lipase was used in the production of biodiesel fatty acid methyl esters (FAMEs) from acidified waste cooking oil (AWCO). The effects of the reaction conditions on the biodiesel yield were investigated using a combination of response surface methodology and three-level/three-factor Box-Behnken design (BBD). The optimum synthetic conditions, which were identified using Ridge max analysis, were as follows: immobilized lipase:AWCO mass ratio 0.02:1, fatty acid:methanol molar ratio 1:1.10, hexane:AWCO ratio 1.33:1 (mL/g), and temperature 40 °C. A 97.11% yield was obtained under these conditions. The BBD and experimental data showed that the immobilized lipase could generate biodiesel over a wide temperature range, from 0 to 40 °C. Consistently high FAME yields, in excess of 80%, were obtained when the immobilized lipase was reused in six replicate trials at 10 and 20 °C. Copyright © 2016 Elsevier B.V. All rights reserved.
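Response surface methodology over a Box-Behnken design amounts to fitting a full second-order model to coded factor levels and locating its optimum. A minimal sketch with a hypothetical three-factor design and made-up yields (not the paper's data):

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Least-squares fit of a full second-order (RSM) model in three
    coded factors: intercept, linear, two-way interaction, and pure
    quadratic terms (10 coefficients in total)."""
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    A = np.column_stack([np.ones_like(x1), x1, x2, x3,
                         x1 * x2, x1 * x3, x2 * x3,
                         x1**2, x2**2, x3**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Three-factor Box-Behnken design in coded levels (-1, 0, +1):
# 12 edge-midpoint runs plus 3 center replicates
bbd = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
                [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
                [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
                [0, 0, 0], [0, 0, 0], [0, 0, 0]])
# Hypothetical biodiesel yields (%) for illustration only
y = np.array([82, 88, 85, 93, 80, 90, 84, 95, 83, 87, 86, 92, 96, 97, 96], dtype=float)
coef = fit_quadratic_surface(bbd, y)
print(f"fitted center response = {coef[0]:.2f}%")  # the intercept = predicted yield at (0,0,0)
```

With negative pure-quadratic coefficients, the surface is concave near the center and the stationary point can be solved from the linear system given by the fitted gradient (this is what Ridge max analysis automates in SAS).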

  2. Enhanced capacity and stability for the separation of cesium in electrically switched ion exchange

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tawfic, A.F.; Dickson, S.E.; Kim, Y.

    2015-03-15

    Electrically switched ion exchange (ESIX) can be used to separate ionic contaminants from industrial wastewater, including that generated by the nuclear industry. The ESIX method involves the sequential application of reduction and oxidation potentials to an ion exchange film to induce the loading and unloading, respectively, of cesium. This technology is superior to conventional methods (e.g., electrodialysis reversal or reverse osmosis) because it requires very little energy for ionic separation. In previous studies, ESIX films have demonstrated relatively low ion exchange capacities and limited film stability over repeated potential applications. In this study, the methodology for the deposition of electroactive films (nickel hexacyanoferrate) on nickel electrodes was modified to improve the ion exchange capacity for cesium removal using ESIX. Cyclic voltammetry was used to investigate the ion exchange capacity and stability. Scanning electron microscopy (SEM) was used to characterize the modified film surfaces. Additionally, the films were examined for the separation of cesium ions. The modified film preparation technique enhanced the ion exchange capacity and improved the film stability compared with previous methods for the deposition of ESIX films. (authors)
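The cyclic voltammetry capacity measurement rests on a simple relation: the charge exchanged by the film is the area under the current-potential curve divided by the scan rate. A minimal sketch with a synthetic voltammogram; the peak shape, magnitude, and scan rate are assumptions for illustration, not data from the study:

```python
import numpy as np

# Charge passed during loading = integral of I dV divided by the scan rate
# (dV/dt), which converts the I-V area into an I-t integral in coulombs.
scan_rate = 0.05                               # V/s, assumed
V = np.linspace(0.0, 0.6, 601)                 # potential sweep, V
I = 1e-3 * np.exp(-((V - 0.3) ** 2) / 0.005)   # current, A: one synthetic peak
# Trapezoidal integration of I dV, then divide by the scan rate
Q = float(np.sum((I[1:] + I[:-1]) * np.diff(V) / 2) / scan_rate)
print(f"charge passed: {Q * 1000:.2f} mC")
```

Dividing that charge by the film mass (and Faraday's constant, for moles) gives the ion exchange capacity tracked across repeated cycles to assess stability.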

  3. Testing the interval-level measurement property of multi-item visual analogue scales.

    PubMed

    Krabbe, Paul F M; Stalmeier, Peep F M; Lamers, Leida M; Busschbach, Jan J V

    2006-12-01

    Conditions were studied that may invalidate health-state values derived from the visual analogue scale (VAS). Respondents were asked to place cards describing EQ-5D health states on a 20 cm EuroQol VAS and on modified versions of it, positioning them such that the distances between the states reflected their valuations of those states. Anchor-point bias was examined using the standard EuroQol VAS (n = 212) and a modified version (n = 97) with a different lower anchor. Context bias was examined in another group of respondents (n = 112) who valued three different sets of EQ-5D health states. Marker bias was studied in yet another group of respondents (n = 100) who placed the same EQ-5D states on the standard EuroQol VAS and on a modified VAS without anchors, categories, or measurement markers. No indication of anchor-point or marker bias was found. However, the VAS valuations were significantly affected by the context of the set of health states in the scaling task. Advanced methodologies should be incorporated in VAS valuation studies to deal with this context bias.

  4. Development of a BALB/c 3T3 neutral red uptake cytotoxicity test using a mainstream cigarette smoke exposure system

    PubMed Central

    2014-01-01

    Background Tobacco smoke toxicity has traditionally been assessed using the particulate fraction under submerged culture conditions, which omits the vapour phase elements from any subsequent analysis. Therefore, methodologies that assess the full interactions and complexities of tobacco smoke are required. Here we describe the adaptation of a modified BALB/c 3T3 neutral red uptake (NRU) cytotoxicity test methodology, which is based on the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) protocol for in vitro acute toxicity testing. The methodology described takes into account the synergies of both the particulate and vapour phases of tobacco smoke. This is of particular importance as both phases have been independently shown to induce in vitro cellular cytotoxicity. Findings The findings from this study indicate that mainstream tobacco smoke and the gas vapour phase (GVP), generated using the Vitrocell® VC 10 smoke exposure system, have distinct and significantly different toxicity profiles. Within the system tested, mainstream tobacco smoke produced a dilution IC50 (the dilution (L/min) at which 50% cytotoxicity is observed) of 6.02 L/min, whereas the GVP produced a dilution IC50 of 3.20 L/min. In addition, we also demonstrated significant dose-for-dose differences between mainstream cigarette smoke and the GVP fraction (P < 0.05). This demonstrates the importance of testing the entire tobacco smoke aerosol and not just the particulate fraction, as has been the historical preference. Conclusions We have adapted the NRU methodology based on the ICCVAM protocol to capture the full interactions and complexities of tobacco smoke. This methodology could also be used to assess the performance of traditional cigarettes, blend and filter technologies, tobacco smoke fractions and individual test aerosols. PMID:24935030
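A dilution IC50 like those reported above can be estimated by interpolating measured viability against the applied dilution. A minimal sketch, assuming viability increases monotonically with dilution (more diluting air means less smoke reaching the cells); the data points are hypothetical, not the study's measurements:

```python
import numpy as np

def dilution_ic50(dilutions, viability):
    """Interpolate the dilution (L/min) at which viability crosses 50%.
    Assumes viability increases with increasing dilution."""
    d = np.asarray(dilutions, dtype=float)
    v = np.asarray(viability, dtype=float)
    order = np.argsort(v)  # np.interp requires increasing x-coordinates
    return float(np.interp(50.0, v[order], d[order]))

# Hypothetical NRU viability (% of air control) at several smoke dilutions
ic50 = dilution_ic50([1, 2, 4, 6, 8, 12], [5, 15, 35, 50, 70, 90])
print(ic50)
```

With this convention a larger IC50 dilution means greater toxicity (more dilution is needed to reach 50% viability), which is why mainstream smoke (6.02 L/min) reads as more cytotoxic than the GVP (3.20 L/min) in the record above.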

  5. Improving wait times to care for individuals with multimorbidities and complex conditions using value stream mapping.

    PubMed

    Sampalli, Tara; Desy, Michel; Dhir, Minakshi; Edwards, Lynn; Dickson, Robert; Blackmore, Gail

    2015-04-05

    Recognizing the significant impact of wait times for care for individuals with complex chronic conditions, we applied a LEAN methodology, namely an adaptation of Value Stream Mapping (VSM), to meet the needs of people with multiple chronic conditions and to improve wait times without additional resources or funding. Over an 18-month period, staff applied a patient-centric approach that included the LEAN methodology of VSM to improve wait times to care. Our framework of evaluation was grounded in the needs and perspectives of patients and individuals waiting to receive care. Patient-centric views were obtained through surveys such as the Patient Assessment of Chronic Illness Care (PACIC) and process-engineering questions. In addition, VSM was used to identify non-value-added processes contributing to wait times. The care team successfully reduced wait times to 2 months in 2014, with no wait times for care anticipated in 2015. Increased patient engagement and satisfaction are also outcomes of this innovative initiative. In addition, successful transformations and implementation have resulted in resource efficiencies without an increase in costs. Patients have shown significant improvements in functional health following the Integrated Chronic Care Service (ICCS) intervention. The methodology will be applied to other chronic disease management areas in Capital Health and the province. Wait times to care in the management of multimorbidities and other complex conditions can place a significant burden not only on the affected individuals but also on the healthcare system. In this study, a novel, modified LEAN methodology has been applied to embed the voice of the patient in care delivery processes and to reduce wait times in the management of complex chronic conditions. © 2015 by Kerman University of Medical Sciences.

  6. A methodology for the design of experiments in computational intelligence with multiple regression models.

    PubMed

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on the correct comparison of the results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models for carrying out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results differ for three out of five state-of-the-art simple datasets, and the selection of the best model according to our proposal is statistically significant and relevant. It is important to use a statistical approach to indicate whether the differences are statistically significant with these kinds of algorithms. Furthermore, our results with three real, complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
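The heart of such a comparison methodology, scoring several models on shared cross-validation folds and then testing whether their per-fold differences are statistically significant, can be sketched without the RRegrs package itself. A minimal Python sketch using a paired permutation test on synthetic data; the two models, the dataset, and the choice of test are illustrative assumptions, not the paper's pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def kfold_rmse(model_fit, X, y, k=5):
    """Per-fold RMSE under k-fold cross-validation (same split every call,
    so scores from different models are paired fold by fold)."""
    idx = np.arange(len(y))
    np.random.default_rng(1).shuffle(idx)
    scores = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        pred = model_fit(X[train], y[train], X[fold])
        scores.append(np.sqrt(np.mean((y[fold] - pred) ** 2)))
    return np.array(scores)

def linreg(Xtr, ytr, Xte):
    A = np.column_stack([np.ones(len(Xtr)), Xtr])
    beta, *_ = np.linalg.lstsq(A, ytr, rcond=None)
    return np.column_stack([np.ones(len(Xte)), Xte]) @ beta

def mean_baseline(Xtr, ytr, Xte):
    return np.full(len(Xte), ytr.mean())

def paired_permutation_p(a, b, n=10000):
    """Two-sided permutation test on paired per-fold score differences."""
    d = a - b
    obs = abs(d.mean())
    signs = rng.choice([-1, 1], size=(n, len(d)))
    return float(np.mean(np.abs((signs * d).mean(axis=1)) >= obs))

# Synthetic dataset with a genuine linear signal
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.3, size=200)
a = kfold_rmse(linreg, X, y)
b = kfold_rmse(mean_baseline, X, y)
print(a.mean(), b.mean(), paired_permutation_p(a, b))
```

A permutation test is used here only because it is self-contained; the published methodology may rely on different statistical machinery.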

  7. Using scenarios and personas to enhance the effectiveness of heuristic usability evaluations for older adults and their care team.

    PubMed

    Kneale, Laura; Mikles, Sean; Choi, Yong K; Thompson, Hilaire; Demiris, George

    2017-09-01

    Using heuristics to evaluate user experience is a common methodology for human-computer interaction studies. One challenge of this method is the inability to tailor results towards specific end-user needs. This manuscript reports on a method that uses validated scenarios and personas of older adults and care team members to enhance heuristics evaluations of the usability of commercially available personal health records for homebound older adults. Our work extends the Chisnell and Redish heuristic evaluation methodology by using a protocol that relies on multiple expert reviews of each system. It further standardizes the heuristic evaluation process through the incorporation of task-based scenarios. We were able to use the modified version of the Chisnell and Redish heuristic evaluation methodology to identify potential usability challenges of two commercially available personal health record systems. This allowed us to: (1) identify potential usability challenges for specific types of users, (2) describe improvements that would be valuable to all end-users of the system, and (3) better understand how the interactions of different users may vary within a single personal health record. The methodology described in this paper may help designers of consumer health information technology tools, such as personal health records, understand the needs of diverse end-user populations. Such methods may be particularly helpful when designing systems for populations that are difficult to recruit for end-user evaluations through traditional methods. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. A methodology for the design of experiments in computational intelligence with multiple regression models

    PubMed Central

    Gestal, Marcos; Munteanu, Cristian R.; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on the correct comparison of the results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models for carrying out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results differ for three out of five state-of-the-art simple datasets, and the selection of the best model according to our proposal is statistically significant and relevant. It is important to use a statistical approach to indicate whether the differences are statistically significant with these kinds of algorithms. Furthermore, our results with three real, complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable. PMID:27920952

  9. Flight simulator fidelity assessment in a rotorcraft lateral translation maneuver

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Malsbury, T.; Atencio, A., Jr.

    1992-01-01

    A model-based methodology for assessing flight simulator fidelity in closed-loop fashion is exercised in analyzing a rotorcraft low-altitude maneuver for which flight test and simulation results were available. The addition of a handling qualities sensitivity function to a previously developed model-based assessment criteria allows an analytical comparison of both performance and handling qualities between simulation and flight test. Model predictions regarding the existence of simulator fidelity problems are corroborated by experiment. The modeling approach is used to assess analytically the effects of modifying simulator characteristics on simulator fidelity.

  10. Developing CORBA-Based Distributed Scientific Applications From Legacy Fortran Programs

    NASA Technical Reports Server (NTRS)

    Sang, Janche; Kim, Chan; Lopez, Isaac

    2000-01-01

    An efficient methodology is presented for integrating legacy applications written in Fortran into a distributed object framework. Issues and strategies regarding the conversion and decomposition of Fortran codes into Common Object Request Broker Architecture (CORBA) objects are discussed. The Fortran codes are modified as little as possible as they are decomposed into modules and wrapped as objects. A new conversion tool takes the Fortran application as input and generates the C/C++ header file and Interface Definition Language (IDL) file. In addition, the performance of the client/server computing is evaluated.

  11. Stable adaptive neurocontrollers for spacecraft and space robots

    NASA Technical Reports Server (NTRS)

    Sanner, Robert M.

    1995-01-01

    This paper reviews recently developed techniques of adaptive nonlinear control using neural networks, and demonstrates their application to two important practical problems in orbital operations. An adaptive neurocontroller is first developed for spacecraft attitude control applications, and then the same design, slightly modified, is shown to be effective in the control of free-floating orbital manipulators. The algorithms discussed have guaranteed stability and convergence properties, and thus constitute viable alternatives to existing control methodologies. Simulation results are presented demonstrating the performance of each algorithm with representative dynamic models.

  12. Space Station communications system design and analysis

    NASA Technical Reports Server (NTRS)

    Ratliff, J. E.

    1986-01-01

    Attention is given to the methodologies currently being used as the framework within which the NASA Space Station's communications system is to be designed and analyzed. A key aspect of the CAD/analysis system being employed is its potential growth in size and capabilities, since Space Station design requirements will continue to be defined and modified. The Space Station is expected to furnish communications between itself and astronauts on EVA, Orbital Maneuvering Vehicles, Orbital Transfer Vehicles, Space Shuttle orbiters, free-flying spacecraft, coorbiting platforms, and the Space Shuttle's own Mobile Service Center.

  13. Software For Fault-Tree Diagnosis Of A System

    NASA Technical Reports Server (NTRS)

    Iverson, Dave; Patterson-Hine, Ann; Liao, Jack

    1993-01-01

    The Fault Tree Diagnosis System (FTDS) computer program is an automated diagnostic program that identifies likely causes of a specified failure on the basis of information represented in system-reliability mathematical models known as fault trees. It is a modified implementation of the failure-cause-identification phase of Narayanan's and Viswanadham's methodology for the acquisition of knowledge and reasoning in analyzing system failures. The knowledge base of if/then rules is replaced with an object-oriented fault-tree representation. This enhancement yields more efficient identification of the causes of failures and enables dynamic updating of the knowledge base. Written in C, C++, and Common LISP.
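An object-oriented fault-tree representation of the kind described above can be sketched as AND/OR gate objects that both evaluate whether the top event occurs and collect the basic events that caused it. A minimal sketch in Python rather than FTDS's C/C++/LISP; the gate types and the example tree are invented for illustration:

```python
class Event:
    """A basic event (leaf) with a known failed/ok state."""
    def __init__(self, name, failed=False):
        self.name, self.failed = name, failed
    def occurs(self):
        return self.failed
    def causes(self):
        return [self.name] if self.failed else []

class Gate:
    """An AND/OR gate over child events or sub-gates."""
    def __init__(self, name, kind, children):
        self.name, self.kind, self.children = name, kind, children
    def occurs(self):
        states = [c.occurs() for c in self.children]
        return all(states) if self.kind == "AND" else any(states)
    def causes(self):
        """Basic events contributing to this gate's occurrence."""
        if not self.occurs():
            return []
        out = []
        for c in self.children:
            out += c.causes()
        return out

pump = Event("pump failure", failed=True)
valve = Event("valve stuck", failed=False)
power = Event("power loss", failed=True)
top = Gate("system down", "OR",
           [Gate("flow loss", "AND", [pump, valve]), power])
print(top.occurs(), top.causes())  # the AND branch does not fire; power loss does
```

Because the tree is a live object graph rather than a static rule base, a leaf's state (or the tree's structure) can be updated at runtime and the diagnosis simply re-evaluated, which mirrors the dynamic-updating advantage the record describes.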

  14. Accurate Energy Transaction Allocation using Path Integration and Interpolation

    NASA Astrophysics Data System (ADS)

    Bhide, Mandar Mohan

    This thesis investigates many of the popular cost allocation methods that are based on actual usage of the transmission network. The Energy Transaction Allocation (ETA) method originally proposed by A. Fradi, S. Brigonne and B. Wollenberg, which offers the unique advantage of accurately allocating transmission network usage, is discussed subsequently. A modified calculation of ETA based on a simple interpolation technique is then proposed. The proposed methodology not only increases the accuracy of the calculation but also reduces the number of calculations to less than half of that required by the original ETA method.

  15. [Approaching the family: presenting a new definition of family and care].

    PubMed

    Wernet, Monika; Angelo, Margareth

    2003-03-01

    Identifying the need to understand how nurses 'move' toward the family with the intention of caring for them, the present study collected twelve biographical narratives from pediatric nurses. The theoretical framework adopted was Symbolic Interactionism, and the methodological framework was Interpretive Interactionism. Analysis of the data allowed the reconstruction of a story showing that modifying the concepts of caring and family is the process that progressively brings pediatric nurses closer to the family, toward being with them, and gives them the freedom to act.

  16. Analysis of Extracellular Vesicles in the Tumor Microenvironment.

    PubMed

    Al-Nedawi, Khalid; Read, Jolene

    2016-01-01

    Extracellular vesicles (ECV) are membrane compartments shed from all types of cells in various physiological and pathological states. In recent years, ECV have gained increasing interest from the scientific community for their role as intercellular communicators that play important roles in modifying the tumor microenvironment. Multiple techniques have been established to collect ECV from conditioned cell culture media or physiological fluids. The gold-standard methodology is differential centrifugation. Although alternative techniques exist to collect ECV, these have not proven suitable as substitutes for the ultracentrifugation procedure.

  17. Quality and management of wastewater in sugar industry

    NASA Astrophysics Data System (ADS)

    Poddar, Pradeep Kumar; Sahu, Omprakash

    2017-03-01

    Wastewater from sugar industries has complex characteristics and is considered a challenge for environmental engineers in terms of both treatment and utilization. Before treatment and recycling, determination of the physicochemical parameters is an essential step. Many techniques have been introduced and modified for this purpose, but their applicability depends on the water quality parameters. The main aim of this study is to determine the physicochemical characteristics of sugar industry wastewater by standard methods and to minimize fresh water consumption in the sugar industry using the water pinch methodology.

  18. NASA LeRC/Akron University Graduate Cooperative Fellowship Program and Graduate Student Researchers Program

    NASA Technical Reports Server (NTRS)

    Fertis, D. G.; Simon, A. L.

    1981-01-01

    The requisite methodology is developed to solve linear and nonlinear problems associated with the static and dynamic analysis of rotating machinery, their static and dynamic behavior, and the interaction between the rotating and nonrotating parts of an engine. Linear and nonlinear structural engine problems are investigated by developing solution strategies and interactive computational methods whereby the user and computer can communicate directly in making analysis decisions. Representative examples include modifying structural models, changing material parameters, selecting analysis options, and coupling with an interactive graphical display for pre- and postprocessing capability.

  19. Multimodal hybrid reasoning methodology for personalized wellbeing services.

    PubMed

    Ali, Rahman; Afzal, Muhammad; Hussain, Maqbool; Ali, Maqbool; Siddiqi, Muhammad Hameed; Lee, Sungyoung; Ho Kang, Byeong

    2016-02-01

    A wellness system provides wellbeing recommendations to support experts in promoting a healthier lifestyle and inducing individuals to adopt healthy habits. Adopting physical activity effectively promotes a healthier lifestyle. A physical activity recommendation system assists users in adopting daily routines that form best practices of life by involving themselves in healthy physical activities. Traditional physical activity recommendation systems focus on general recommendations applicable to a community of users rather than to specific individuals. These recommendations are general in nature and fit the community at a certain level, but they are not relevant to every individual's specific requirements and personal interests. To cover this aspect, we propose a multimodal hybrid reasoning methodology (HRM) that generates personalized physical activity recommendations according to the user's specific needs and personal interests. The methodology integrates rule-based reasoning (RBR), case-based reasoning (CBR), and preference-based reasoning (PBR) in a linear combination that enables personalization of the recommendations. RBR uses explicit knowledge rules from physical activity guidelines, CBR uses implicit knowledge from experts' past experiences, and PBR uses users' personal interests and preferences. To validate the methodology, a weight management scenario is considered and experimented with. The RBR part of the methodology generates goal, weight status, and plan recommendations; the CBR part suggests the top three relevant physical activities for executing the recommended plan; and the PBR part filters out irrelevant recommendations using the user's personal preferences and interests. To evaluate the methodology, a baseline RBR system was developed and improved, first using ranged rules and ultimately using a hybrid CBR. A comparison of the results of these systems shows that the hybrid CBR outperforms the modified RBR and baseline RBR systems, yielding a recall of 0.94, a precision of 0.97, an F-score of 0.95, and low Type I and Type II errors. Copyright © 2015 Elsevier Ltd. All rights reserved.
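The linear RBR, CBR, and PBR combination can be sketched as a three-stage pipeline: rules pick a goal, the matching case supplies the top activities, and preferences filter the result. All rules, cases, and preferences below are invented placeholders for illustration, not the system's actual knowledge base:

```python
def rbr_goal(bmi):
    """Rule-based reasoning: map BMI to a weight-management goal
    (thresholds follow common BMI guideline cutoffs)."""
    if bmi >= 25:
        return "lose"
    if bmi < 18.5:
        return "gain"
    return "maintain"

# Case-based reasoning: past expert cases as (goal, recommended activities)
CASES = [
    ("lose", ["brisk walking", "cycling", "swimming", "jogging"]),
    ("gain", ["resistance training", "yoga"]),
    ("maintain", ["walking", "stretching", "cycling"]),
]

def cbr_activities(goal, top_k=3):
    """Retrieve the top-k activities from the case matching the goal."""
    for g, acts in CASES:
        if g == goal:
            return acts[:top_k]
    return []

def pbr_filter(activities, dislikes):
    """Preference-based reasoning: drop activities the user dislikes."""
    return [a for a in activities if a not in dislikes]

goal = rbr_goal(bmi=27.4)
recs = pbr_filter(cbr_activities(goal), dislikes={"jogging", "swimming"})
print(goal, recs)
```

In the real system the CBR stage would retrieve by similarity over full case descriptions rather than an exact goal match; the exact-match lookup here only illustrates how the three stages chain together.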

  20. Correlation Between Quality of Evidence and Number of Citations in Top 50 Cited Articles on Elbow Medial Ulnar Collateral Ligament Surgery.

    PubMed

    Jack, Robert A; Sochacki, Kyle R; Morehouse, Hannah A; McCulloch, Patrick C; Lintner, David M; Harris, Joshua D

    2018-04-01

    Several studies have analyzed the most cited articles in shoulder, elbow, pediatrics, and foot and ankle surgery. However, no study has analyzed the quality of the most cited articles in elbow medial ulnar collateral ligament (UCL) surgery. To (1) identify the top 50 most cited articles related to UCL surgery, (2) determine whether there was a correlation between the top cited articles and level of evidence, and (3) determine whether there was a correlation between study methodological quality and the top cited articles. Systematic review. Web of Science and Scopus online databases were searched to identify the top 50 cited articles in UCL surgery. Level of evidence, number of times cited, year of publication, name of journal, country of origin, and study type were recorded for each study. Study methodological quality was analyzed for each article with the Modified Coleman Methodology Score (MCMS) and the Methodological Index for Non-randomized Studies (MINORS). Correlation coefficients were calculated. The 50 most cited articles were published between 1981 and 2015. The number of citations per article ranged from 20 to 301 (mean ± SD, 71 ± 62 citations). Most articles (92%) were from the United States and were level 3 (16%), level 4 (58%), or unclassified (16%) evidence. There were no articles of level 1 evidence quality. The mean MCMS and MINORS scores were 28.1 ± 13.4 (range, 3-52) and 9.2 ± 3.6 (range, 2-19), respectively. There was no significant correlation between the mean number of citations and level of evidence or quality (rs = -0.01, P = .917), MCMS (rs = 0.09, P = .571), or MINORS (rs = -0.26, P = .089). The top 50 cited articles in UCL surgery constitute a low level of evidence and low methodological quality, including no level 1 articles. There was no significant correlation between the mean number of citations and level of evidence or study methodological quality. However, weak correlations were observed between later publication date and improved level of evidence and methodological quality.
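The rank correlations reported above (Spearman's rs) can be computed directly by correlating the ranks of the two variables. A minimal sketch with a hand-rolled Spearman coefficient (valid for untied data, with no tie correction); the citation counts and MCMS scores are hypothetical, not the study's data:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks.
    No tie correction, so intended for untied data."""
    rx = np.argsort(np.argsort(x)).astype(float)  # ranks of x
    ry = np.argsort(np.argsort(y)).astype(float)  # ranks of y
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx * ry).sum() / np.sqrt((rx ** 2).sum() * (ry ** 2).sum()))

# Hypothetical citation counts and MCMS scores for 8 articles
citations = np.array([301, 150, 95, 80, 60, 45, 30, 20])
mcms = np.array([30, 25, 41, 22, 35, 28, 45, 19])
print(round(spearman_rho(citations, mcms), 3))
```

A value near zero, as the study found for citations versus MCMS, means an article's rank by citations tells you essentially nothing about its rank by methodological quality.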

  1. Correlation Between Quality of Evidence and Number of Citations in Top 50 Cited Articles on Elbow Medial Ulnar Collateral Ligament Surgery

    PubMed Central

    Jack, Robert A.; Sochacki, Kyle R.; Morehouse, Hannah A.; McCulloch, Patrick C.; Lintner, David M.; Harris, Joshua D.

    2018-01-01

    Background: Several studies have analyzed the most cited articles in shoulder, elbow, pediatrics, and foot and ankle surgery. However, no study has analyzed the quality of the most cited articles in elbow medial ulnar collateral ligament (UCL) surgery. Purpose: To (1) identify the top 50 most cited articles related to UCL surgery, (2) determine whether there was a correlation between the top cited articles and level of evidence, and (3) determine whether there was a correlation between study methodological quality and the top cited articles. Study Design: Systematic review. Methods: Web of Science and Scopus online databases were searched to identify the top 50 cited articles in UCL surgery. Level of evidence, number of times cited, year of publication, name of journal, country of origin, and study type were recorded for each study. Study methodological quality was analyzed for each article with the Modified Coleman Methodology Score (MCMS) and the Methodological Index for Non-randomized Studies (MINORS). Correlation coefficients were calculated. Results: The 50 most cited articles were published between 1981 and 2015. The number of citations per article ranged from 20 to 301 (mean ± SD, 71 ± 62 citations). Most articles (92%) were from the United States and were level 3 (16%), level 4 (58%), or unclassified (16%) evidence. There were no articles of level 1 evidence quality. The mean MCMS and MINORS scores were 28.1 ± 13.4 (range, 3-52) and 9.2 ± 3.6 (range, 2-19), respectively. There was no significant correlation between the mean number of citations and level of evidence or quality (rs = –0.01, P = .917), MCMS (rs = 0.09, P = .571), or MINORS (rs = –0.26, P = .089). Conclusion: The top 50 cited articles in UCL surgery constitute a low level of evidence and low methodological quality, including no level 1 articles. There was no significant correlation between the mean number of citations and level of evidence or study methodological quality. 
However, weak correlations were observed for later publication date and improved level of evidence and methodological quality. PMID:29780841

  2. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. Too many effective exploits and tools exist and are easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the IT security research community, within both the black hat and white hat communities.
    Once a solid understanding of cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between those threats and the defensive capabilities of control systems can be analyzed. The results of the gap analysis drive changes in the cyber security of critical infrastructure networks to close the gap between current exploits and existing defenses. The analysis also provides defenders with an idea of how threat technology is evolving and how defenses will need to be modified to address these emerging trends.

  3. Hydroxyl Radical Modification of Collagen Type II Increases Its Arthritogenicity and Immunogenicity

    PubMed Central

    Shahab, Uzma; Ahmad, Saheem; Moinuddin; Dixit, Kiran; Habib, Safia; Alam, Khursheed; Ali, Asif

    2012-01-01

    Background The oxidation of proteins by endogenously generated free radicals causes structural modifications in the molecules that lead to generation of neo-antigenic epitopes that have implications in various autoimmune disorders, including rheumatoid arthritis (RA). Collagen induced arthritis (CIA) in rodents (rats and mice) is an accepted experimental model for RA. Methodology/Principal Findings Hydroxyl radicals were generated by the Fenton reaction. Collagen type II (CII) was modified by •OH radical (CII-OH) and analysed by ultraviolet-visible (UV-VIS), fluorescence and circular dichroism (CD) spectroscopy. The immunogenicity of native and modified CII was checked in female Lewis rats and specificity of the induced antibodies was ascertained by enzyme linked immunosorbent assay (ELISA). The extent of CIA was evaluated by visual inspection. We also estimated the oxidative and inflammatory markers in the sera of immunized rats. A slight change in the triple helical structure of CII as well as fragmentation was observed after hydroxyl radical modification. The modified CII was found to be highly arthritogenic and immunogenic as compared to the native form. The CII-OH immunized rats exhibited increased oxidative stress and inflammation as compared to the CII immunized rats in the control group. Conclusions/Significance Neo-antigenic epitopes were generated on •OH modified CII which rendered it highly immunogenic and arthritogenic as compared to the unmodified form. Since the rodent CIA model shares many features with human RA, these results illuminate the role of free radicals in human RA. PMID:22319617

  4. An immunohistochemical and fluorescence in situ hybridization-based comparison between the Oracle HER2 Bond Immunohistochemical System, Dako HercepTest, and Vysis PathVysion HER2 FISH using both commercially validated and modified ASCO/CAP and United Kingdom HER2 IHC scoring guidelines.

    PubMed

    O'Grady, Anthony; Allen, David; Happerfield, Lisa; Johnson, Nicola; Provenzano, Elena; Pinder, Sarah E; Tee, Lilian; Gu, Mai; Kay, Elaine W

    2010-12-01

Immunohistochemistry (IHC) is used as the frontline assay to determine HER2 status in invasive breast cancer patients. The aim of the study was to compare the performance of the Leica Oracle HER2 Bond IHC System (Oracle) with the most widely accepted current assay, the Dako HercepTest (HercepTest), using both commercially validated and modified ASCO/CAP and UK HER2 IHC scoring guidelines. A total of 445 breast cancer samples from 3 international clinical HER2 referral centers were stained with the 2 test systems and scored in a blinded fashion by experienced pathologists. The overall agreement between the 2 tests in a 3×3 (negative, equivocal and positive) analysis shows a concordance of 86.7% and 86.3%, respectively, when analyzed using commercially validated and modified ASCO/CAP and UK HER2 IHC scoring guidelines. There is good concordance between the Oracle and the HercepTest. The advantages of a complete, fully automated test such as the Oracle include standardization of key analytical factors and improved turnaround time. The implementation of the modified ASCO/CAP and UK HER2 IHC scoring guidelines has minimal effect on either assay's interpretation, showing that Oracle can be used as a methodology for accurately determining HER2 IHC status in formalin-fixed, paraffin-embedded breast cancer tissue.

  5. Optimization of tetracycline hydrochloride adsorption on amino modified SBA-15 using response surface methodology.

    PubMed

    Hashemikia, Samaneh; Hemmatinejad, Nahid; Ahmadi, Ebrahim; Montazer, Majid

    2015-04-01

Several researchers have focused on the preparation of mesoporous silica drug carriers with high loading efficiency to control or sustain drug release; highly drug-loaded carriers are used to minimize the time of drug intake. In this study, amino-modified SBA-15 was synthesized by grafting with aminopropyltriethoxysilane and then loaded with tetracycline hydrochloride. The drug loading was optimized using the response surface method, considering factors including the drug-to-silica ratio, operation time, and temperature. The drug-to-silica ratio proved to be the most influential factor on drug loading yield. Further, a quadratic polynomial equation was developed to predict the loading percentage, and the experimental results showed reasonable agreement with the predicted values. The modified and drug-loaded mesoporous particles were characterized by FT-IR, SEM, TEM, X-ray diffraction (XRD), elemental analysis and N2 adsorption-desorption. The release profiles of tetracycline-loaded particles were studied at different pH values. The Higuchi equation was also used to analyze the release profile and to evaluate the kinetics of drug release; the release rate followed the conventional Higuchi model and could be controlled by the amino-functionalized SBA-15. The drug delivery system based on amino-modified SBA-15 thus exhibits features appropriate for an antibacterial drug delivery system with effective management of drug adsorption and release. Copyright © 2014 Elsevier Inc. All rights reserved.
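The Higuchi analysis mentioned in the abstract fits cumulative release against the square root of time, Q(t) = kH·√t. A minimal sketch of such a fit is given below; the release data are hypothetical (the record does not reproduce the paper's measurements), so only the fitting procedure itself is illustrated.

```python
import math

def fit_higuchi(times, release):
    """Least-squares fit of the Higuchi model Q(t) = kH * sqrt(t):
    a regression through the origin on the sqrt-time axis.
    Returns (kH, R^2)."""
    x = [math.sqrt(t) for t in times]
    kH = sum(xi * qi for xi, qi in zip(x, release)) / sum(xi * xi for xi in x)
    ss_res = sum((qi - kH * xi) ** 2 for xi, qi in zip(x, release))
    mean_q = sum(release) / len(release)
    ss_tot = sum((qi - mean_q) ** 2 for qi in release)
    return kH, 1.0 - ss_res / ss_tot

# Hypothetical cumulative-release data: sampling times (h) vs. % released
t = [0.5, 1, 2, 4, 8, 12]
q = [14.0, 20.1, 28.3, 40.2, 56.4, 69.5]
kH, r2 = fit_higuchi(t, q)   # kH near 20, R^2 near 1 for this data
```

An R² close to 1 on the √t axis is the usual evidence that release follows the Higuchi (diffusion-controlled) model rather than, say, zero-order kinetics.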

  6. Electrosynthesis and characterization of nanostructured polyquinone for use in detection and quantification of naturally occurring dsDNA.

    PubMed

    Hernández, Loreto A; Del Valle, María A; Armijo, Francisco

    2016-05-15

The detection of naturally occurring deoxyribonucleic acid (DNA) has become a subject of study because of the prospect of sensing genetic material for the detection of future diseases. With this in mind, and to provide new measuring strategies, in the current work a low-cost electrode was prepared, modified with poly(1-amino-9,10-anthraquinone) nanowires using a SiO2 template; the assembly was then further modified by covalently attaching ssDNA strands. Notably, all of this was accomplished using solely electrochemical techniques, according to a methodology developed for this purpose. SEM images of the modified surface show high order and homogeneity in the distribution of the nanowires over the electrode surface. In turn, after hybridization with the complementary strand, the voltammetric responses corroborate a linear relationship between hybridization at different DNA concentrations and the normalized current response, yielding a limit of detection (LOD) of 5.7·10⁻¹² g L⁻¹ and a limit of quantification (LOQ) of 1.9·10⁻¹¹ g L⁻¹. The working dynamic range is between 8.5·10⁻⁹ and 1.4·10⁻⁷ g L⁻¹, with a correlation coefficient of 0.9998. The successful preparation of the modified electrode leads to the conclusion that the high order reached by the nanostructures guides the subsequent covalent attachment of single-stranded DNA (ssDNA), which after hybridization with its complementary strand brings about a considerable current increase. This result suggests the biosensor is promising for use with real samples. Copyright © 2015 Elsevier B.V. All rights reserved.
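Detection and quantification limits of the kind quoted above are commonly estimated from a calibration line as LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope. The abstract does not state the paper's exact procedure, so the following is a generic sketch of that conventional calculation with made-up calibration data.

```python
import math

def calibration(conc, signal):
    """Ordinary least-squares calibration line; returns the slope, the
    intercept and the residual standard deviation s_y."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sum((x - mx) * (y - my) for x, y in zip(conc, signal)) / sxx
    intercept = my - slope * mx
    s_y = math.sqrt(sum((y - (slope * x + intercept)) ** 2
                        for x, y in zip(conc, signal)) / (n - 2))
    return slope, intercept, s_y

def lod_loq(slope, s_y):
    """Conventional ICH-style estimates: LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * s_y / slope, 10.0 * s_y / slope

# Hypothetical calibration points (concentration, instrument response)
conc = [1.0, 2.0, 3.0, 4.0, 5.0]
signal = [2.1, 3.9, 6.2, 7.8, 10.1]
slope, intercept, s_y = calibration(conc, signal)
lod, loq = lod_loq(slope, s_y)
assert lod < loq   # LOQ is always ~3x the LOD under this definition
```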

  7. A novel MALDI–TOF based methodology for genotyping single nucleotide polymorphisms

    PubMed Central

    Blondal, Thorarinn; Waage, Benedikt G.; Smarason, Sigurdur V.; Jonsson, Frosti; Fjalldal, Sigridur B.; Stefansson, Kari; Gulcher, Jeffery; Smith, Albert V.

    2003-01-01

A new MALDI–TOF based detection assay was developed for analysis of single nucleotide polymorphisms (SNPs). It is a significant modification of the classic three-step minisequencing method, which includes a polymerase chain reaction (PCR), removal of excess nucleotides and primers, followed by primer extension in the presence of dideoxynucleotides using a modified thermostable DNA polymerase. The key feature of this novel assay is reliance upon deoxynucleotide mixes lacking one of the nucleotides at the polymorphic position. During primer extension in the presence of depleted nucleotide mixes, standard thermostable DNA polymerases dissociate from the template at positions requiring a depleted nucleotide; this principle was harnessed to create a genotyping assay. The assay design requires a primer-extension primer having its 3′-end one nucleotide upstream from the interrogated site. The assay further utilizes the same DNA polymerase in both PCR and the primer extension step. This not only simplifies the assay but also greatly reduces the cost per genotype compared to minisequencing methodology. We demonstrate accurate genotyping using this methodology for two SNPs run in both singleplex and duplex reactions. We term this assay nucleotide depletion genotyping (NUDGE). Nucleotide depletion genotyping could be extended to other genotyping assays based on primer extension, such as detection by gel or capillary electrophoresis. PMID:14654708

  8. Vortex pinning landscape in MOD-TFA YBCO nanostructured films

    NASA Astrophysics Data System (ADS)

    Gutierrez, J.; Puig, T.; Pomar, A.; Obradors, X.

    2008-03-01

A methodology of general validity for studying vortex pinning in YBCO based on Jc transport measurements is described. It permits the identification, separation and quantification of three basic vortex pinning contributions associated with anisotropic-strong, isotropic-strong and isotropic-weak pinning centers, from which the corresponding vortex pinning phase diagrams are built up. This methodology is applied to new solution-derived YBCO nanostructured films, including controlled interfacial pinning by the growth of nanostructured templates by means of self-assembled processes [1] and YBCO-BaZrO3 nanocomposites prepared from modified solution precursors. Applying the methodology and comparing with a standard solution-derived YBCO film [2] enables us to identify the nature and the effect of the additional pinning centers induced. The nanostructured template films show strongly increased c-axis pinning, which controls most of the pinning phase diagram. The nanocomposites, on the other hand, have so far achieved the highest pinning properties among high-Tc superconductors [3], with the isotropic-strong defect contribution being the origin of their unique properties. [1] M. Gibert et al, Adv. Mat. vol 19, p. 3937 (2007) [2] Puig, T. et al, SuST EUCAS 2007 (to be published) [3] J. Gutierrez et al, Nat. Mat. vol. 6, p. 367 (2007) * Work supported by HIPERCHEM, NANOARTIS and MAT2005-02047

  9. Documenting the use of the Long Term Resource Monitoring element’s fish monitoring methodologies throughout the Midwest

    USGS Publications Warehouse

    Solomon, Levi E.; Casper, Andrew F.

    2016-08-16

The Upper Mississippi River Restoration (UMRR) Program's Long Term Resource Monitoring (LTRM) element is designed to monitor and assess long-term trends in the Upper Mississippi River System (UMRS). To accomplish this, standardized methods are used that allow for comparisons across pools and rivers. In recent years, other projects and other agencies have adopted the LTRM fish methodologies for use outside the UMRR. To determine how widespread the use of the Fish Component's methods is, a twelve-question survey was delivered via SurveyMonkey.com throughout the states comprising the American Fisheries Society (AFS) North Central Division and the Upper Mississippi River Conservation Committee. Approximately 2,000 professionals were reached, with ≈11 percent participating. Results indicate that nearly all (95 percent) respondents use standardized methods in their sampling and 48 percent are familiar with the LTRM fish methodologies. Roughly one-third (35 percent) of all respondents have used the methods in the past, and most (78 percent) of those have modified the methods to suit the information needs specific to their fishery. Results indicate that the LTRM methods have indeed spread outside the UMRR and are now a well-known and potentially widely used technique to sample fish communities.

  10. Defined surface immobilization of glycosaminoglycan molecules for probing and modulation of cell-material interactions.

    PubMed

    Wang, Kai; Luo, Ying

    2013-07-08

    As one important category of biological molecules on the cell surface and in the extracellular matrix (ECM), glycosaminoglycans (GAGs) have been widely studied for biomedical applications. With the understanding that the biological functions of GAGs are driven by the complex dynamics of physiological and pathological processes, methodologies are desired to allow the elucidation of cell-GAG interactions with molecular level precision. In this study, a microtiter plate-based system was devised through a new surface modification strategy involving polydopamine (PDA) and GAG molecules functionalized with hydrazide chemical groups. A small library of GAGs including hyaluronic acid (with different molecular weights), heparin, and chondroitin sulfate was successfully immobilized via defined binding sites onto the microtiter plate surface under facile aqueous conditions. The methodology then allowed parallel studies of the GAG-modified surfaces in a high-throughput format. The results show that immobilized GAGs possess distinct properties to mediate protein adsorption, cell adhesion, and inflammatory responses, with each property showing dependence on the type and molecular weight of specific GAG molecules. The PDA-assisted immobilization of hydrazide-functionalized GAGs allows biomimetic attachment of GAG molecules and retains their bioactivity, providing a new methodology to systematically probe fundamental cell-GAG interactions to modulate the bioactivity and biocompatibility of biomaterials.

  11. Decision to abort after a prenatal diagnosis of sex chromosome abnormality: a systematic review of the literature.

    PubMed

    Jeon, Kwon Chan; Chen, Lei-Shih; Goodson, Patricia

    2012-01-01

    We performed a systematic review of factors affecting parental decisions to continue or terminate a pregnancy after prenatal diagnosis of a sex chromosome abnormality, as reported in published studies from 1987 to May 2011. Based on the Matrix Method for systematic reviews, 19 studies were found in five electronic databases, meeting specific inclusion/exclusion criteria. Abstracted data were organized in a matrix. Alongside the search for factors influencing parental decisions, each study was judged on its methodological quality and assigned a methodological quality score. Decisions either to terminate or to continue a sex chromosome abnormality-affected pregnancy shared five similar factors: specific type of sex chromosome abnormality, gestational week at diagnosis, parents' age, providers' genetic expertise, and number of children/desire for (more) children. Factors unique to termination decisions included parents' fear/anxiety and directive counseling. Factors uniquely associated with continuation decisions were parents' socioeconomic status and ethnicity. The studies' average methodological quality score was 10.6 (SD = 1.67; range, 8-14). Findings from this review can be useful in adapting and modifying guidelines for genetic counseling after prenatal diagnosis of a sex chromosome abnormality. Moreover, improving the quality of future studies on this topic may allow clearer understanding of the most influential factors affecting parental decisions.

  12. Pharmacokinetic Profiling of Conjugated Therapeutic Oligonucleotides: A High-Throughput Method Based Upon Serial Blood Microsampling Coupled to Peptide Nucleic Acid Hybridization Assay.

    PubMed

    Godinho, Bruno M D C; Gilbert, James W; Haraszti, Reka A; Coles, Andrew H; Biscans, Annabelle; Roux, Loic; Nikan, Mehran; Echeverria, Dimas; Hassler, Matthew; Khvorova, Anastasia

    2017-12-01

Therapeutic oligonucleotides, such as small interfering RNAs (siRNAs), hold great promise for the treatment of incurable genetically defined disorders by targeting cognate toxic gene products for degradation. To achieve meaningful tissue distribution and efficacy in vivo, siRNAs must be conjugated or formulated. A clear understanding of the pharmacokinetic (PK)/pharmacodynamic behavior of these compounds is necessary to optimize and characterize the performance of therapeutic oligonucleotides in vivo. In this study, we describe a simple and reproducible methodology for the evaluation of in vivo blood/plasma PK profiles and tissue distribution of oligonucleotides. The method is based on serial blood microsampling from the saphenous vein, coupled to a peptide nucleic acid hybridization assay for quantification of guide strands. Performed with a minimal number of animals, this method allowed unequivocal detection and sensitive quantification without the need for amplification or further modification of the oligonucleotides. Using this methodology, we compared plasma clearances and tissue distribution profiles of two different hydrophobically modified siRNAs (hsiRNAs). Notably, cholesterol-hsiRNA presented slow plasma clearance and mainly accumulated in the liver, whereas phosphocholine-docosahexaenoic acid-hsiRNA was rapidly cleared from the plasma and preferentially accumulated in the kidney. These data suggest that the PK/biodistribution profiles of modified hsiRNAs are determined by the chemical nature of the conjugate. Importantly, the method described in this study constitutes a simple platform for pilot assessments of basic clearance and tissue distribution profiles, which can be broadly applied to the evaluation of new chemical variants of siRNAs and micro-RNAs.

  13. E-learning, dual-task, and cognitive load: The anatomy of a failed experiment.

    PubMed

    Van Nuland, Sonya E; Rogers, Kem A

    2016-01-01

The rising popularity of commercial anatomy e-learning tools has been sustained, in part, by increased annual enrollment and a reduction in laboratory hours across educational institutions. While e-learning tools continue to gain popularity, the research methodologies used to investigate their impact on learning remain imprecise. As new user interfaces are introduced, it is critical to understand how functionality can influence the load placed on a student's memory resources, also known as cognitive load. To study cognitive load, a dual-task paradigm wherein a learner performs two tasks simultaneously is often used; however, its application within educational research remains uncommon. Using previous paradigms as a guide, a dual-task methodology was developed to assess the cognitive load imposed by two commercial anatomical e-learning tools. Results indicate that the standard dual-task paradigm, as described in the literature, is insensitive to the cognitive load disparities across e-learning tool interfaces. Confounding variables included automation of responses, task performance tradeoff, and poor understanding of primary-task cognitive load requirements, leading to unreliable quantitative results. By modifying the secondary task from a basic visual response to a more cognitively demanding task, such as a modified Stroop test, the automation of secondary-task responses can be reduced. Furthermore, by recording baseline measures for the primary task as well as the secondary task, it is possible for task performance tradeoff to be detected. Lastly, it is imperative that the cognitive load of the primary task be designed such that it does not overwhelm the individual's ability to learn new material. © 2015 American Association of Anatomists.

  14. Descriptive analysis of bacon smoked with Brazilian woods from reforestation: methodological aspects, statistical analysis, and study of sensory characteristics.

    PubMed

    Saldaña, Erick; Castillo, Luiz Saldarriaga; Sánchez, Jorge Cabrera; Siche, Raúl; de Almeida, Marcio Aurélio; Behrens, Jorge H; Selani, Miriam Mabel; Contreras-Castillo, Carmen J

    2018-06-01

The aim of this study was to perform a descriptive analysis (DA) of bacons smoked with woods from reforestation and with liquid smokes in order to investigate their sensory profile. Six samples of bacon were selected: three bacons smoked with different wood species (Eucalyptus citriodora, Acacia mearnsii, and Bambusa vulgaris), two artificially smoked samples (liquid smoke) and one negative control (unsmoked bacon). Additionally, a commercial bacon sample was also evaluated. DA was developed successfully, presenting good performance in terms of discrimination, consensus and repeatability. The study revealed that the smoking process modified the sensory profile by intensifying "saltiness" and differentiating the unsmoked from the smoked samples. The results from the current research represent the first methodological development of descriptive analysis of bacon and may be used by food companies and other stakeholders to understand the changes in sensory characteristics of bacon due to the traditional smoking process. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Algorithm-Based Fault Tolerance for Numerical Subroutines

    NASA Technical Reports Server (NTRS)

    Tumon, Michael; Granat, Robert; Lou, John

    2007-01-01

A software library implements a new methodology of detecting faults in numerical subroutines, thus enabling application programs that contain the subroutines to recover transparently from single-event upsets. The software library in question is fault-detecting middleware that is wrapped around the numerical subroutines. Conventional serial versions (based on LAPACK and FFTW) and a parallel version (based on ScaLAPACK) exist. The source code of the application program that contains the numerical subroutines is not modified, and the middleware is transparent to the user. The methodology used is a type of algorithm-based fault tolerance (ABFT). In ABFT, a checksum is computed before a computation and compared with the checksum of the computational result; an error is declared if the difference between the checksums exceeds some threshold. Novel normalization methods are used in the checksum comparison to ensure correct fault detection independent of algorithm inputs. In tests of this software reported in the peer-reviewed literature, this library was shown to enable detection of 99.9 percent of significant faults while generating no false alarms.
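The checksum scheme described above can be illustrated with matrix multiplication, the classic ABFT example: a column-checksum row is appended to one operand and a row-checksum column to the other, and the checksums carried through the product are re-verified afterward. The sketch below is a minimal pure-Python illustration, not the library's actual implementation; its input-normalization step is simplified to dividing by the largest result magnitude.

```python
def matmul(A, B):
    """Plain matrix product of nested lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def checksum_extend(A, B):
    """Append a column-checksum row to A and a row-checksum column to B."""
    Ac = A + [[sum(col) for col in zip(*A)]]
    Bc = [row + [sum(row)] for row in B]
    return Ac, Bc

def checksum_verify(Cc, tol=1e-8):
    """Recompute the product's row/column sums and compare them with the
    carried checksums; report a fault if the normalized mismatch exceeds tol."""
    C = [row[:-1] for row in Cc[:-1]]
    scale = max(1.0, max(abs(x) for row in C for x in row))
    row_err = max(abs(cs - sum(col))
                  for cs, col in zip(Cc[-1][:-1], zip(*C)))
    col_err = max(abs(row[-1] - sum(row[:-1])) for row in Cc[:-1])
    return C, (row_err / scale > tol or col_err / scale > tol)

A = [[0.0, 1.0, 2.0], [3.0, 4.0, 5.0]]
B = [[0.0, 1.0], [2.0, 3.0], [4.0, 5.0]]
Ac, Bc = checksum_extend(A, B)
Cc = matmul(Ac, Bc)          # one extended multiply carries the checksums
C, fault = checksum_verify(Cc)
assert C == matmul(A, B) and not fault

Cc[0][0] += 1.0              # simulate a single-event upset in the result
assert checksum_verify(Cc)[1]
```

The appeal of ABFT, as the abstract notes, is that the wrapped subroutine's source is untouched: only the operands are extended before the call and checked after it.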

  16. Modulation of the phenolic composition and colour of red wines subjected to accelerated ageing by controlling process variables.

    PubMed

    González-Sáiz, J M; Esteban-Díez, I; Rodríguez-Tecedor, S; Pérez-Del-Notario, N; Arenzana-Rámila, I; Pizarro, C

    2014-12-15

    The aim of the present work was to evaluate the effect of the main factors conditioning accelerated ageing processes (oxygen dose, chip dose, wood origin, toasting degree and maceration time) on the phenolic and chromatic profiles of red wines by using a multivariate strategy based on experimental design methodology. The results obtained revealed that the concentrations of monomeric anthocyanins and flavan-3-ols could be modified through the application of particular experimental conditions. This fact was particularly remarkable since changes in phenolic profile were closely linked to changes observed in chromatic parameters. The main strength of this study lies in the possibility of using its conclusions as a basis to make wines with specific colour properties based on quality criteria. To our knowledge, the influence of such a large number of alternative ageing parameters on wine phenolic composition and chromatic attributes has not been studied previously using a comprehensive experimental design methodology. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Reconstruction and systemization of the methodologies for strategic environmental assessment in Taiwan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liou, M.-L.; Yeh, S.-C.; Yu, Y.-H.

    2006-03-15

This paper discusses the current SEA procedures and assessment methodologies, aiming to propose strategies that can lead to effective improvement in a newly industrialized Asian country, Taiwan. Institutional and practical problems with regard to the regulations and tools of SEA in Taiwan are compared to those in other countries. According to the research results, it is suggested that extra evaluation processes be incorporated into the current assessment procedures to improve their scientific validity and integrity. Moreover, it is also suggested that sustainability appraisal approaches be included in the SEA framework. In this phase, revised evaluation indicators associated with corresponding targets can be the first attempt at modifying the SEA system. It is believed that these measures can promote operability in practice and also lead the whole assessment procedure in a direction closer to sustainable development. The trail that Taiwan has followed can help other countries that are going to adopt SEA to find a more effective and efficient way forward.

  18. Complex mediascapes, complex realities: critically engaging with biotechnology debates in Ghana.

    PubMed

    Rock, Joeva

    2018-01-01

    The recent increase in research and commercialization of genetically modified (GM) crops in Africa has resulted in considerable and understandable interest from farmers, scholars, and practitioners. However, messy situations are often hard to critically engage in from afar, and the recent article published by Braimah et al. [(2017). Debated agronomy: Public discourse and the future of biotechnology policy in Ghana. Global Bioethics . doi:10.1080/11287462.2016.1261604] presents certain claims that further obfuscate - rather than clarify - an already complex landscape. In this commentary I first seek to clarify particular details of the Ghanaian "GMO" (as GM crops are colloquially called in Ghana) story with particular focus on certain actors and their stances. Next, I highlight some methodological shortcomings of Debated Agronomy and correct certain dubious quotations and claims. Finally, I suggest a more ethnographically and discourse-focused methodology to gain much needed insight into how Ghanaians are actively molding, contesting, and questioning GM discourse, funding, and use.

  19. Developing a Methodology for Eliciting Subjective Probability Estimates During Expert Evaluations of Safety Interventions: Application for Bayesian Belief Networks

    NASA Technical Reports Server (NTRS)

Wiegmann, Douglas A.

    2005-01-01

The NASA Aviation Safety Program (AvSP) has defined several products that will potentially modify airline and/or ATC operations, enhance aircraft systems, and improve the identification of potentially hazardous situations within the National Airspace System (NAS). Consequently, there is a need to develop methods for evaluating the potential safety benefit of each of these intervention products so that resources can be effectively invested. One such approach elicits expert judgments to develop Bayesian Belief Networks (BBNs) that model the potential impact that specific interventions may have. Specifically, the present report summarizes methodologies for improving the elicitation of probability estimates during expert evaluations of AvSP products for use in BBNs. The work involved joint efforts between Professor James Luxhoj from Rutgers University and researchers at the University of Illinois. The Rutgers project to develop BBNs received funding from NASA under the title "Probabilistic Decision Support for Evaluating Technology Insertion and Assessing Aviation Safety System Risk." The present project was funded separately but supported the existing Rutgers program.

  20. Complex mediascapes, complex realities: critically engaging with biotechnology debates in Ghana

    PubMed Central

    2018-01-01

    ABSTRACT The recent increase in research and commercialization of genetically modified (GM) crops in Africa has resulted in considerable and understandable interest from farmers, scholars, and practitioners. However, messy situations are often hard to critically engage in from afar, and the recent article published by Braimah et al. [(2017). Debated agronomy: Public discourse and the future of biotechnology policy in Ghana. Global Bioethics. doi:10.1080/11287462.2016.1261604] presents certain claims that further obfuscate – rather than clarify – an already complex landscape. In this commentary I first seek to clarify particular details of the Ghanaian “GMO” (as GM crops are colloquially called in Ghana) story with particular focus on certain actors and their stances. Next, I highlight some methodological shortcomings of Debated Agronomy and correct certain dubious quotations and claims. Finally, I suggest a more ethnographically and discourse-focused methodology to gain much needed insight into how Ghanaians are actively molding, contesting, and questioning GM discourse, funding, and use. PMID:29887770

  1. [Coronary artery disease--relevance of total coronary revascularization on the incidence of malignant arrhythmias].

    PubMed

    Brandt, A; Gulba, D C

    2006-12-01

Myocardial ischemia induces redistribution of different ions (H(+), K(+), Na(+), Ca(++)) across the cardiomyocyte membrane, as well as the loss of intracellular ATP content. This results in changes in the electrical properties including shortening of the action potential, appearance of delayed afterpotentials, and a modified refractoriness of the cardiomyocyte. These changes may induce or support malignant cardiac arrhythmias. Supersensitivity of sympathetically denervated myocardium may further support the electrical instability of ischemic myocardium. A variety of studies indicates that patients with coronary artery disease who develop complex arrhythmias during or after exercise bear a substantially increased risk of sudden cardiac death. Other studies report arrhythmic stabilization and reduced mortality when patients with reversible myocardial ischemia receive complete revascularization. However, none of these studies is without methodological flaws. Due to the lack of methodologically sound studies in sufficiently large patient cohorts, the question of whether complete coronary revascularization improves the prognosis of patients with coronary artery disease, and which strategy (medical, interventional, or surgical) warrants the best outcomes, remains open.

  2. Expert Consensus for Discharge Referral Decisions Using Online Delphi

    PubMed Central

    Bowles, Kathy H.; Holmes, John H.; Naylor, Mary D.; Liberatore, Matthew; Nydick, Robert

    2003-01-01

This paper describes the results of using a modified Delphi approach designed to achieve consensus from eight discharge planning experts regarding the decision to refer hospitalized older adults for post-discharge follow-up. Experts reviewed 150 cases using an online website designed to facilitate their interaction and efforts to reach agreement on the need for a referral for post-discharge care and the appropriate site for such care. In contrast to an average of eight weeks to complete just 50 cases using the traditional mail method, the first online Delphi round for 150 cases was completed in six weeks. Data provided by experts suggest that online Delphi is a time-efficient and acceptable methodology for reaching group consensus. Other benefits include instant access to Delphi decision results, live knowledge of the time requirements and progress of each expert, and cost savings in postage, paper, copying, and storage of paper documents. This online Delphi methodology is highly recommended. PMID:14728143
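The consensus bookkeeping in a Delphi round like the one above can be sketched in a few lines. The study's actual agreement rule is not described in the abstract, so the 80% modal-vote threshold, the case identifiers, and the referral categories below are all illustrative assumptions.

```python
from collections import Counter

def round_consensus(ratings, threshold=0.8):
    """One Delphi round's bookkeeping: a case is settled when the modal
    referral decision reaches `threshold` of the expert panel; the rest
    are returned for re-rating in the next round."""
    settled, still_open = {}, []
    for case, votes in ratings.items():
        decision, n = Counter(votes).most_common(1)[0]
        if n / len(votes) >= threshold:
            settled[case] = decision
        else:
            still_open.append(case)
    return settled, still_open

# Hypothetical ratings from a panel of eight experts
ratings = {
    "case-001": ["home care"] * 7 + ["no referral"],
    "case-002": ["home care"] * 4 + ["skilled nursing"] * 4,
}
settled, still_open = round_consensus(ratings)
assert settled == {"case-001": "home care"}
assert still_open == ["case-002"]   # split panel: goes to the next round
```

An online implementation would simply re-present `still_open` cases (with the round's aggregate results) to the panel, which is what makes the web format so much faster than mail rounds.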

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kapuscinski, A.R.; Hallerman, E.M.

Among the many methodologies encompassing biotechnology in aquaculture, this report addresses: the production of genetically modified aquatic organisms (aquatic GMOs) by gene transfer, chromosome set manipulation, or hybridization or protoplast fusion between species; new health management tools, including DNA-based diagnostics and recombinant DNA vaccines; marker-assisted selection; cryopreservation; and stock marking. These methodologies offer a wide range of potential economic benefits for aquaculture by providing improved or new means to affect the mix of necessary material inputs, enhance production efficiency, or improve product quality. Advances in aquaculture through biotechnology could stimulate growth of the aquaculture industry to supply a larger proportion of consumer demand, and thereby reduce pressure on natural stocks from over-harvest. Judicious application of gamete cryopreservation and chromosome set manipulations to achieve sterilization could reduce environmental risks of some aquaculture operations. Given the significant losses to disease in many aquaculture enterprises, the potential benefits of DNA-based health management tools are very high and appear to pose no major environmental risks or social concerns.

  4. Intermatrix Synthesis as a rapid, inexpensive and reproducible methodology for the in situ functionalization of nanostructured surfaces with quantum dots

    NASA Astrophysics Data System (ADS)

    Bastos-Arrieta, Julio; Muñoz, Jose; Stenbock-Fermor, Anja; Muñoz, Maria; Muraviev, Dmitri N.; Céspedes, Francisco; Tsarkova, Larisa A.; Baeza, Mireia

    2016-04-01

The Intermatrix Synthesis (IMS) technique has proven to be a valid methodology for the in situ incorporation of quantum dots (QDs) into a wide range of nanostructured surfaces for the preparation of advanced hybrid nanomaterials. This communication reports recent advances in the application of IMS to the synthesis of CdS QDs with favourable distribution on sulfonated polyetherether ketone (SPEEK) membrane thin films (TFs), multiwall carbon nanotubes (MWCNTs) and nanodiamonds (NDs). The synthetic route takes advantage of the ion-exchange functionality of the reactive surfaces for loading the QD precursor and subsequent QD formation by precipitation. The benefits of such modified nanomaterials were studied using CdS-QDs@MWCNTs hybrid nanomaterials, which have been used as conducting filler for the preparation of electrochemical nanocomposite sensors that present electrocatalytic properties. Finally, the optical properties of the QDs contained on MWCNTs could enable a new procedure for the analytical detection of nanostructured carbon allotropes in water.

  5. Developing Methodologies to Find Abbreviated Laboratory Test Names in Narrative Clinical Documents by Generating High Quality Q-Grams.

    PubMed

    Kim, Kyungmo; Choi, Jinwook

    2017-01-01

    Laboratory test names are used as basic information to diagnose diseases. However, this kind of medical information is usually written in natural language. Lexicon-based methods have been good solutions for finding this information, but they cannot find abbreviated expressions that are absent from the lexicon, such as "neuts" for "neutrophils". To address this issue, similar-word matching can be used; however, it can produce a significant number of false positives, and its processing time grows with the size of the term set. Therefore, we propose a novel q-gram-based algorithm, named modified triangular area filtering, to find abbreviated laboratory test terms in clinical documents while minimizing the loss of the lexicon's precision. In addition, the method found the terms within a reasonable processing time. The results show that this method achieves 92.54% precision, 87.72% recall, and a 90.06% F1-score on test sets when the edit distance threshold (τ) = 3.
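    As a rough illustration of the general approach described above, the sketch below filters lexicon terms by shared q-grams and then verifies candidates with an edit-distance threshold. It is a generic q-gram pipeline, not the paper's modified triangular area filtering algorithm; the lexicon and padding scheme are invented for the example.

```python
def qgrams(term, q=2):
    """Return the set of q-grams of a term, padded at the boundaries."""
    padded = "#" * (q - 1) + term.lower() + "#" * (q - 1)
    return {padded[i:i + q] for i in range(len(padded) - q + 1)}

def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance (one-row variant)."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
                                     dp[j - 1] + 1,    # insertion
                                     prev + (ca != cb))  # substitution
    return dp[-1]

def find_candidates(token, lexicon, tau=3):
    """Keep lexicon terms sharing at least one q-gram, then verify by edit distance."""
    tg = qgrams(token)
    return [t for t in lexicon
            if tg & qgrams(t) and edit_distance(token, t) <= tau]

lexicon = ["neutrophils", "neuts", "hemoglobin", "platelets"]  # invented lexicon
print(find_candidates("neut", lexicon))  # → ['neuts']
```

The q-gram intersection acts as a cheap pre-filter so the quadratic edit-distance computation only runs on a handful of plausible candidates.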

  6. Augmenting Probabilistic Risk Assessment with Malevolent Initiators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith; David Schwieder

    2011-11-01

    As commonly practiced, the use of probabilistic risk assessment (PRA) in nuclear power plants only considers accident initiators such as natural hazards, equipment failures, and human error. Malevolent initiators are ignored in PRA and are instead considered the domain of physical security, which uses vulnerability assessment based on an officially specified threat (the design basis threat). This paper explores the implications of augmenting and extending existing PRA models by considering new and modified scenarios resulting from malevolent initiators. Teaming the augmented PRA models with conventional vulnerability assessments can cost-effectively enhance the security of a nuclear power plant. This methodology is useful for operating plants, as well as in the design of new plants. For the methodology, we have proposed an approach that builds on and extends the practice of PRA for nuclear power plants for security-related issues. Rather than only considering 'random' failures, we demonstrated a framework that is able to represent and model malevolent initiating events and associated plant impacts.

  7. Automated chemical kinetic modeling via hybrid reactive molecular dynamics and quantum chemistry simulations.

    PubMed

    Döntgen, Malte; Schmalz, Felix; Kopp, Wassja A; Kröger, Leif C; Leonhard, Kai

    2018-06-13

    An automated scheme for obtaining chemical kinetic models from scratch using reactive molecular dynamics and quantum chemistry simulations is presented. This methodology combines the phase space sampling of reactive molecular dynamics with the thermochemistry and kinetics prediction capabilities of quantum mechanics. This scheme provides the NASA polynomial and modified Arrhenius equation parameters for all species and reactions that are observed during the simulation and supplies them in the ChemKin format. The ab initio level of theory for predictions is easily exchangeable and the presently used G3MP2 level of theory is found to reliably reproduce hydrogen and methane oxidation thermochemistry and kinetics data. Chemical kinetic models obtained with this approach are ready-to-use for, e.g., ignition delay time simulations, as shown for hydrogen combustion. The presented extension of the ChemTraYzer approach can be used as a basis for methodologically advancing chemical kinetic modeling schemes and as a black-box approach to generate chemical kinetic models.
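    The modified Arrhenius form mentioned above, k(T) = A·T^n·exp(−Ea/(RT)), is the standard three-parameter rate expression written into ChemKin-format files. A minimal evaluation sketch, with illustrative parameter values that are not taken from the paper:

```python
import math

R = 8.314  # J/(mol*K), universal gas constant

def modified_arrhenius(T, A, n, Ea):
    """Rate coefficient k(T) = A * T**n * exp(-Ea / (R*T)).

    A: pre-exponential factor, n: temperature exponent, Ea: activation
    energy in J/mol. With n = 0 this reduces to the classic Arrhenius law.
    """
    return A * T**n * math.exp(-Ea / (R * T))

# Illustrative parameters for a hypothetical reaction
A, n, Ea = 1.0e10, 0.5, 50_000.0
for T in (800.0, 1200.0, 1600.0):
    print(f"T = {T:6.0f} K  ->  k = {modified_arrhenius(T, A, n, Ea):.3e}")
```

For an activated reaction (Ea > 0) the rate coefficient grows steeply with temperature, which is the behavior ignition-delay simulations depend on.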

  8. Chip-based generation of carbon nanodots via electrochemical oxidation of screen printed carbon electrodes and the applications for efficient cell imaging and electrochemiluminescence enhancement

    NASA Astrophysics Data System (ADS)

    Xu, Yuanhong; Liu, Jingquan; Zhang, Jizhen; Zong, Xidan; Jia, Xiaofang; Li, Dan; Wang, Erkang

    2015-05-01

    A portable lab-on-a-chip methodology to generate ionic liquid-functionalized carbon nanodots (CNDs) was developed via electrochemical oxidation of screen-printed carbon electrodes. The CNDs can be successfully applied for efficient cell imaging and for the fabrication of solid-state electrochemiluminescence sensors on paper-based chips. Electronic supplementary information (ESI) available: experimental section; XPS spectra of the as-prepared CNDs after dialysis for 72 hours; LSCM images showing time-dependent fluorescence signals of HeLa cells treated with the as-prepared CNDs; tripropylamine analysis using the Nafion/CNDs-modified ECL sensor. See DOI: 10.1039/c5nr01765c

  9. Updates on the Methodological Approaches for Carrying Out an In-Depth Study of the Cardiac Conduction System and the Autonomic Nervous System of Victims of Sudden Unexplained Fetal and Infant Death.

    PubMed

    Alfonsi, Graziella; Crippa, Marina

    2016-01-01

    This article contains a set of protocols for histopathological techniques that can be used for carrying out in-depth studies of cases of sudden infant death syndrome and sudden intrauterine unexplained fetal death syndrome. To enable researchers to advance hypotheses regarding the causes of the unexpected death of infants and fetuses, the authors propose three innovative and accurate methodologies for studying the cardiac conduction system, the peripheral cardiac nervous system, and the central autonomic nervous system. Over the years, these protocols have been developed, modified, and improved on a vast number of cases, enabling pathologists to carry out microscopic analyses of the structures that regulate life and to highlight all the possible morphological substrates of the pathophysiological mechanisms that may underlie these syndromes. In memory of our research professor Lino Rossi (1923-2004).

  10. Evolving RBF neural networks for adaptive soft-sensor design.

    PubMed

    Alexandridis, Alex

    2013-12-01

    This work presents an adaptive framework for building soft-sensors based on radial basis function (RBF) neural network models. The adaptive fuzzy means algorithm is utilized in order to evolve an RBF network, which approximates the unknown system based on input-output data. The methodology gradually builds the RBF network model based on two separate levels of adaptation: on the first level, the structure of the hidden layer is modified by adding or deleting RBF centers, while on the second level, the synaptic weights are adjusted with the recursive least squares with exponential forgetting algorithm. The proposed approach is tested on two different systems, namely a simulated nonlinear DC motor and a real industrial reactor. The results show that the produced soft-sensors can be successfully applied to model the two nonlinear systems. A comparison with two different adaptive modeling techniques, namely a dynamic evolving neural-fuzzy inference system (DENFIS) and neural networks trained with online backpropagation, highlights the advantages of the proposed methodology.
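    The second adaptation level described above, recursive least squares with exponential forgetting over a Gaussian RBF layer, can be sketched as follows. This is a generic illustration: the centers here are fixed on a grid rather than placed by the fuzzy means algorithm, and all data are a synthetic toy problem.

```python
import numpy as np

def rbf_features(X, centers, width):
    """Gaussian activations of each center for each input row."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width**2))

class RLS:
    """Recursive least squares with exponential forgetting factor lam."""
    def __init__(self, dim, lam=0.99, delta=100.0):
        self.w = np.zeros(dim)
        self.P = np.eye(dim) * delta
        self.lam = lam

    def update(self, phi, y):
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)   # gain vector
        self.w += k * (y - phi @ self.w)     # correct by prediction error
        self.P = (self.P - np.outer(k, Pphi)) / self.lam

# Toy usage: learn y = sin(x) online from streaming samples
rng = np.random.default_rng(0)
centers = np.linspace(-3, 3, 9).reshape(-1, 1)
model = RLS(dim=9)
for _ in range(2000):
    x = rng.uniform(-3, 3, size=(1, 1))
    phi = rbf_features(x, centers, width=0.8)[0]
    model.update(phi, np.sin(x[0, 0]))

phi_test = rbf_features(np.array([[1.0]]), centers, width=0.8)[0]
print(phi_test @ model.w)  # should approach sin(1.0) ≈ 0.84
```

The forgetting factor lam < 1 discounts old samples exponentially, which is what lets such a soft-sensor track slow drift in the underlying process.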

  11. Paratransgenic Control of Vector Borne Diseases

    PubMed Central

    Hurwitz, Ivy; Fieck, Annabeth; Read, Amber; Hillesland, Heidi; Klein, Nichole; Kang, Angray; Durvasula, Ravi

    2011-01-01

    Conventional methodologies to control vector-borne diseases with chemical pesticides are often associated with environmental toxicity, adverse effects on human health and the emergence of insect resistance. In the paratransgenic strategy, symbiotic or commensal microbes of host insects are transformed to express gene products that interfere with pathogen transmission. These genetically altered microbes are re-introduced into the insect, where expression of the engineered molecules decreases the host's ability to transmit the pathogen. We have successfully utilized this strategy to reduce carriage rates of Trypanosoma cruzi, the causative agent of Chagas disease, in the triatomine bug, Rhodnius prolixus, and are currently developing this methodology to control the transmission of Leishmania donovani by the sand fly Phlebotomus argentipes. Several effector molecules, including antimicrobial peptides and highly specific single-chain antibodies, are currently being explored for their anti-parasite activities in these two systems. In preparation for eventual field use, we are actively engaged in risk assessment studies addressing the issue of horizontal gene transfer from the modified bacteria to environmental microbes. PMID:22110385

  12. Simulation of olive grove gross primary production by the combination of ground and multi-sensor satellite data

    NASA Astrophysics Data System (ADS)

    Brilli, L.; Chiesi, M.; Maselli, F.; Moriondo, M.; Gioli, B.; Toscano, P.; Zaldei, A.; Bindi, M.

    2013-08-01

    We developed and tested a methodology to estimate olive (Olea europaea L.) gross primary production (GPP) combining ground and multi-sensor satellite data. An eddy-covariance station placed in an olive grove in central Italy provided carbon and water fluxes over two years (2010-2011), which were used as the reference to evaluate the performance of a GPP estimation methodology based on a Monteith-type model (modified C-Fix) driven by meteorological and satellite (NDVI) data. A major issue was the consideration of the two main olive grove components, i.e. olive trees and inter-tree ground vegetation: this was addressed by separately simulating carbon fluxes within the two ecosystem layers, followed by their recombination. In this way the eddy-covariance GPP measurements were successfully reproduced, with the exception of two periods that followed tillage operations. For these periods, measured GPP could be approximated by considering synthetic NDVI values that simulated the expected response of inter-tree ground vegetation to tillage.

  13. Adopted Methodology for Cool-Down of SST-1 Superconducting Magnet System: Operational Experience with the Helium Refrigerator

    NASA Astrophysics Data System (ADS)

    Sahu, A. K.; Sarkar, B.; Panchal, P.; Tank, J.; Bhattacharya, R.; Panchal, R.; Tanna, V. L.; Patel, R.; Shukla, P.; Patel, J. C.; Singh, M.; Sonara, D.; Sharma, R.; Duggar, R.; Saxena, Y. C.

    2008-03-01

    The 1.3 kW at 4.5 K helium refrigerator/liquefier (HRL) was commissioned during the year 2003 and has been operated in its different modes as per the functional requirements of the experiments. The superconducting magnet system (SCMS) of SST-1 was successfully cooled down to 4.5 K. The actual loads differed from the originally predicted boundary conditions, and an adjustment in the thermodynamic balance of the refrigerator was necessary. This led to an enhanced capacity, which was achieved without any additional hardware. The control system for the HRL was tuned to achieve a stable thermodynamic balance while keeping the turbines' operating parameters at optimized conditions. An extra mass flow rate requirement was met by exploiting the margin available with the compressor station. The methodology adopted to modify the capacity of the HRL, the safety precautions, and the experience of the SCMS cool-down to 4.5 K are discussed.

  14. Direct determination of Pb in raw milk by graphite furnace atomic absorption spectrometry (GF AAS) with electrothermal atomization sampling from slurries.

    PubMed

    de Oliveira, Tatiane Milão; Augusto Peres, Jayme; Lurdes Felsner, Maria; Cristiane Justi, Karin

    2017-08-15

    Milk is an important food in the human diet due to its physico-chemical composition; it is therefore necessary to monitor contamination by toxic metals such as Pb. Milk sample slurries were prepared using Triton X-100 and nitric acid for direct analysis of Pb by graphite furnace atomic absorption spectrometry (GF AAS). After dilution of the slurries, 10.00 µL were introduced directly into the pyrolytic graphite tube without the use of a chemical modifier, which is an advantage for this type of matrix. The limits of detection and quantification were 0.64 and 2.14 µg L-1, respectively. The figures of merit studied showed that the proposed methodology, without pretreatment of the raw milk sample and using external standard calibration, is suitable. The methodology was applied to milk samples from the Guarapuava region, in Paraná State (Brazil), and Pb concentrations ranged from 2.12 to 37.36 µg L-1. Copyright © 2017 Elsevier Ltd. All rights reserved.
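    For context on the detection and quantification limits quoted above, the common IUPAC-style estimates are LOD = 3.3·s/m and LOQ = 10·s/m, where s is the standard deviation of blank measurements and m is the calibration slope. The abstract does not state how its limits were derived, and the numeric inputs below are invented for illustration only.

```python
def detection_limits(blank_sd, slope):
    """Common 3.3*s/m and 10*s/m estimates of LOD and LOQ (assumed, not
    necessarily the paper's procedure)."""
    lod = 3.3 * blank_sd / slope
    loq = 10.0 * blank_sd / slope
    return lod, loq

# Made-up blank standard deviation and calibration slope
lod, loq = detection_limits(blank_sd=0.0021, slope=0.0105)
print(f"LOD = {lod:.2f} µg/L, LOQ = {loq:.2f} µg/L")
```

Note that LOQ/LOD is fixed at 10/3.3 ≈ 3.0 under this convention, close to the 2.14/0.64 ≈ 3.3 ratio reported in the abstract.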

  15. Repositioning the substrate activity screening (SAS) approach as a fragment-based method for identification of weak binders.

    PubMed

    Gladysz, Rafaela; Cleenewerck, Matthias; Joossens, Jurgen; Lambeir, Anne-Marie; Augustyns, Koen; Van der Veken, Pieter

    2014-10-13

    Fragment-based drug discovery (FBDD) has evolved into an established approach for "hit" identification. Typically, most applications of FBDD depend on specialised cost- and time-intensive biophysical techniques. The substrate activity screening (SAS) approach has been proposed as a relatively cheap and straightforward alternative for identification of fragments for enzyme inhibitors. We have investigated SAS for the discovery of inhibitors of oncology target urokinase (uPA). Although our results support the key hypotheses of SAS, we also encountered a number of unreported limitations. In response, we propose an efficient modified methodology: "MSAS" (modified substrate activity screening). MSAS circumvents the limitations of SAS and broadens its scope by providing additional fragments and more coherent SAR data. As well as presenting and validating MSAS, this study expands existing SAR knowledge for the S1 pocket of uPA and reports new reversible and irreversible uPA inhibitor scaffolds. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Consistency Between SC#21REF Solar XUV Energy Input and the 1973 Pioneer 10 Observations of the Jovian Photoelectron Excited H2 Airglow

    NASA Technical Reports Server (NTRS)

    Gangopadhyay, P.; Ogawa, H. S.; Judge, D. L.

    1988-01-01

    It has been suggested in the literature that the F74113 solar spectrum for the solar minimum condition needs to be modified to explain the production of photoelectrons in the Earth's atmosphere. We have studied here the effect of another solar minimum spectrum, SC#21REF, on the Jovian upper atmosphere emissions, and we have compared the predicted photoelectron-excited H2 airglow with the 1973 Pioneer 10 observations, analyzed according to the methodology of Shemansky and Judge (1988). In this model calculation we find that in 1973 the Jovian H2 band emissions can be accounted for almost entirely by photoelectron excitation, if the preflight calibration of the Pioneer 10 ultraviolet photometer is adopted. If the SC#21REF flux shortward of 250 Å is multiplied by 2, as proposed by Richards and Torr (1988), then the Pioneer 10 calibration and/or the airglow model used must be modified in order to have a self-consistent set of observations.

  17. Self-assembled monolayers of alendronate on Ti6Al4V alloy surfaces enhance osteogenesis in mesenchymal stem cells

    NASA Astrophysics Data System (ADS)

    Rojo, Luis; Gharibi, Borzo; McLister, Robert; Meenan, Brian J.; Deb, Sanjukta

    2016-07-01

    Phosphonates have emerged as an alternative for the functionalization of titanium surfaces through the formation of homogeneous self-assembled monolayers (SAMs) via Ti-O-P linkages. This study presents results from an investigation of the modification of Ti6Al4V alloy by chemisorption of osseoinductive alendronate using a simple, effective and clean methodology. The modified surfaces showed a tailored topography and surface chemistry as determined by SEM and Raman spectroscopy. X-ray photoelectron spectroscopy revealed that an effective mode of bonding is created between the metal oxide surface and the phosphate residue of alendronate, leading to a homogeneous drug distribution along the surface. In vitro studies showed that alendronate SAMs induce differentiation of hMSCs to a bone cell phenotype and promote bone formation on modified surfaces. Here we show that this novel method for the preparation of functional coatings on titanium-based medical devices provides osseoinductive bioactive molecules to promote enhanced integration at the site of implantation.

  18. A modified operational sequence methodology for zoo exhibit design and renovation: conceptualizing animals, staff, and visitors as interdependent coworkers.

    PubMed

    Kelling, Nicholas J; Gaalema, Diann E; Kelling, Angela S

    2014-01-01

    Human factors analyses have been used to improve efficiency and safety in various work environments. Although generally limited to humans, the universality of these analyses allows for their formal application to a much broader domain. This paper outlines a model for the use of human factors to enhance zoo exhibits and optimize spaces for all user groups: zoo animals, zoo visitors, and zoo staff members. Zoo exhibits are multi-faceted, and each user group has a distinct set of requirements that can clash with or complement each other. Careful analysis, and a reframing of the three groups as interdependent coworkers, can enhance safety, efficiency, and experience for all user groups. This paper details the general construction and specific examples of the use of the modified human factors tools of function allocation, operational sequence diagrams, and needs assessment. These tools allow for adaptability and ease of understanding in the design or renovation of exhibits. © 2014 Wiley Periodicals, Inc.

  19. A novel allosteric mechanism in the cysteine peptidase cathepsin K discovered by computational methods

    NASA Astrophysics Data System (ADS)

    Novinec, Marko; Korenč, Matevž; Caflisch, Amedeo; Ranganathan, Rama; Lenarčič, Brigita; Baici, Antonio

    2014-02-01

    Allosteric modifiers have the potential to fine-tune enzyme activity. Therefore, targeting allosteric sites is gaining increasing recognition as a strategy in drug design. Here we report the use of computational methods for the discovery of the first small-molecule allosteric inhibitor of the collagenolytic cysteine peptidase cathepsin K, a major target for the treatment of osteoporosis. The molecule NSC13345 is identified by high-throughput docking of compound libraries to surface sites on the peptidase that are connected to the active site by an evolutionarily conserved network of residues (protein sector). The crystal structure of the complex shows that NSC13345 binds to a novel allosteric site on cathepsin K. The compound acts as a hyperbolic mixed modifier in the presence of a synthetic substrate, it completely inhibits collagen degradation and has good selectivity for cathepsin K over related enzymes. Altogether, these properties qualify our methodology and NSC13345 as promising candidates for allosteric drug design.

  20. Self-assembled monolayers of alendronate on Ti6Al4V alloy surfaces enhance osteogenesis in mesenchymal stem cells

    PubMed Central

    Rojo, Luis; Gharibi, Borzo; McLister, Robert; Meenan, Brian J.; Deb, Sanjukta

    2016-01-01

    Phosphonates have emerged as an alternative for the functionalization of titanium surfaces through the formation of homogeneous self-assembled monolayers (SAMs) via Ti-O-P linkages. This study presents results from an investigation of the modification of Ti6Al4V alloy by chemisorption of osseoinductive alendronate using a simple, effective and clean methodology. The modified surfaces showed a tailored topography and surface chemistry as determined by SEM and Raman spectroscopy. X-ray photoelectron spectroscopy revealed that an effective mode of bonding is created between the metal oxide surface and the phosphate residue of alendronate, leading to a homogeneous drug distribution along the surface. In vitro studies showed that alendronate SAMs induce differentiation of hMSCs to a bone cell phenotype and promote bone formation on modified surfaces. Here we show that this novel method for the preparation of functional coatings on titanium-based medical devices provides osseoinductive bioactive molecules to promote enhanced integration at the site of implantation. PMID:27468811

  1. A Meta-Analysis of Effects of Bt Crops on Honey Bees (Hymenoptera: Apidae)

    PubMed Central

    Duan, Jian J.; Marvier, Michelle; Huesing, Joseph; Dively, Galen; Huang, Zachary Y.

    2008-01-01

    Background Honey bees (Apis mellifera L.) are the most important pollinators of many agricultural crops worldwide and are a key test species used in the tiered safety assessment of genetically engineered insect-resistant crops. There is concern that widespread planting of these transgenic crops could harm honey bee populations. Methodology/Principal Findings We conducted a meta-analysis of 25 studies that independently assessed potential effects of Bt Cry proteins on honey bee survival (or mortality). Our results show that Bt Cry proteins used in genetically modified crops commercialized for control of lepidopteran and coleopteran pests do not negatively affect the survival of either honey bee larvae or adults in laboratory settings. Conclusions/Significance Although the additional stresses that honey bees face in the field could, in principle, modify their susceptibility to Cry proteins or lead to indirect effects, our findings support safety assessments that have not detected any direct negative effects of Bt crops for this vital insect pollinator. PMID:18183296
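    The generic core of a meta-analysis like the one summarized above is inverse-variance pooling of per-study effect sizes. The sketch below shows a fixed-effect pooled estimate; it is not the paper's specific effect-size model, and all numbers are invented.

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect (inverse-variance weighted) pooled estimate and its
    standard error."""
    weights = [1.0 / v for v in variances]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return est, se

# Invented per-study effect sizes and variances
effects = [0.05, -0.02, 0.01, 0.03]
variances = [0.01, 0.02, 0.015, 0.012]
est, se = pooled_effect(effects, variances)
print(f"pooled effect = {est:.3f} ± {1.96 * se:.3f} (95% CI half-width)")
```

A pooled confidence interval that straddles zero, as in this toy example, is the kind of evidence behind a "no detectable effect" conclusion.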

  2. Chemically modified graphene/polyimide composite films based on utilization of covalent bonding and oriented distribution.

    PubMed

    Huang, Ting; Lu, Renguo; Su, Chao; Wang, Hongna; Guo, Zheng; Liu, Pei; Huang, Zhongyuan; Chen, Haiming; Li, Tongsheng

    2012-05-01

    Herein, we have developed a rather simple composite fabrication approach to achieving molecular-level dispersion and planar orientation of chemically modified graphene (CMG) in the thermosetting polyimide (PI) matrix as well as realizing strong adhesion at the interfacial regions between reinforcing filler and matrix. The covalent adhesion of CMG to PI matrix and oriented distribution of CMG were carefully confirmed and analyzed by detailed investigations. Combination of covalent bonding and oriented distribution could enlarge the effectiveness of CMG in the matrix. Efficient stress transfer was found at the CMG/PI interfaces. Significant improvements in the mechanical performances, thermal stability, electrical conductivity, and hydrophobic behavior were achieved by addition of only a small amount of CMG. Furthermore, it is noteworthy that the hydrophilic-to-hydrophobic transition and the electrical percolation were observed at only 0.2 wt % CMG in this composite system. This facile methodology is believed to afford broad application potential in graphene-based polymer nanocomposites, especially other types of high-performance thermosetting systems.

  3. Increasing the realism of a laparoscopic box trainer: a simple, inexpensive method.

    PubMed

    Hull, Louise; Kassab, Eva; Arora, Sonal; Kneebone, Roger

    2010-01-01

    Simulation-based training in medical education is increasing. Realism is an integral element of creating an engaging, effective training environment. Although physical trainers offer a low-cost alternative to expensive virtual reality (VR) simulators, many lack realism. The aim of this research was to enhance the realism of a laparoscopic box trainer using a simple, inexpensive method. Digital images of the abdominal cavity were captured from a VR simulator and printed onto laminated card that lined the bottom and sides of the box-trainer cavity. The standard black neoprene material that encloses the abdominal cavity was replaced with a skin-colored silicone model. The realism of the modified box trainer was assessed by surgeons using quantitative and qualitative methodologies. Results suggest that the modified box trainer was more realistic than a standard box trainer alone. Incorporating this technique into the training of laparoscopic skills is an inexpensive means of emulating surgical reality that may enhance the engagement of the learner in simulation.

  4. Mesquite Gum as a Novel Reducing and Stabilizing Agent for Modified Tollens Synthesis of Highly Concentrated Ag Nanoparticles

    PubMed Central

    Moreno-Trejo, Maira Berenice; Sánchez-Domínguez, Margarita

    2016-01-01

    The synthesis described in this study prepares silver nanoparticles of sizes ranging from 10 nm to 30 nm with a defined globular shape, as confirmed by UV-vis, SEM, STEM and DLS analysis. This simple and favorable one-step modified Tollens reaction requires no special equipment and no stabilizing or reducing agent other than a solution of purified mesquite gum, and it produces aqueous colloidal dispersions of silver nanoparticles with a stability that exceeds three months, a relatively narrow size distribution, a low tendency to aggregate and a yield of at least 95% in all cases. Reaction times are between 15 min and 60 min to obtain silver nanoparticles in concentrations ranging from 0.1 g to 3 g of Ag per 100 g of reaction mixture. The proposed synthetic method presents high potential for scale-up, since its production capacity is rather high and the methodology is simple. PMID:28773938

  5. Modifying Photovoice for community-based participatory Indigenous research.

    PubMed

    Castleden, Heather; Garvin, Theresa

    2008-03-01

    Scientific research occurs within a set of socio-political conditions, and in Canada research involving Indigenous communities has a historical association with colonialism. Consequently, Indigenous peoples have been justifiably sceptical and reluctant to become the subjects of academic research. Community-Based Participatory Research (CBPR) is an attempt to develop culturally relevant research models that address issues of injustice, inequality, and exploitation. The work reported here evaluates the use of Photovoice, a CBPR method that uses participant-employed photography and dialogue to create social change, which was employed in a research partnership with a First Nation in Western Canada. Content analysis of semi-structured interviews (n=45) evaluated participants' perspectives of the Photovoice process as part of a larger study on health and environment issues. The analysis revealed that Photovoice effectively balanced power, created a sense of ownership, fostered trust, built capacity, and responded to cultural preferences. The authors discuss the necessity of modifying Photovoice, by building in an iterative process, as being key to the methodological success of the project.

  6. Scanpath-based analysis of objects conspicuity in context of human vision physiology.

    PubMed

    Augustyniak, Piotr

    2007-01-01

    This paper discusses principal aspects of object conspicuity investigated with the use of an eye tracker and interpreted against the background of human vision physiology. Proper management of object conspicuity is fundamental in several leading-edge applications in the information society, such as advertising, web design, man-machine interfacing and ergonomics. Although some common rules of human perception have been applied in art for centuries, interest in the human perception process is today motivated by the need to capture and maintain the recipient's attention by putting selected messages in front of the others. Our research uses the visual-task methodology and series of progressively modified natural images. The modified details were characterized by their size, color and position, while the scanpath-derived gaze points confirmed, or not, the act of perception. The statistical analysis yielded the probability of detail perception and its correlations with the attributes. This probability conforms to knowledge about retinal anatomy and perception physiology, although we use noninvasive methods only.

  7. Optimization of Culture Medium Enhances Viable Biomass Production and Biocontrol Efficacy of the Antagonistic Yeast, Candida diversa.

    PubMed

    Liu, Jia; Li, Guangkun; Sui, Yuan

    2017-01-01

    Viable biomass production is a key determinant of the suitability of antagonistic yeasts as potential biocontrol agents. This study investigated the effects of three metal ions (magnesium, ferrous, and zinc) on biomass production and viability of the antagonistic yeast, Candida diversa. Using response surface methodology to optimize medium components, a maximum biomass was obtained when the collective Mg2+, Fe2+, and Zn2+ concentrations were adjusted in a minimal mineral (MM) medium. Compared with the unmodified MM and three ion-deficient MM media, yeast cells cultured in the three-ion-modified MM medium exhibited a lower level of cellular oxidative damage and a higher level of antioxidant enzyme activity. A biocontrol assay indicated that C. diversa grown in the ion-modified MM exhibited the greatest level of control of gray mold on apple fruit. These results provide new information on culture medium optimization to grow yeast antagonists in order to improve biomass production and biocontrol efficacy.
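    The response-surface step mentioned above boils down to fitting a second-order polynomial to (factor, response) data and locating its stationary point. The single-factor sketch below illustrates that generic idea; the concentrations and biomass values are invented and do not come from the study.

```python
import numpy as np

def fit_quadratic(x, y):
    """Least-squares fit of y = b0 + b1*x + b2*x**2 for a single factor."""
    X = np.column_stack([np.ones_like(x), x, x**2])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def optimum(coeffs):
    """Stationary point of the fitted parabola (a maximum when b2 < 0)."""
    b0, b1, b2 = coeffs
    return -b1 / (2 * b2)

# Invented single-factor example: biomass peaks near concentration 2.0
conc = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
biomass = 10 - (conc - 2.0) ** 2   # noiseless toy response
b = fit_quadratic(conc, biomass)
print(f"optimal concentration ≈ {optimum(b):.2f}")
```

Real RSM designs fit a multi-factor quadratic with interaction terms (e.g. a central composite design over Mg2+, Fe2+ and Zn2+), but the fit-then-optimize logic is the same.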

  8. Defining and Enabling Resiliency of Electric Distribution Systems With Multiple Microgrids

    DOE PAGES

    Chanda, Sayonsom; Srivastava, Anurag K.

    2016-05-02

    This paper presents a method for quantifying and enabling the resiliency of a power distribution system (PDS) using the analytical hierarchy process and percolation theory. Using this metric, quantitative analysis can be done to assess the impact of possible control decisions and pro-actively enable the resilient operation of a distribution system with multiple microgrids and other resources. The developed resiliency metric can also be used in short-term distribution system planning. The ability to quantify resiliency can help distribution system planning engineers and operators to justify control actions, compare different reconfiguration algorithms, and develop proactive control actions to avert power system outages due to impending catastrophic weather or other adverse events. Validation of the proposed method is done using modified CERTS microgrids and a modified industrial distribution system. Simulation results show a topological metric and a composite metric considering power system characteristics to quantify the resiliency of a distribution system with the proposed methodology, and improvements in resiliency using a two-stage reconfiguration algorithm and multiple microgrids.

  9. Modified Distribution-Free Goodness-of-Fit Test Statistic.

    PubMed

    Chun, So Yeon; Browne, Michael W; Shapiro, Alexander

    2018-03-01

    Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.

  10. [Evaluation of our psycho-educative program by participating caregivers].

    PubMed

    Bier, J C; Van den Berge, D; de Wouters d'Oplinter, N; Bosman, N; Fery, P

    2010-09-01

    Facing difficulties due to dementia syndromes, systemic care is necessary. Amongst therapies assessed specifically to caregivers, psychoeducative steps seem to be the strongest effective one on neuropsychiatrics symptoms. Psychoeducations tend to teach the caregivers to modify their interactions with patients via a better understanding of illnesses and patients. Our training "Pour mieux vivre avec la maladie d'Alzheimer", applied in groups of eight to twelve persons, consists in twelve sessions of two hours each. To assure the biggest possible availability, we recently incorporated the concomitant coverage of patients into artistic workshops. These sessions of art-therapy realized in parallel to our psychoeducative program will thus be estimated according to the same rigorous methodology. The critical evaluations realized by participants at the end of our program reflect the outcome of our main objective (to teach to modify interactions with the patients) while contributing to the improvement of social contacts and to the learning of calling to existing helps. These preliminary results strongly argue for the pursuit and even extension of this kind of caregiver's management.

  11. Optimization of Culture Medium Enhances Viable Biomass Production and Biocontrol Efficacy of the Antagonistic Yeast, Candida diversa

    PubMed Central

    Liu, Jia; Li, Guangkun; Sui, Yuan

    2017-01-01

    Viable biomass production is a key determinant of the suitability of antagonistic yeasts as potential biocontrol agents. This study investigated the effects of three metal ions (magnesium, ferrous iron, and zinc) on biomass production and viability of the antagonistic yeast, Candida diversa. Using response surface methodology to optimize medium components, maximum biomass was obtained when the collective Mg2+, Fe2+, and Zn2+ concentrations were adjusted in a minimal mineral (MM) medium. Compared with the unmodified MM medium and three ion-deficient MM media, yeast cells cultured in the three-ion-modified MM medium exhibited a lower level of cellular oxidative damage and a higher level of antioxidant enzyme activity. A biocontrol assay indicated that C. diversa grown in the ion-modified MM exhibited the greatest level of control of gray mold on apple fruit. These results provide new information on optimizing culture media for growing yeast antagonists in order to improve biomass production and biocontrol efficacy. PMID:29089939
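    The core of response surface methodology is fitting a low-order polynomial to designed-experiment data and solving for the stationary point. A minimal one-factor sketch: fit a quadratic of biomass versus one ion concentration and take the vertex as the optimum. The real study optimized Mg2+, Fe2+, and Zn2+ jointly; the concentrations and yields below are fabricated for illustration.

```python
# One-factor response-surface sketch with invented data: fit
# biomass = a*c^2 + b*c + d and report the vertex -b/(2a).
import numpy as np

conc = np.array([0.0, 0.5, 1.0, 1.5, 2.0])     # mM Mg2+ (hypothetical)
biomass = np.array([2.1, 3.4, 4.0, 3.8, 2.9])  # g/L (hypothetical)

a, b, d = np.polyfit(conc, biomass, 2)  # least-squares quadratic fit
optimum = -b / (2 * a)                  # stationary point of the fitted surface
print(round(optimum, 2))                # concentration maximizing predicted biomass
```

    With several factors, the same idea extends to a multivariate quadratic fitted over a central composite or Box-Behnken design, with the optimum found by setting the gradient to zero.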

  12. Phase-assisted synthesis and DNA unpacking evaluation of biologically inspired metallo nanocomplexes using peptide as unique building block.

    PubMed

    Raman, N; Sudharsan, S

    2011-12-01

    The goal of modifying a nanomaterial's surface with a biomaterial is to preserve the material's bulk properties while tailoring only its surface for the desired recognition and specificity. Here, we have developed a phase-assisted, modified Brust-Schiffrin synthesis of metallo nanocomplexes anchored by a peptide, N,N'-(1,3-propylene)-bis-hippuricamide. Spectral, thermal, and morphological characterizations confirm the formation of the nanocomplexes. The therapeutic behavior of all the nanocomplexes was assessed by evaluating their DNA-unpacking ability. In addition, we demonstrate their biological activity against several bacterial and fungal strains. The in vitro antimicrobial investigation shows that all the nanocomplexes efficiently disrupt microbial cell walls/membranes and inhibit microbial growth. Nanocomplexes of this kind can be synthesized in large quantities at low cost, offer versatile biomedical applications, and could be used to treat various diseases that often cause high mortality. Copyright © 2011 Elsevier Inc. All rights reserved.

  13. Evaluation of Iodine Bioavailability in Seaweed Using in Vitro Methods.

    PubMed

    Domínguez-González, M Raquel; Chiocchetti, Gabriela M; Herbello-Hermelo, Paloma; Vélez, Dinoraz; Devesa, Vicenta; Bermejo-Barrera, Pilar

    2017-09-27

    Due to the high levels of iodine present in seaweed, ingestion of a large amount of this type of food can produce an excessive intake of iodine. However, after ingestion food undergoes various chemical and physical processes that can modify the amount of iodine that reaches the systemic circulation (bioavailability). Studies on the bioavailability of iodine from food are scarce and indicate that the bioavailable amount is generally lower than the amount ingested. The in vitro bioavailability of iodine from different commercialized seaweeds has been estimated using several in vitro approaches (solubility, dialyzability, and transport and uptake by intestinal cells). Results indicate that iodine is available for absorption after gastrointestinal digestion (bioaccessibility: 49-82%), with kombu showing the highest bioaccessibility. Incorporating dialysis or cell cultures to estimate bioavailability modifies the estimate of the amount of iodine that may reach the systemic circulation (dialysis, 5-28%; cell culture, ≤3%). The paper discusses the advantages and drawbacks of these methodologies for assessing iodine bioavailability in seaweed.

  14. Geologic map of the Zarkashan-Anguri copper and gold deposits, Ghazni Province, Afghanistan, modified from the 1968 original map compilation of E.P. Meshcheryakov and V.P. Sayapin

    USGS Publications Warehouse

    Peters, Stephen G.; Stettner, Will R.; Masonic, Linda M.; Moran, Thomas W.

    2011-01-01

    This map is a modified version of Geological map of the area of Zarkashan-Anguri gold deposits, scale 1:50,000, which was compiled by E.P. Meshcheryakov and V.P. Sayapin in 1968. Scientists from the U.S. Geological Survey, in cooperation with the Afghan Geological Survey and the Task Force for Business and Stability Operations of the U.S. Department of Defense, studied the original document and related reports and also visited the field area in April 2010. This modified map, which includes a cross section, illustrates the geologic setting of the Zarkashan-Anguri copper and gold deposits. The map reproduces the topology (contacts, faults, and so forth) of the original Soviet map and cross section and includes modifications based on our examination of that and other documents, and based on observations made and sampling undertaken during our field visit. (Refer to the Introduction and the References in the Map PDF for an explanation of our methodology and for complete citations of the original map and related reports.) Elevations on the cross section are derived from the original Soviet topography and may not match the newer topography used on the current map.

  15. Genetic variation as a modifier of association between therapeutic exposure and subsequent malignant neoplasms in cancer survivors.

    PubMed

    Bhatia, Smita

    2015-03-01

    Subsequent malignant neoplasms (SMNs) are associated with significant morbidity and are a major cause of premature mortality among cancer survivors. Several large studies have demonstrated a strong association between the radiation and/or chemotherapy used to treat primary cancer and the risk of developing SMNs. However, for any given therapeutic exposure, the risk of developing an SMN varies between individuals. Genomic variation can potentially modify the association between therapeutic exposures and SMN risk and may explain the observed interindividual variability. In this review, the author provides a brief overview of the current knowledge regarding the role of genomic variation in the development of therapy-related SMNs and discusses the methodological challenges in undertaking an endeavor to develop a deeper understanding of the molecular underpinnings of therapy-related SMNs, such as an appropriate study design, the identification of an adequately sized study population together with a reliable plan for collecting and maintaining high-quality DNA, clinical validation of the phenotype, and the selection of an appropriate approach or platform for genotyping. Understanding the factors that can modify the risk of treatment-related SMNs is critical to developing targeted intervention strategies and optimizing risk-based health care for cancer survivors. © 2014 American Cancer Society.

  16. Molecular dynamics study of some non-hydrogen-bonding base pair DNA strands

    NASA Astrophysics Data System (ADS)

    Tiwari, Rakesh K.; Ojha, Rajendra P.; Tiwari, Gargi; Pandey, Vishnudatt; Mall, Vijaysree

    2018-05-01

    To elucidate the structural behavior of hydrophobically modified DNA, the DMMO2-D5SICS base pair was introduced as a constituent in different sets of 12-mer and 14-mer DNA sequences for molecular dynamics (MD) simulation in explicit water solvent. The AMBER 14 force field was employed for each duplex during 200 ns production-dynamics simulations in an orthogonal water box, using the particle-mesh Ewald (PME) method under periodic boundary conditions (PBC), to determine the conformational parameters of the complex. Force-field parameters of the modified base pair were calculated with Gaussian using ab initio Hartree-Fock methodology. RMSD results reveal that the conformation of the duplex is sequence dependent and that the binding energy of the complex depends on the position of the modified base pair in the nucleic acid strand. We found that non-bonding energy made a significant contribution to stabilizing this type of duplex, compared with electrostatic energy. The distortion produced within the strands by this base pair was local and destabilized duplex integrity near the substitution. Moreover, the binding energy of the duplex depends on the position of the hydrophobic base-pair substitution and on the DNA sequence, in strong agreement with the corresponding experimental study.

  17. Joint Effects of Granule Size and Degree of Substitution on Octenylsuccinated Sweet Potato Starch Granules As Pickering Emulsion Stabilizers.

    PubMed

    Li, Jinfeng; Ye, Fayin; Lei, Lin; Zhou, Yun; Zhao, Guohua

    2018-05-02

    The granules of sweet potato starch were size-fractionated into three portions with significantly different median diameters (D50) of 6.67 (small-sized), 11.54 (medium-sized), and 16.96 μm (large-sized). Each portion was hydrophobized at mass-based degrees of substitution (DSm) of approximately 0.0095 (low), 0.0160 (medium), and 0.0230 (high). The Pickering emulsion-stabilizing capacities of the modified granules were tested, and the resultant emulsions were characterized. The joint effects of granule size and DSm on emulsifying capacity were investigated by response surface methodology. For the small-, medium-, and large-sized fractions, the highest emulsifying capacities were comparable but occurred at high (0.0225), medium (0.0158), and low (0.0095) DSm levels, respectively. The emulsion droplet size increased with granule size, and the number of freely scattered granules in the emulsions decreased with DSm. In addition, the surface density of the octenyl succinic group (SD-OSG) was proposed for the first time for modified starch granules and proved better than DSm for interpreting the emulsifying capacities of starch granules of varying sizes. These results imply that, as particulate stabilizers, the optimal DSm of modified starch granules is size-specific.

  18. Electrochemical growth of CoNi and Pt-CoNi soft magnetic composites on an alkanethiol monolayer-modified ITO substrate.

    PubMed

    Escalera-López, D; Gómez, E; Vallés, E

    2015-07-07

    CoNi and Pt-CoNi magnetic layers on indium-tin oxide (ITO) substrates modified by an alkanethiol self-assembled monolayer (SAM) have been electrochemically obtained as an initial stage to prepare semiconducting layer-SAM-magnetic layer hybrid structures. The best conditions to obtain the maximum compactness of adsorbed layers of dodecanethiol (C12-SH) on ITO substrate have been studied using contact angle, AFM, XPS and electrochemical tests. The electrochemical characterization (electrochemical probe or voltammetric response in blank solutions) is fundamental to ensure the maximum blocking of the substrate. Although the electrodeposition process on the SAM-modified ITO substrate is very slow if the blocking of the surface is significant, non-cracked metallic layers of CoNi, with or without a previously electrodeposited seed-layer of platinum, have been obtained by optimizing the deposition potentials. Initial nucleation is expected to take place at the pinhole defects of the C12-SH SAM, followed by a mushroom-like growth regime through the SAM interface that allows the formation of a continuous metallic layer electrically connected to the ITO surface. Due to the potential of the methodology, the preparation of patterned metallic deposits on ITO substrate using SAMs with different coverage as templates is feasible.

  19. Electrochemical DNA biosensor based on poly(2,6-pyridinedicarboxylic acid) modified glassy carbon electrode for the determination of anticancer drug gemcitabine.

    PubMed

    Tığ, Gözde Aydoğdu; Zeybek, Bülent; Pekyardımcı, Şule

    2016-07-01

    In this study, a simple methodology was used to develop a new electrochemical DNA biosensor based on a poly(2,6-pyridinedicarboxylic acid) (P(PDCA))-modified glassy carbon electrode (GCE). This modified electrode was used to monitor the electrochemical interaction between dsDNA and gemcitabine (GEM) for the first time. The decrease in the oxidation signal of guanine after the interaction of the dsDNA with GEM was used as an indicator for the selective determination of GEM via differential pulse voltammetry (DPV). The guanine oxidation peak currents were linearly proportional to GEM concentration in the range of 1-30 mg L(-1). The limit of detection (LOD) and limit of quantification (LOQ) were found to be 0.276 mg L(-1) and 0.922 mg L(-1), respectively. The reproducibility, repeatability, and applicability of the analysis to pharmaceutical dosage forms and human serum samples were also examined. In addition to the DPV method, UV-vis and viscosity measurements were utilized to propose the interaction mechanism between GEM and the dsDNA. The novel DNA biosensor could serve for the sensitive, accurate, and rapid determination of GEM. Copyright © 2016 Elsevier B.V. All rights reserved.
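    LOD and LOQ figures of this kind are commonly derived from the calibration line using the standard ICH convention, LOD = 3.3σ/slope and LOQ = 10σ/slope, where σ is the residual standard deviation of the calibration. The abstract does not state which convention the authors used, and the calibration data below are invented for illustration.

```python
# LOD/LOQ from a linear calibration (ICH 3.3/10 convention), with
# hypothetical concentration-vs-peak-current data.
import numpy as np

conc = np.array([1.0, 5.0, 10.0, 20.0, 30.0])    # mg/L (hypothetical)
current = np.array([0.9, 4.8, 10.3, 19.6, 30.1])  # peak current, µA (hypothetical)

slope, intercept = np.polyfit(conc, current, 1)
resid = current - (slope * conc + intercept)
sigma = np.sqrt(np.sum(resid**2) / (len(conc) - 2))  # residual SD, 2 dof lost to the fit

lod = 3.3 * sigma / slope
loq = 10 * sigma / slope
print(lod < loq)  # LOQ is always ~3x the LOD under this convention
```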

  20. New screening methodology for selection of polymeric materials for transdermal drug delivery devices

    NASA Astrophysics Data System (ADS)

    Falcone, Roberto P.

    As medical advances extend the human lifespan, the prevalence of chronic illness will increase, straining health care systems; as a result, governments will need to balance expenses without upsetting national budgets. The selection of a precise and affordable drug delivery technology is therefore seen as the most practical solution for governments, health care professionals, and consumers. Transdermal drug delivery patches (TDDP) are among the most economical technologies and are favored by pharmaceutical companies and physicians alike because they offer fewer complications than other delivery technologies. TDDPs provide increased efficiency, safety, and convenience for the patient. The TDDP segments of the US and global drug delivery markets were valued at $5.6 and $12.7 billion, respectively, in 2009, and the TDDP market is forecast to reach $31.5 billion by 2015. Present TDDP technology involves fabricating a patch consisting of a drug embedded in a polymeric matrix. The diffusion coefficient is determined from the slope of the cumulative drug release versus time, a trial-and-error method that is time- and labor-consuming. For all the advantages TDDPs offer, the methodology used to achieve the so-called optimum design has led to several recent incidents in which safety and design were called into question (e.g., fentanyl). A more rigorous screening methodology is needed. This work demonstrates the use of a modified Duda Zielinsky equation (DZE). Experimental release curves from commercial patches are evaluated, and the experimental and theoretical diffusion coefficient values are found to be within the limits specified in the patent literature. One interesting finding is that the accuracy of the DZE is closer to experimental values when molecular shape and radius are taken into account.
This work shows that the modified DZE can serve as an excellent screening tool for identifying the optimal polymeric matrices that yield the desired diffusion coefficient, effectively decreasing the time and labor needed to develop TDDPs.
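    The screening approach rests on extracting a diffusion coefficient from a release curve. A generic early-time thin-film diffusion model (not the modified DZE itself, which the abstract does not reproduce) gives Mt/M∞ = (2/L)·√(Dt/π), so D follows from the slope of cumulative fractional release versus √t. The patch thickness and release data below are fabricated to show the fit recovering a known coefficient.

```python
# Extracting D from release-vs-sqrt(time) data using the early-time
# thin-film model Mt/Minf = (2/L)*sqrt(D*t/pi). All values hypothetical.
import math

L = 0.005                                   # patch thickness, cm (hypothetical)
times = [3600 * h for h in (1, 4, 9, 16)]   # sampling times, s
D_true = 1e-10                              # cm^2/s, used to fabricate the data
frac = [2 / L * math.sqrt(D_true * t / math.pi) for t in times]  # Mt/Minf

# Fit frac vs sqrt(t) through the origin: slope = sum(xy) / sum(x^2),
# then invert the model: D = pi * (slope * L / 2)^2.
x = [math.sqrt(t) for t in times]
slope = sum(a * b for a, b in zip(x, frac)) / sum(a * a for a in x)
D_est = math.pi * (slope * L / 2) ** 2
print(abs(D_est - D_true) / D_true < 1e-9)  # the fit recovers the coefficient
```

    Comparing a D estimated this way against the range a candidate polymer must deliver is the kind of screening step the work advocates, replacing repeated trial-and-error fabrication.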

Top