Sample records for approach involves generating

  1. Design, development, and evaluation of a second generation interactive Simulator for Engineering Ethics Education (SEEE2).

    PubMed

    Alfred, Michael; Chung, Christopher A

    2012-12-01

    This paper describes a second generation Simulator for Engineering Ethics Education. Details describing the first generation activities of this overall effort are published in Chung and Alfred (Sci Eng Ethics 15:189-199, 2009). The second generation research effort represents a major development in the interactive simulator educational approach. As with the first generation effort, the simulator places students in first person perspective scenarios involving different types of ethical situations. Students must still gather data, assess the situation, and make decisions. The approach still requires students to develop their own ability to identify and respond to ethical engineering situations. However, whereas the generation-one effort used a dogmatic model based on the National Society of Professional Engineers' Code of Ethics, the new generation-two model is based on a mathematical model of the actual experiences of engineers involved in ethical situations. This approach also allows the use of feedback in the form of decision effectiveness and professional career impact. Statistical comparisons indicate a 59 percent increase in overall knowledge and a 19 percent improvement in teaching effectiveness over an Internet Engineering Ethics resource based approach.

  2. Approaches and Strategies in Next Generation Science Learning

    ERIC Educational Resources Information Center

    Khine, Myint Swe, Ed.; Saleh, Issa M., Ed.

    2013-01-01

    "Approaches and Strategies in Next Generation Science Learning" examines the challenges involved in the development of modern curriculum models, teaching strategies, and assessments in science education in order to prepare future students in the 21st century economies. This comprehensive collection of research brings together science educators,…

  3. Whole-School Approaches to Incorporating Mindfulness-Based Interventions: Supporting the Capacity for Optimal Functioning in School Settings

    ERIC Educational Resources Information Center

    Kielty, Michele L.; Gilligan, Tammy D.; Staton, A. Renee

    2017-01-01

    With any intervention program, involving all stakeholders in a joint effort toward implementation is most likely to lead to success. Whole-school approaches that involve school personnel, students, families, and local communities have been associated with positive, sustained outcomes. For mindfulness training programs to generate the most…

  4. Developing Empathetic Skills among Teachers and Learners in High Schools in Tshwane: An Inter-Generational Approach Involving People with Dementia

    ERIC Educational Resources Information Center

    Alant, Erna; Geyer, Stephan; Verde, Michael

    2015-01-01

    This article describes the implementation and outcomes of an experiential learning approach to facilitate the development of empathetic skills among teachers and learners at two high schools in Tshwane, South Africa. An inter-generational training programme, the Memory Bridge Initiative (MBI), aimed at exposing participants to interactions with…

  5. A Low-Cost, Passive Approach for Bacterial Growth and Distribution for Large-Scale Implementation of Bioaugmentation

    DTIC Science & Technology

    2012-07-01

    technologies with significant capital costs, secondary waste streams, the involvement of hazardous materials, and the potential for additional worker...or environmental exposure. A more ideal technology would involve lower capital costs, would not generate secondary waste streams, would be...of bioaugmentation technology in general include low risk to human health and the environment during implementation, low secondary waste generation

  6. Unapparent Information Revelation: Text Mining for Counterterrorism

    NASA Astrophysics Data System (ADS)

    Srihari, Rohini K.

    Unapparent information revelation (UIR) is a special case of text mining that focuses on detecting possible links between concepts across multiple text documents by generating an evidence trail explaining the connection. A traditional search involving, for example, two or more person names will attempt to find documents mentioning both these individuals. This research focuses on a different interpretation of such a query: what is the best evidence trail across documents that explains a connection between these individuals? For example, all may be good golfers. A generalization of this task involves query terms representing general concepts (e.g. indictment, foreign policy). Previous approaches to this problem have focused on graph mining involving hyperlinked documents, and link analysis exploiting named entities. A new robust framework is presented, based on (i) generating concept chain graphs, a hybrid content representation, (ii) performing graph matching to select candidate subgraphs, and (iii) subsequently using graphical models to validate hypotheses using ranked evidence trails. We adapt the DUC data set for cross-document summarization to evaluate evidence trails generated by this approach.
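
The concept-chain idea can be sketched as a breadth-first search over a co-occurrence graph (the documents and concepts below are invented for illustration; the paper's actual graphical-model ranking of evidence trails is not reproduced here):

```python
from collections import deque

# Toy corpus: each document mentions a set of concepts (hypothetical data).
docs = {
    "d1": {"alice", "golf"},
    "d2": {"golf", "bob"},
    "d3": {"bob", "indictment"},
}

# Build an undirected co-occurrence graph: concepts are linked if they
# appear in the same document; the edge remembers the supporting document.
graph = {}
for doc_id, concepts in docs.items():
    for a in concepts:
        for b in concepts:
            if a != b:
                graph.setdefault(a, {})[b] = doc_id

def evidence_trail(start, goal):
    """BFS for the shortest concept chain, recording the document per hop."""
    queue = deque([[(start, None)]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1][0]
        if node == goal:
            return path
        for nxt, doc_id in graph.get(node, {}).items():
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [(nxt, doc_id)])
    return None

print(evidence_trail("alice", "indictment"))
```

The returned trail pairs each intermediate concept with the document that supports the hop, which is the "evidence trail" the abstract describes in miniature.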

  7. The Formative Method for Adapting Psychotherapy (FMAP): A community-based developmental approach to culturally adapting therapy

    PubMed Central

    Hwang, Wei-Chin

    2010-01-01

    How do we culturally adapt psychotherapy for ethnic minorities? Although there has been growing interest in doing so, few therapy adaptation frameworks have been developed. The majority of these frameworks take a top-down theoretical approach to adapting psychotherapy. The purpose of this paper is to introduce a community-based developmental approach to modifying psychotherapy for ethnic minorities. The Formative Method for Adapting Psychotherapy (FMAP) is a bottom-up approach that involves collaborating with consumers to generate and support ideas for therapy adaptation. It involves five phases that target developing, testing, and reformulating therapy modifications. These phases include: (a) generating knowledge and collaborating with stakeholders, (b) integrating generated information with theory and empirical and clinical knowledge, (c) reviewing the initial culturally adapted clinical intervention with stakeholders and revising the culturally adapted intervention, (d) testing the culturally adapted intervention, and (e) finalizing the culturally adapted intervention. Application of the FMAP is illustrated using examples from a study adapting psychotherapy for Chinese Americans, but the method can also be readily applied to modify therapy for other ethnic groups. PMID:20625458

  8. Active Learning Crosses Generations.

    ERIC Educational Resources Information Center

    Woodard, Diane K.

    2002-01-01

    Describes the benefits of intergenerational programs, highlighting a child care program that offers age-appropriate and mutually beneficial activities for children and elders within a nearby retirement community. The program has adopted High/Scope's active learning approach to planning and implementing activities that involve both generations. The…

  9. Generation of Mouse Lung Epithelial Cells.

    PubMed

    Kasinski, Andrea L; Slack, Frank J

    2013-08-05

    Although in vivo models are excellent for assessing various facets of whole organism physiology, pathology, and overall response to treatments, evaluating basic cellular functions and molecular events in mammalian model systems is challenging. It is therefore advantageous to perform these studies in a refined and less costly setting. One approach involves utilizing cells derived from the model under evaluation. The approach to generate such cells varies based on the cell of origin and often the genetics of the cell. Here we describe the steps involved in generating epithelial cells from the lungs of Kras(LSL-G12D/+); p53(LSL-R172/+) mice (Kasinski and Slack, 2012). These mice develop aggressive lung adenocarcinoma following cre-recombinase dependent removal of a stop cassette in the transgenes and subsequent expression of Kras G12D and p53 R172. While this protocol may be useful for the generation of epithelial lines from other genetic backgrounds, it should be noted that the Kras; p53 cell line generated here is capable of proliferating in culture without any additional genetic manipulation that is often needed for less aggressive backgrounds.

  10. Generation and Evaluation of Lunar Dust Adhesion Mitigating Materials

    NASA Technical Reports Server (NTRS)

    Wohl, Christopher J.; Connell, John W.; Lin, Yi; Belcher, Marcus A.; Palmieri, Frank L.

    2011-01-01

    Particulate contamination is of concern in a variety of environments. This issue is especially important in confined spaces with highly controlled atmospheres such as space exploration vehicles involved in extraterrestrial surface missions. Lunar dust was a significant challenge for the Apollo astronauts and will be of greater concern for longer duration, future missions. Passive mitigation strategies, those not requiring external energy, may decrease some of these concerns, and have been investigated in this work. A myriad of approaches to modify the surface chemistry and topography of a variety of substrates was investigated. These involved generation of novel materials, photolithographic techniques, and other template approaches. Additionally, single particle and multiple particle methods to quantitatively evaluate the particle-substrate adhesion interactions were developed.

  11. First-Generation College Students: Perceptions, Access, and Participation at Urban University

    ERIC Educational Resources Information Center

    Lovano McCann, Erica

    2017-01-01

    This study explores how participation in a first-generation college student sophomore seminar course, a high impact practice, influences students' perceptions of campus climate, access to forms of capital and involvement behaviors. Utilizing a mixed methods approach of quantitative surveys and qualitative interviews to explore the experience of…

  12. Proteasix: a tool for automated and large-scale prediction of proteases involved in naturally occurring peptide generation.

    PubMed

    Klein, Julie; Eales, James; Zürbig, Petra; Vlahou, Antonia; Mischak, Harald; Stevens, Robert

    2013-04-01

    In this study, we have developed Proteasix, an open-source peptide-centric tool that can be used to predict in silico the proteases involved in naturally occurring peptide generation. We developed a curated cleavage site (CS) database, containing 3500 entries about human protease/CS combinations. On top of this database, we built a tool, Proteasix, which allows CS retrieval and protease associations from a list of peptides. To establish the proof of concept of the approach, we used a list of 1388 peptides identified from human urine samples, and compared the prediction to the analysis of 1003 randomly generated amino acid sequences. Metalloprotease activity was predominantly involved in urinary peptide generation, particularly for peptides associated with extracellular matrix remodelling, compared with proteins from other origins. In comparison, random sequences returned almost no results, highlighting the specificity of the prediction. This study provides a tool that can facilitate linking of identified protein fragments to predicted protease activity, and therefore into presumed mechanisms of disease. Experiments are needed to confirm the in silico hypotheses; nevertheless, this approach may be of great help to better understand molecular mechanisms of disease and define new biomarkers and therapeutic targets. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
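
The core lookup idea — matching a peptide's observed termini against a curated cleavage-site (CS) table — can be sketched as follows. The mini-table, sequences, and protease names below are invented for illustration; Proteasix's real database holds roughly 3500 curated human protease/CS combinations:

```python
# Hypothetical mini CS table: (P1 residue, P1' residue) -> candidate proteases.
cs_table = {
    ("G", "L"): {"MMP9"},
    ("G", "I"): {"MMP2", "MMP9"},
    ("R", "S"): {"Thrombin"},
}

def predict_proteases(parent_seq, peptide):
    """Return proteases whose known cleavage sites match the peptide's termini."""
    start = parent_seq.find(peptide)
    if start < 0:
        return set()
    hits = set()
    # N-terminal cut: residue preceding the peptide (P1) and its first residue (P1').
    if start > 0:
        hits |= cs_table.get((parent_seq[start - 1], peptide[0]), set())
    # C-terminal cut: last residue of the peptide (P1) and the following one (P1').
    end = start + len(peptide)
    if end < len(parent_seq):
        hits |= cs_table.get((peptide[-1], parent_seq[end]), set())
    return hits

print(sorted(predict_proteases("PQGLAGQRGIV", "LAGQRG")))
```

Scaled up over thousands of peptides and the full curated table, this lookup is what turns an observed urinary peptide list into a ranked set of candidate proteases.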

  13. Educational Approaches When Implementing the Next Generation Science Standards

    NASA Astrophysics Data System (ADS)

    Dwyer, Brian

    This paper overviews the history of science education from the launch of Sputnik through reform movements and associated legislation to the most recent Next Generation Science Standards (NGSS). The paper also considers stakeholder groups that would need to be involved if NGSS is to be implemented properly, including teachers, parents and unions. Each group holds a responsibility within a school system that needs to be addressed from a practical standpoint to increase the likelihood of the effective adoption of the Next Generation Science Standards. This paper provides background and program information about the Next Generation Science Standards (NGSS). It also considers the educational, philosophical, and instructional approach known as inquiry which is strongly advocated by NGSS and explores where and how other well-studied instructional approaches might have a place within an inquiry-based classroom.

  14. Next generation initiation techniques

    NASA Technical Reports Server (NTRS)

    Warner, Tom; Derber, John; Zupanski, Milija; Cohn, Steve; Verlinde, Hans

    1993-01-01

    Four-dimensional data assimilation strategies can generally be classified as either current or next generation, depending upon whether they are used operationally or not. Current-generation data-assimilation techniques are those that are presently used routinely in operational-forecasting or research applications. They can be classified into the following categories: intermittent assimilation, Newtonian relaxation, and physical initialization. It should be noted that these techniques are the subject of continued research, and their improvement will parallel the development of next generation techniques described by the other speakers. Next generation assimilation techniques are those that are under development but are not yet used operationally. Most of these procedures are derived from control theory or variational methods and primarily represent continuous assimilation approaches, in which the data and model dynamics are 'fitted' to each other in an optimal way. Another 'next generation' category is the initialization of convective-scale models. Intermittent assimilation systems use an objective analysis to combine all observations within a time window that is centered on the analysis time. Continuous first-generation assimilation systems are usually based on the Newtonian-relaxation or 'nudging' techniques. Physical initialization procedures generally involve the use of standard or nonstandard data to force some physical process in the model during an assimilation period. Under the topic of next-generation assimilation techniques, variational approaches are currently being actively developed. Variational approaches seek to minimize a cost or penalty function which measures a model's fit to observations, background fields and other imposed constraints. Alternatively, the Kalman filter technique, which is also under investigation as a data assimilation procedure for numerical weather prediction, can yield acceptable initial conditions for mesoscale models. 
The third kind of next-generation technique involves strategies to initialize convective scale (non-hydrostatic) models.
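
The Newtonian-relaxation ("nudging") technique mentioned above adds a term that continuously relaxes the model state toward observations during the assimilation window. A minimal scalar sketch follows; the toy dynamics f and the nudging coefficient G are illustrative, not taken from any operational system:

```python
# Nudging: dx/dt = f(x) + G * (x_obs - x), integrated with forward Euler.
def f(x):
    return -0.5 * x  # toy model dynamics (purely illustrative)

def nudge(x0, x_obs, G=2.0, dt=0.01, steps=1000):
    """Relax the model state x toward the observation x_obs while integrating."""
    x = x0
    for _ in range(steps):
        x += dt * (f(x) + G * (x_obs - x))
    return x

# The state is pulled toward a balance between the model dynamics and the
# observation; here the fixed point is where -0.5*x + 2*(3 - x) = 0, i.e. 2.4.
print(nudge(x0=10.0, x_obs=3.0))
```

Larger G weights the observations more heavily; G = 0 recovers the free-running model, which is the sense in which nudging is a "continuous" assimilation approach.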

  15. Ontology-based automatic generation of computerized cognitive exercises.

    PubMed

    Leonardi, Giorgio; Panzarasa, Silvia; Quaglini, Silvana

    2011-01-01

    Computer-based approaches can add great value to traditional paper-based approaches for cognitive rehabilitation. Managing a large pool of stimuli with multimedia features improves patient involvement and allows stimuli to be reused and recombined to create new exercises, whose difficulty level should be adapted to the patient's performance. This work proposes an ontological organization of the stimuli to support the automatic generation of new exercises, tailored to the patient's preferences and skills, and its integration into a commercial cognitive rehabilitation tool. The possibilities offered by this approach are presented with the help of real examples.
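
A minimal sketch of the selection step — picking stimuli by ontology category and difficulty — assuming a hypothetical tagged stimulus pool (the file names, categories, and difficulty scale are invented):

```python
# Hypothetical mini-ontology: stimulus -> (category, difficulty on a 1-5 scale).
stimuli = {
    "apple.jpg":  ("fruit", 1),
    "cherry.jpg": ("fruit", 2),
    "oboe.wav":   ("instrument", 4),
    "viola.wav":  ("instrument", 3),
}

def generate_exercise(category, max_difficulty):
    """Select stimuli of one ontology category at or below the patient's level."""
    return sorted(name for name, (cat, lvl) in stimuli.items()
                  if cat == category and lvl <= max_difficulty)

print(generate_exercise("fruit", 2))
```

Raising or lowering `max_difficulty` as the patient's performance changes is the adaptation loop the abstract alludes to.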

  16. A continuous quality improvement team approach to adverse drug reaction reporting.

    PubMed

    Flowers, P; Dzierba, S; Baker, O

    1992-07-01

    Cross-functional teams can generate more new ideas, concepts, and possible solutions than a department-based process alone. Working collaboratively can increase teams' knowledge of CQI approaches and appropriate tools. CQI produces growth and development at multiple levels, resulting from involvement in the process of incremental improvement.

  17. Automated Assessment in Massive Open Online Courses

    ERIC Educational Resources Information Center

    Ivaniushin, Dmitrii A.; Shtennikov, Dmitrii G.; Efimchick, Eugene A.; Lyamin, Andrey V.

    2016-01-01

    This paper describes an approach to using automated assessments in online courses. The Open edX platform is used as the online course platform. The new assessment type uses Scilab as a learning and solution-validation tool. This approach allows the use of automated individual variant generation and automated solution checks without involving the course…

  18. Assessment of equine waste as a biomass resource in New York State

    USDA-ARS?s Scientific Manuscript database

    Equine operations may generate excessive quantities of biomass (manure and used bedding) that could either become a waste or a resource, especially when the biomass is developed as an alternative energy source. Using the generated biomass as a resource can involve a variety of approaches such as la...

  19. A New Approach for Proving or Generating Combinatorial Identities

    ERIC Educational Resources Information Center

    Gonzalez, Luis

    2010-01-01

    A new method for proving, in an immediate way, many combinatorial identities is presented. The method is based on a simple recursive combinatorial formula involving n + 1 arbitrary real parameters. Moreover, this formula enables one not only to prove, but also generate many different combinatorial identities (not being required to know them "a…

  20. Teaching the "Geo" in Geography with the Next Generation Science Standards

    ERIC Educational Resources Information Center

    Wysession, Michael E.

    2016-01-01

    The Next Generation Science Standards (NGSS; Achieve 2014, 532; Figure 1A) represent a new approach to K-12 science education that involves the interweaving of three educational dimensions: Science and Engineering Practices (SEPs), Disciplinary Core Ideas (DCIs), and Crosscutting Concepts (CCCs). Unlike most preexisting state science standards for…

  1. B-cell Ligand Processing Pathways Detected by Large-scale Comparative Analysis

    PubMed Central

    Towfic, Fadi; Gupta, Shakti; Honavar, Vasant; Subramaniam, Shankar

    2012-01-01

    The initiation of B-cell ligand recognition is a critical step for the generation of an immune response against foreign bodies. We sought to identify the biochemical pathways involved in the B-cell ligand recognition cascade and sets of ligands that trigger similar immunological responses. We utilized several comparative approaches to analyze the gene coexpression networks generated from a set of microarray experiments spanning 33 different ligands. First, we compared the degree distributions of the generated networks. Second, we utilized a pairwise network alignment algorithm, BiNA, to align the networks based on the hubs in the networks. Third, we aligned the networks based on a set of KEGG pathways. We summarized our results by constructing a consensus hierarchy of pathways that are involved in B cell ligand recognition. The resulting pathways were further validated through literature for their common physiological responses. Collectively, the results based on our comparative analyses of degree distributions, alignment of hubs, and alignment based on KEGG pathways provide a basis for molecular characterization of the immune response states of B-cells and demonstrate the power of comparative approaches (e.g., gene coexpression network alignment algorithms) in elucidating biochemical pathways involved in complex signaling events in cells. PMID:22917187
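
The first comparative step above — comparing degree distributions of coexpression networks — can be sketched with stdlib tools (the edge lists below are invented toy networks, not the microarray data):

```python
from collections import Counter

def degree_distribution(edges):
    """Fraction of nodes having each degree, from an undirected edge list."""
    deg = Counter()
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(deg)
    hist = Counter(deg.values())
    return {d: c / n for d, c in sorted(hist.items())}

net1 = [("A", "B"), ("B", "C"), ("B", "D")]   # hub-like: B has degree 3
net2 = [("A", "B"), ("C", "D")]               # no hubs: every node has degree 1

print(degree_distribution(net1))
print(degree_distribution(net2))
```

Two ligands whose networks yield similar distributions (and, in the later steps, aligned hubs) would be grouped as triggering similar immunological responses.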

  2. Interventions That Target Criminogenic Needs for Justice-Involved Persons With Serious Mental Illnesses: A Targeted Service Delivery Approach.

    PubMed

    Wilson, Amy Blank; Farkas, Kathleen; Bonfine, Natalie; Duda-Banwar, Janelle

    2018-05-01

    This research describes the development of a targeted service delivery approach that tailors the delivery of interventions that target criminogenic needs to the specific learning and treatment needs of justice-involved people with serious mental illnesses (SMIs). This targeted service delivery approach includes five service delivery strategies: repetition and summarizing, amplification, active coaching, low-demand practice, and maximizing participation. Examples of how to apply each strategy in session are provided, as well as recommendations on when to use each strategy during the delivery of interventions that target criminogenic needs. This targeted service delivery approach makes an important contribution to the development of interventions for justice-involved people with SMI by increasing the chances that people with SMI can participate fully in and benefit from these interventions that target criminogenic needs. These developments come at a critical time in the field as the next generation of services for justice-involved people with SMI is being developed.

  3. What methods are used to apply positive deviance within healthcare organisations? A systematic review

    PubMed Central

    Baxter, Ruth; Taylor, Natalie; Kellar, Ian; Lawton, Rebecca

    2016-01-01

    Background The positive deviance approach focuses on those who demonstrate exceptional performance, despite facing the same constraints as others. ‘Positive deviants’ are identified and hypotheses about how they succeed are generated. These hypotheses are tested and then disseminated within the wider community. The positive deviance approach is being increasingly applied within healthcare organisations, although limited guidance exists and different methods, of varying quality, are used. This paper systematically reviews healthcare applications of the positive deviance approach to explore how positive deviance is defined, the quality of existing applications and the methods used within them, including the extent to which staff and patients are involved. Methods Peer-reviewed articles, published prior to September 2014, reporting empirical research on the use of the positive deviance approach within healthcare, were identified from seven electronic databases. A previously defined four-stage process for positive deviance in healthcare was used as the basis for data extraction. Quality assessments were conducted using a validated tool, and a narrative synthesis approach was followed. Results 37 of 818 articles met the inclusion criteria. The positive deviance approach was most frequently applied within North America, in secondary care, and to address healthcare-associated infections. Research predominantly identified positive deviants and generated hypotheses about how they succeeded. The approach and processes followed were poorly defined. Research quality was low, articles lacked detail and comparison groups were rarely included. Applications of positive deviance typically lacked staff and/or patient involvement, and the methods used often required extensive resources. Conclusion Further research is required to develop high quality yet practical methods which involve staff and patients in all stages of the positive deviance approach. 
The efficacy and efficiency of positive deviance must be assessed and compared with other quality improvement approaches. PROSPERO registration number CRD42014009365. PMID:26590198

  4. Interdisciplinary research: maintaining the constructive impulse in a culture of criticism

    Treesearch

    S.T.A. Pickett; William R. Burch; J. Morgan. Grove

    1999-01-01

    We approach the benefits and burdens of interdisciplinary research (IDR) from the perspective that science involves both constructive and critical approaches. The constructive aspect generates concepts, theories, and data to understand the observable world, while criticism tests the internal consistency of understanding and its fit to the observable world (Pickett and...

  5. Mobilizing Change in a Business School Using Appreciative Inquiry

    ERIC Educational Resources Information Center

    Grandy, Gina; Holton, Judith

    2010-01-01

    Purpose: The purpose of this paper is to explore how appreciative inquiry (AI) as a pedagogical tool can be generative in nature creating opportunities for development and change in a business school context. Design/methodology/approach: Using a qualitative approach this research involved data collection and analysis in three stages of AI with a…

  6. Service User Involvement in UK Social Service Agencies and Social Work Education

    ERIC Educational Resources Information Center

    Goossen, Carolyn; Austin, Michael J.

    2017-01-01

    Forming partnerships with service users became a requirement for social work education programs in the United Kingdom as of 2003, leading to the development of innovative approaches to social work education that involve service users as experts who are helping to teach the future generation of social workers. This article examines the perceptions…

  7. The significance of urban trees and forests: toward a deeper understanding of values

    Treesearch

    John F. Dwyer; Herbert W. Schroeder; Paul H. Gobster

    1991-01-01

    Many city dwellers hold very strong personal ties to urban trees and forests, with some attachments approaching a spiritual involvement. Ties between people and trees are associated with traditions, symbolism, and the need to "get involved" at the local level to sustain or enhance the environment for present and future generations. Urban forestry programs...

  8. Head Start Program Quality: Examination of Classroom Quality and Parent Involvement in Predicting Children's Vocabulary, Literacy, and Mathematics Achievement Trajectories

    ERIC Educational Resources Information Center

    Wen, Xiaoli; Bulotsky-Shearer, Rebecca J.; Hahs-Vaughn, Debbie L.; Korfmacher, Jon

    2012-01-01

    Guided by a developmental-ecological framework and Head Start's two-generational approach, this study examined two dimensions of Head Start program quality, classroom quality and parent involvement and their unique and interactive contribution to children's vocabulary, literacy, and mathematics skills growth from the beginning of Head Start…

  9. Measuring Father Involvement in the Early Head Start Evaluation: A Multidimensional Conceptualization.

    ERIC Educational Resources Information Center

    Cabrera, Natasha J.; Tamis-LeMonda, Catherine S.; Lamb, Michael E.; Boller, Kimberly

    Early Head Start (EHS) is a comprehensive, two-generation program that includes intensive services that begin before the child is born and concentrate on enhancing the child's development and supporting the family during the critical first 3 years of a child's life. This paper discusses approaches to measuring father involvement in their…

  10. The Overgrid Interface for Computational Simulations on Overset Grids

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    Computational simulations using overset grids typically involve multiple steps and a variety of software modules. A graphical interface called OVERGRID has been specially designed for such purposes. Data required and created by the different steps include geometry, grids, domain connectivity information, and flow solver input parameters. The interface provides a unified environment for the visualization, processing, generation, and diagnosis of such data. General modules are available for the manipulation of structured grids and unstructured surface triangulations. Modules more specific to the overset approach include surface curve generators, hyperbolic and algebraic surface grid generators, a hyperbolic volume grid generator, Cartesian box grid generators, and domain connectivity pre-processing tools. The interface provides automatic selection and viewing of flow solver boundary conditions and various other flow solver inputs. For problems involving multiple components in relative motion, a module is available to build the component/grid relationships and to prescribe and animate the dynamics of the different components.

  11. Exploring Teacher Use of an Online Forum to Develop Game-Based Learning Literacy

    ERIC Educational Resources Information Center

    Barany, Amanda; Shah, Mamta; Foster, Aroutis

    2017-01-01

    Game-based learning researchers have emphasized the importance of teachers' game literacy and knowledge of pedagogical approaches involved in successfully adopting an instructional approach (Bell and Gresalfi, 2017). In this paper, we describe findings from an online resource that teachers used to generate a repository of games for use both during…

  12. The Contributions of Working Memory and Executive Functioning to Problem Representation and Solution Generation in Algebraic Word Problems

    ERIC Educational Resources Information Center

    Lee, Kerry; Ng, Ee Lynn; Ng, Swee Fong

    2009-01-01

    Solving algebraic word problems involves multiple cognitive phases. The authors used a multitask approach to examine the extent to which working memory and executive functioning are associated with generating problem models and producing solutions. They tested 255 11-year-olds on working memory (Counting Recall, Letter Memory, and Keep Track),…

  13. Formulating a subgrid-scale breakup model for microbubble generation from interfacial collisions

    NASA Astrophysics Data System (ADS)

    Chan, Wai Hong Ronald; Mirjalili, Shahab; Urzay, Javier; Mani, Ali; Moin, Parviz

    2017-11-01

    Multiphase flows often involve impact events that engender important effects like the generation of a myriad of tiny bubbles that are subsequently transported in large liquid bodies. These impact events are created by large-scale phenomena like breaking waves on ocean surfaces, and often involve the relative approach of liquid surfaces. This relative motion generates continuously shrinking length scales as the entrapped gas layer thins and eventually breaks up into microbubbles. The treatment of this disparity in length scales is computationally challenging. In this presentation, a framework is presented that addresses a subgrid-scale (SGS) model aimed at capturing the process of microbubble generation. This work sets up the components in an overarching volume-of-fluid (VoF) toolset and investigates the analytical foundations of an SGS model for describing the breakup of a thin air film trapped between two approaching water bodies in a physical regime corresponding to Mesler entrainment. Constituents of the SGS model, such as the identification of impact events and the accurate computation of the local characteristic curvature in a VoF-based architecture, and the treatment of the air layer breakup, are discussed and illustrated in simplified scenarios. Supported by Office of Naval Research (ONR)/A*STAR (Singapore).

  14. Survey of Anomaly Detection Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, B

    This survey defines the problem of anomaly detection and provides an overview of existing methods. The methods are categorized into two general classes: generative and discriminative. A generative approach involves building a model that represents the joint distribution of the input features and the output labels of system behavior (e.g., normal or anomalous), then applies the model to formulate a decision rule for detecting anomalies. On the other hand, a discriminative approach aims directly to find the decision rule, with the smallest error rate, that distinguishes between normal and anomalous behavior. For each approach, we give an overview of popular techniques and provide references to state-of-the-art applications.
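
As a toy illustration of the generative route, one common special case models only the density of normal behavior and flags low-likelihood points (the data and the likelihood threshold below are made up; a full generative model would also cover the anomalous class):

```python
import math
import statistics

def fit_gaussian(normal_data):
    """Generative step: model the distribution of normal behavior."""
    return statistics.mean(normal_data), statistics.stdev(normal_data)

def is_anomalous(x, mu, sigma, threshold=1e-3):
    """Decision rule: flag points whose likelihood under the model is tiny."""
    likelihood = (math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))
                  / (sigma * math.sqrt(2 * math.pi)))
    return likelihood < threshold

normal = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0]  # observations of normal behavior
mu, sigma = fit_gaussian(normal)
print(is_anomalous(10.05, mu, sigma))  # near the mean -> False
print(is_anomalous(25.0, mu, sigma))   # far in the tail -> True
```

A discriminative method would instead skip the density model and train a classifier directly on labeled normal/anomalous examples to minimize the error rate of the decision boundary.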

  15. Assuring safety without animal testing: Unilever's ongoing research programme to deliver novel ways to assure consumer safety.

    PubMed

    Westmoreland, Carl; Carmichael, Paul; Dent, Matt; Fentem, Julia; MacKay, Cameron; Maxwell, Gavin; Pease, Camilla; Reynolds, Fiona

    2010-01-01

    Assuring consumer safety without the generation of new animal data is currently a considerable challenge. However, through the application of new technologies and the further development of risk-based approaches for safety assessment, we remain confident it is ultimately achievable. For many complex, multi-organ consumer safety endpoints, the development, evaluation and application of new, non-animal approaches is hampered by a lack of biological understanding of the underlying mechanistic processes involved. The enormity of this scientific challenge should not be underestimated. To tackle this challenge, a substantial research programme was initiated by Unilever in 2004 to critically evaluate the feasibility of a new conceptual approach based upon the following key components: 1. Developing new, exposure-driven risk assessment approaches; 2. Developing new biological (in vitro) and computer-based (in silico) predictive models; 3. Evaluating the applicability of new technologies for generating data (e.g. "omics", informatics) and for integrating new types of data (e.g. systems approaches) for risk-based safety assessment. Our research efforts are focussed in the priority areas of skin allergy, cancer and general toxicity (including inhaled toxicity). In all of these areas, a long-term investment is essential to increase the scientific understanding of the underlying biology and molecular mechanisms that we believe will ultimately form a sound basis for novel risk assessment approaches. Our research programme in these priority areas consists of in-house research as well as Unilever-sponsored academic research, involvement in EU-funded projects (e.g. Sens-it-iv, Carcinogenomics), participation in cross-industry collaborative research (e.g. Colipa, EPAA) and ongoing involvement with other scientific initiatives on non-animal approaches to risk assessment (e.g. UK NC3Rs, US "Human Toxicology Project" consortium).

  16. Documentation Driven Development for Complex Real-Time Systems

    DTIC Science & Technology

    2004-12-01

    This paper presents a novel approach for development of complex real-time systems, called the documentation-driven development (DDD) approach. This... time systems. DDD will also support automated software generation based on a computational model and some relevant techniques. DDD includes two main...stakeholders to be easily involved in development processes and, therefore, significantly improve the agility of software development for complex real

  17. This Is Your Future: A Case Study Approach to Foster Health Literacy

    ERIC Educational Resources Information Center

    Brey, Rebecca A.; Clark, Susan E.; Wantz, Molly S.

    2008-01-01

    Today's young people seem to live in an even faster fast-paced society than previous generations. As in the past, they are involved in sports, music, school, church, work, and are exposed to many forms of mass media that add to their base of information. However, they also have instant access to computer-generated information such as the Internet,…

  18. Multi-site precipitation downscaling using a stochastic weather generator

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Chen, Hua; Guo, Shenglian

    2018-03-01

    Statistical downscaling is an efficient way to solve the spatiotemporal mismatch between climate model outputs and the data requirements of hydrological models. However, the most commonly used downscaling method only produces climate change scenarios for a specific site or watershed average, which is unable to drive distributed hydrological models to study the spatial variability of climate change impacts. By coupling a single-site downscaling method and a multi-site weather generator, this study proposes a multi-site downscaling approach for hydrological climate change impact studies. Multi-site downscaling is done in two stages. The first stage involves spatially downscaling climate model-simulated monthly precipitation from grid scale to a specific site using a quantile mapping method, and the second stage involves the temporal disaggregation of monthly precipitation to daily values by adjusting the parameters of a multi-site weather generator. The inter-station correlation is specifically considered using a distribution-free approach along with an iterative algorithm. The performance of the downscaling approach is illustrated using a 10-station watershed as an example. The precipitation time series derived from the National Centers for Environmental Prediction (NCEP) reanalysis dataset is used as the climate model simulation. The precipitation time series of each station is divided into 30 odd-numbered years for calibration and 29 even-numbered years for validation. Several metrics, including the frequencies of wet and dry spells and statistics of the daily, monthly and annual precipitation, are used as criteria to evaluate the multi-site downscaling approach. The results show that the frequencies of wet and dry spells are well reproduced for all stations. In addition, the multi-site downscaling approach performs well with respect to reproducing precipitation statistics, especially at monthly and annual timescales.
The remaining biases mainly result from the non-stationarity of NCEP precipitation. Overall, the proposed approach is efficient for generating multi-site climate change scenarios that can be used to investigate the spatial variability of climate change impacts on hydrology.
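
The first-stage quantile mapping step can be sketched with an empirical variant. This is a minimal illustration on made-up data, not the paper's implementation (which may use a parametric form): each model value is mapped through the model climatology's empirical CDF onto the same quantile of the observed climatology.

```python
import bisect

def quantile_map(value, model_clim, obs_clim):
    """Map `value` through the empirical CDFs of the two climatologies."""
    model_sorted = sorted(model_clim)
    obs_sorted = sorted(obs_clim)
    # empirical non-exceedance probability of `value` in the model climatology
    rank = bisect.bisect_right(model_sorted, value)
    p = rank / len(model_sorted)
    # read off the observed value at the same quantile (nearest-rank rule)
    idx = min(int(p * len(obs_sorted)), len(obs_sorted) - 1)
    return obs_sorted[idx]

# Hypothetical monthly precipitation (mm): the model has a systematic wet bias.
obs   = [0, 0, 2, 5, 8, 12, 20, 35, 50, 80]
model = [1, 2, 4, 8, 12, 18, 28, 45, 65, 100]

corrected = [quantile_map(v, model, obs) for v in model]
print(corrected)  # -> [0, 2, 5, 8, 12, 20, 35, 50, 80, 80]
```

After correction the model values inherit the observed distribution, which is the property quantile mapping is designed to enforce.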

  19. Systems biology and the quest for correlates of protection to guide the development of an HIV vaccine.

    PubMed

    Kuri-Cervantes, Leticia; Fourati, Slim; Canderan, Glenda; Sekaly, Rafick-Pierre

    2016-08-01

    Over the last three decades, a myriad of data has been generated regarding HIV/SIV evolution, immune evasion, immune response, and pathogenesis. Much of this data can be integrated and potentially used to generate a successful vaccine. Although individual approaches have begun to shed light on mechanisms involved in vaccine-conferred protection from infection, true correlates of protection have not yet been identified. The systems biology approach helps unify datasets generated using different techniques and broaden our understanding of HIV immunopathogenesis. Moreover, systems biology is a tool that can provide correlates of protection, which can be targeted for the production of a successful HIV vaccine. Copyright © 2016. Published by Elsevier Ltd.

  20. Identifying populations sensitive to environmental chemicals by simulating toxicokinetic variability

    EPA Science Inventory

    We incorporate inter-individual variability, including variability across demographic subgroups, into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of...
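
The triage idea described above can be sketched with a toy Monte Carlo: simulate inter-individual variability in total clearance and examine the upper tail of the resulting steady-state plasma concentration. This is a hedged illustration, not the EPA's httk framework; the one-compartment relation Css = dose rate / clearance and the lognormal spread are simplifying assumptions, and all numbers are hypothetical.

```python
import math
import random

random.seed(0)

def simulate_css(dose_rate_mg_per_day, median_clearance_l_per_day, gsd, n=10000):
    """Monte Carlo steady-state concentration (mg/L) with lognormal clearance."""
    mu = math.log(median_clearance_l_per_day)
    sigma = math.log(gsd)  # geometric standard deviation -> log-space sigma
    css = []
    for _ in range(n):
        clearance = math.exp(random.gauss(mu, sigma))
        css.append(dose_rate_mg_per_day / clearance)
    return sorted(css)

css = simulate_css(dose_rate_mg_per_day=1.0,
                   median_clearance_l_per_day=100.0,
                   gsd=2.0)
median_css = css[len(css) // 2]
p95_css = css[int(0.95 * len(css))]
print(f"median Css ~ {median_css:.4f} mg/L, 95th percentile ~ {p95_css:.4f} mg/L")
```

The gap between the median and the 95th percentile is what identifies a sensitive subpopulation: ranking chemicals by an upper-percentile Css rather than the median changes the prioritization order.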

  1. Laying the Groundwork: Lessons Learned from the Telecommunications Industry for Distributed Generation; Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wise, A. L.

    2008-05-01

    The telecommunications industry went through growing pains in the past that hold some interesting lessons for the growing distributed generation (DG) industry. The technology shifts and stakeholders involved with the historic market transformation of the telecommunications sector mirror similar factors involved in distributed generation today. An examination of these factors may inform best practices when approaching the conduits necessary to accelerate the shifting of our nation's energy system to cleaner forms of generation and use. From a technical perspective, the telecom industry in the 1990s saw a shift from highly centralized systems that had no capacity for adaptation to highly adaptive, distributed network systems. From a management perspective, the industry shifted from small, private-company structures to big, capital-intensive corporations. This presentation will explore potential correlation and outline the lessons that we can take away from this comparison.

  2. Aripiprazole.

    PubMed

    Prommer, Eric

    2017-03-01

    Delirium is a palliative care emergency where patients experience changes in perception, awareness, and behavior. Common features include changes in the sleep-wake cycle, emotional lability, delusional thinking, and language and thought disorders. Delirium results from neurotransmitter imbalances involving several neurotransmitters such as dopamine, glutamate, norepinephrine, acetylcholine, gamma-aminobutyric acid, and serotonin. Untreated delirium causes significant morbidity and mortality. Both nonpharmacologic and pharmacologic approaches are used to treat delirium. Current pharmacologic management of delirium involves using agents such as haloperidol or second-generation antipsychotics. Third-generation atypical antipsychotic drugs have emerged as a potential choice for delirium management. Aripiprazole is a third-generation antipsychotic with a dopamine receptor-binding profile distinct from other second-generation antipsychotics. Aripiprazole acts as a partial agonist at dopamine D2 and 5-hydroxytryptamine (5-HT)1A receptors, stabilizing the dopamine receptor and leading to improvement in symptoms. The article reviews the pharmacology, pharmacodynamics, metabolism, and evidence of clinical efficacy for this new antipsychotic agent. This article explores possible roles in palliative care.

  3. Generation of EMS-Mutagenized Populations of Arabidopsis thaliana for Polyamine Genetics.

    PubMed

    Atanasov, Kostadin E; Liu, Changxin; Tiburcio, Antonio F; Alcázar, Rubén

    2018-01-01

    In the recent years, genetic engineering of polyamine biosynthetic genes has provided evidence for their involvement in plant stress responses and different aspects of plant development. Such approaches are being complemented with the use of reverse genetics, in which mutants affected on a particular trait, tightly associated with polyamines, are isolated and the causal genes mapped. Reverse genetics enables the identification of novel genes in the polyamine pathway, which may be involved in downstream signaling, transport, homeostasis, or perception. Here, we describe a basic protocol for the generation of ethyl methanesulfonate (EMS) mutagenized populations of Arabidopsis thaliana for its use in reverse genetics applied to polyamines.

  4. Accelerated lamellar disintegration in eutectoid steel

    NASA Astrophysics Data System (ADS)

    Mishra, Shakti; Mishra, Alok; Show, Bijay Kumar; Maity, Joydeep

    2017-04-01

    The fastest kinetics of lamellar disintegration (predicted duration of 44 min) in AISI 1080 steel is obtained with a novel approach of incomplete austenitisation-based cyclic heat treatment involving forced air cooling with an air flow rate of 8.7 m³ h⁻¹. A physical model for process kinetics is proposed that involves lamellar fragmentation, lamellar thickening, divorced eutectoid growth and generation of new lamellar faults in remaining cementite lamellae in each cycle. Lamellar fragmentation is accentuated at faster cooling rates through the generation of more intense lamellar faults, but divorced eutectoid growth ceases. Accordingly, compared to still air cooling, much faster kinetics of lamellar disintegration is obtained by forced air cooling, together with the generation of much smaller submicroscopic cementite particles (containing a higher proportion of plate-shaped non-spheroids) in the divorced eutectoid region.

  5. Compatibility of Segments of Thermoelectric Generators

    NASA Technical Reports Server (NTRS)

    Snyder, G. Jeffrey; Ursell, Tristan

    2009-01-01

    A method of calculating (usually for the purpose of maximizing) the power-conversion efficiency of a segmented thermoelectric generator is based on equations derived from the fundamental equations of thermoelectricity. Because it is directly traceable to first principles, the method provides physical explanations in addition to predictions of phenomena involved in segmentation. In comparison with the finite-element method used heretofore to predict (without being able to explain) the behavior of a segmented thermoelectric generator, this method is much simpler to implement in practice: in particular, the efficiency of a segmented thermoelectric generator can be estimated by evaluating equations using only a hand-held calculator. In addition, the method provides for determination of cascading ratios. The concept of cascading is illustrated in the figure, and the cascading ratio is defined in the figure caption. An important aspect of the method is its approach to the issue of compatibility among segments, in combination with introduction of the concept of compatibility within a segment. Prior approaches involved the use of only averaged material properties. Two materials in direct contact could be examined for compatibility with each other, but there was no general framework for analysis of compatibility. The present method establishes such a framework. The mathematical derivation of the method begins with the definition of reduced efficiency of a thermoelectric generator as the ratio between (1) its thermal-to-electric power-conversion efficiency and (2) its Carnot efficiency (the maximum efficiency theoretically attainable, given its hot- and cold-side temperatures). The derivation involves calculation of the reduced efficiency of a model thermoelectric generator for which the hot-side temperature is only infinitesimally greater than the cold-side temperature.
The derivation includes consideration of the ratio (u) between the electric current and heat-conduction power and leads to the concept of compatibility factor (s) for a given thermoelectric material, defined as the value of u that maximizes the reduced efficiency of the aforementioned model thermoelectric generator.
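
The compatibility factor s described above admits a hand-calculator evaluation, as the abstract claims. A minimal sketch, assuming the closed form s = (√(1 + zT) − 1)/(αT) reported in the thermoelectric segmentation literature and the common rule of thumb that two segments are compatible when their s values differ by less than roughly a factor of two; the property values below are illustrative, not measured data.

```python
import math

def compatibility_factor(seebeck_v_per_k, zT, temperature_k):
    """Compatibility factor s (units 1/V): the value of the current-to-heat
    ratio u that maximizes a segment's reduced efficiency."""
    return (math.sqrt(1.0 + zT) - 1.0) / (seebeck_v_per_k * temperature_k)

# Hypothetical p-type segment candidates evaluated at T = 700 K
s_a = compatibility_factor(seebeck_v_per_k=200e-6, zT=1.0, temperature_k=700)
s_b = compatibility_factor(seebeck_v_per_k=150e-6, zT=0.8, temperature_k=700)

ratio = max(s_a, s_b) / min(s_a, s_b)
print(f"s_a = {s_a:.2f} 1/V, s_b = {s_b:.2f} 1/V, ratio = {ratio:.2f}")
print("compatible" if ratio < 2 else "incompatible")
```

Because both hypothetical materials have similar s, neither segment forces the other far from its optimal operating ratio u.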

  6. 'Setting the guinea pigs free': towards a new model of community-led social marketing.

    PubMed

    Smith, A J; Henry, L

    2009-09-01

    To offer the opportunity to discuss the positive contribution of co-production approaches in the field of social marketing. Recognizing the ever-evolving theoretical base for social marketing, this article offers a brief commentary on the positive contribution of co-production approaches in this field. The authors outline their own move towards conceptualizing a community-led social marketing approach and describe some key features. This developing framework has been influenced by, and tested through, the Early Presentation of Cancer Symptoms Programme, a community-led social marketing approach to tackle health inequalities across priority neighbourhoods in North East Lincolnshire, UK. A blend of social marketing, community involvement and rapid improvement science methodologies are drawn upon. The approach involves not just a strong focus on involving communities in insight and consultation, but also adopts methods where they are in charge of the process of generating solutions. A series of monthly and pre/post measures have demonstrated improvements in awareness of symptoms, reported willingness to act and increases in presentation measured through service referrals. Key features of the approach involve shared ownership and a shift away from service-instigated change by enabling communities 'to do' through developing skills and confidence and the conditions to 'try out'. The approach highlights the contribution that co-production approaches have to offer social marketing activity. In order to maximize potential, it is important to consider ways of engaging communities effectively. Successful approaches include translating social marketing methodology into easy-to-use frameworks, involving communities in gathering and interpreting local data, and supporting communities to act as change agents by planning and carrying out activity. 
The range of impacts across organisational, health and social capital measures demonstrates that multiple and longer-lasting improvements can be achieved with successful approaches.

  7. Towards integrated hygiene and food safety management systems: the Hygieneomic approach.

    PubMed

    Armstrong, G D

    1999-09-15

    Integrated hygiene and food safety management systems in food production can give rise to exceptional improvements in food safety performance, but require high level commitment and full functional involvement. A new approach, named hygieneomics, has been developed to assist management in their introduction of hygiene and food safety systems. For an effective introduction, the management systems must be designed to fit with the current generational state of an organisation. There are, broadly speaking, four generational states of an organisation in their approach to food safety. They comprise: (i) rules setting; (ii) ensuring compliance; (iii) individual commitment; (iv) interdependent action. In order to set up an effective integrated hygiene and food safety management system a number of key managerial requirements are necessary. The most important ones are: (a) management systems must integrate the activities of key functions from research and development through to supply chain and all functions need to be involved; (b) there is a critical role for the senior executive, in communicating policy and standards; (c) responsibilities must be clearly defined, and it should be clear that food safety is a line management responsibility not to be delegated to technical or quality personnel; (d) a thorough and effective multi-level audit approach is necessary; (e) key activities in the system are HACCP and risk management, but it is stressed that these are ongoing management activities, not once-off paper generating exercises; and (f) executive management board level review is necessary of audit results, measurements, status and business benefits.

  8. Functionalization of multilayer fullerenes (carbon nano-onions) using diazonium compounds and "click" chemistry.

    PubMed

    Flavin, Kevin; Chaur, Manuel N; Echegoyen, Luis; Giordani, Silvia

    2010-02-19

    A novel versatile approach for the functionalization of multilayer fullerenes (carbon nano-onions) has been developed, which involves the facile introduction of a variety of simple functionalities onto their surface by treatment with in situ generated diazonium compounds. This approach is complemented by use of "click" chemistry which was used for the covalent introduction of more complex porphyrin molecules.

  9. Reframing Internationalization

    ERIC Educational Resources Information Center

    Garson, Kyra

    2016-01-01

    Canadian higher education has long been involved in international education, partnerships, and research and development projects; however, recent framing of international education as an industry generating revenues to prop up underfunded institutions is troubling. This approach is endorsed by provincial government strategies and bolstered by the…

  10. Development of an electron paramagnetic resonance methodology for studying the photo-generation of reactive species in semiconductor nano-particle assembled films

    NASA Astrophysics Data System (ADS)

    Twardoch, Marek; Messai, Youcef; Vileno, Bertrand; Hoarau, Yannick; Mekki, Djamel E.; Felix, Olivier; Turek, Philippe; Weiss, Jean; Decher, Gero; Martel, David

    2018-06-01

    An experimental approach involving electron paramagnetic resonance is proposed for studying photo-generated reactive species in semiconductor nano-particle-based films deposited on the internal wall of glass capillaries. This methodology is applied here to nano-TiO2 and allows a semi-quantitative analysis of the kinetic evolutions of radical production using a spin scavenger probe.

  11. Terahertz Characterization of DNA: Enabling a Novel Approach

    DTIC Science & Technology

    2015-11-01

    DNA in a more reliable and less procedurally complicated manner. The method involves the use of terahertz surface plasmon generated on the surface of...advantages are due to overlapping resonance when the plasmon frequency generated by a foil coincides with that of the biological material. The...interference of the impinging terahertz wave and surface plasmon produces spectral graphs, which can be analyzed to identify and characterize a DNA sample

  12. Generators of the brainstem auditory evoked potential in cat. I. An experimental approach to their identification.

    PubMed

    Melcher, J R; Knudson, I M; Fullerton, B C; Guinan, J J; Norris, B E; Kiang, N Y

    1996-04-01

    This paper is the first in a series aimed at identifying the cellular generators of the brainstem auditory evoked potential (BAEP) in cats. The approach involves (1) developing experimental procedures for making small selective lesions and determining the corresponding changes in BAEP waveforms, (2) identifying brainstem regions involved in BAEP generation by examining the effects of lesions on the BAEP and (3) identifying specific cell populations involved by combining the lesion results with electrophysiological and anatomical information from other kinds of studies. We created lesions in the lower brainstem by injecting kainic acid which is generally toxic for neuronal cell bodies but not for axons and terminals. This first paper describes the justifications for using kainic acid, explains the associated problems, and develops a methodology that addresses the main difficulties. The issues and aspects of the specific methods are generally applicable to physiological and anatomical studies using any neurotoxin, as well as to the present BAEP study. The methods chosen involved (1) measuring the BAEP at regular intervals until it reached a post-injection steady state and perfusing the animals with fixative shortly after the last BAEP recordings were made, (2) using objective criteria to distinguish injection-related BAEP changes from unrelated ones, (3) making control injections to identify effects not due to kainic acid toxicity, (4) verifying the anatomical and functional integrity of axons in lesioned regions, and (5) examining injected brainstems microscopically for cell loss and cellular abnormalities indicating dysfunction. This combination of methods enabled us to identify BAEP changes which are clearly correlated with lesion locations.

  13. Electrolytic production of oxygen from lunar resources

    NASA Technical Reports Server (NTRS)

    Keller, Rudolf

    1991-01-01

    Some of the most promising approaches to extracting oxygen from lunar resources involve electrochemical oxygen generation. In a concept called magma electrolysis, suitable oxides (silicates), molten at 1300 to 1500 °C, are electrolyzed. Residual melt can be discarded after partial electrolysis. Alternatively, lunar soil may be dissolved in a molten salt and electrolyzed. In this approach, temperatures are lower and melt conductances higher, but electrolyte constituents need to be preserved. In a different approach, ilmenite is reduced by hydrogen and the resulting water is electrolyzed.

  14. Survey of Approaches to Generate Realistic Synthetic Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Lee, Sangkeun; Powers, Sarah S

    A graph is a flexible data structure that can represent relationships between entities. As with other data analysis tasks, the use of realistic graphs is critical to obtaining valid research results. Unfortunately, using the actual ("real-world") graphs for research and new algorithm development is difficult due to the presence of sensitive information in the data or due to the scale of data. This results in practitioners developing algorithms and systems that employ synthetic graphs instead of real-world graphs. Generating realistic synthetic graphs that provide reliable statistical confidence to algorithmic analysis and system evaluation involves addressing technical hurdles in a broad set of areas. This report surveys the state of the art in approaches to generate realistic graphs that are derived from fitted graph models on real-world graphs.
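
The fit-then-generate workflow described above can be illustrated with the simplest possible model. This sketch fits only edge density (an Erdős–Rényi parameter) and is not one of the surveyed generators, which fit far richer structure (degree distributions, communities); the "real" graph below is made up.

```python
import random
from itertools import combinations

random.seed(42)

def fit_edge_probability(n_nodes, edges):
    """Estimate the Erdos-Renyi parameter p from an observed graph."""
    max_edges = n_nodes * (n_nodes - 1) // 2
    return len(edges) / max_edges

def generate_synthetic(n_nodes, p):
    """Sample a synthetic graph from the fitted model."""
    return [(u, v) for u, v in combinations(range(n_nodes), 2)
            if random.random() < p]

# "Real" graph (hypothetical, standing in for sensitive data): 6 nodes, 7 edges
real_edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 5), (2, 5)]
p = fit_edge_probability(6, real_edges)   # 7/15, about 0.47
synthetic = generate_synthetic(100, p)    # scale up to 100 nodes
print(f"fitted p = {p:.3f}, synthetic graph has {len(synthetic)} edges")
```

The synthetic graph matches the original's density but none of its other statistics; realistic generation is hard precisely because many statistics must be matched at once.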

  15. Planning and Design: A Systems Approach.

    ERIC Educational Resources Information Center

    Bozeman, William C.; Clements, Mary A.

    1981-01-01

    Explains "purpose design," a planning and problem-solving strategy involving determination of planning purposes, generation and selection of solutions, specification of solution details, implementation, and evaluation. Describes the application of purpose design to the planning of an alumni association at Black Hawk College, a community…

  16. Inter-Individual Variability in High-Throughput Risk Prioritization of Environmental Chemicals (Sot)

    EPA Science Inventory

    We incorporate realistic human variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most of which have...

  17. Inter-individual variability in high-throughput risk prioritization of environmental chemicals (IVIVE)

    EPA Science Inventory

    We incorporate inter-individual variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most which hav...

  18. Identifying Effective Design Approaches to Allocate Genotypes in Two-Phase Designs: A Case Study in Pelargonium zonale.

    PubMed

    Molenaar, Heike; Boehm, Robert; Piepho, Hans-Peter

    2017-01-01

    Robust phenotypic data allow adequate statistical analysis and are crucial for any breeding purpose. Such data are obtained from experiments laid out to best control local variation. Additionally, experiments frequently involve two phases, each contributing environmental sources of variation. For example, in a former experiment we conducted to evaluate production-related traits in Pelargonium zonale, there were two consecutive phases, each performed in a different greenhouse. Phase one involved the propagation of the breeding strains to obtain the stem cutting count, and phase two involved the assessment of root formation. The evaluation of the former study raised questions regarding options for improving the experimental layout: (i) Is there a disadvantage to using exactly the same design in both phases? (ii) Instead of generating a separate layout for each phase, can the design be optimized across both phases, such that the mean variance of a pair-wise treatment difference (MVD) can be decreased? To answer these questions, alternative approaches were explored to generate two-phase designs either in phase-wise order (Option 1) or across phases (Option 2). In Option 1, we considered the scenarios of (i) using the same experimental design in both phases and (ii) randomizing each phase separately. In Option 2, we considered the scenarios of (iii) generating a single design with eight replicates and splitting these among the two phases, (iv) separating the block structure across phases by dummy coding, and (v) design generation with optimal alignment of block units in the two phases. In both options, we considered the same or different block structures in each phase. The designs were evaluated by the MVD obtained by the intra-block analysis and the joint inter-block-intra-block analysis.
The smallest MVD was most frequently obtained for designs generated across phases rather than for each phase separately, in particular when both phases of the design were separated with a single pseudo-level. The joint optimization ensured that treatment concurrences were equally balanced across pairs, one of the prerequisites for an efficient design. The proposed alternative approaches can be implemented with any model-based design packages with facilities to formulate linear models for treatment and block structures.

  19. A Pauson-Khand and ring-expansion approach to the aquariane ring system.

    PubMed

    Thornton, Paul D; Burnell, D Jean

    2006-07-20

    The carbocyclic ring system of the aquariolide diterpenes has been synthesized by two routes involving a diastereoselective Pauson-Khand reaction and subsequent ring expansion. In one route, a tetracyclic enone was elaborated to generate the nine-membered ring by Grob fragmentation. In the second approach, a spirocyclic tricycle underwent a facile anionic oxy-Cope rearrangement to complete the synthesis of the desired ring system.

  20. Green Approach to Nanomaterials: Sustainable Utility of Nano-Catalysts

    EPA Science Inventory

    The presentation summarizes our synthetic activity for the preparation of nanoparticles involving benign alternatives which reduces or eliminates the use and generation of hazardous substances. Vitamins B1, B2, C, and tea and wine polyphenols which function both as reducing and c...

  1. A New Take on "Tried and True"

    ERIC Educational Resources Information Center

    Hancock, James Brian, II; Lee, May

    2018-01-01

    Many teachers are confused about how to implement the phenomena-based teaching recommended by the "Next Generation Science Standards" (NGSS Lead States 2013). This article describes one possible approach--purposely repurposing existing activities. This process involves having teachers: (1) Choose a phenomenon that informs the development…

  2. Human variability in high-throughput risk prioritization of environmental chemicals (Texas AM U. webinar)

    EPA Science Inventory

    We incorporate inter-individual variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most which hav...

  3. Proteomics approaches advance our understanding of plant self-incompatibility response.

    PubMed

    Sankaranarayanan, Subramanian; Jamshed, Muhammad; Samuel, Marcus A

    2013-11-01

    Self-incompatibility (SI) in plants is a genetic mechanism that prevents self-fertilization and promotes out-crossing needed to maintain genetic diversity. SI has been classified into two broad categories: the gametophytic self-incompatibility (GSI) and the sporophytic self-incompatibility (SSI) based on the genetic mechanisms involved in 'self' pollen rejection. Recent proteomic approaches to identify potential candidates involved in SI have shed light onto a number of previously unidentified mechanisms required for SI response. SI proteome research has progressed from the use of isoelectric focusing in early days to the latest third-generation technique of comparative isobaric tag for relative and absolute quantitation (iTRAQ) used in recent times. We will focus on the proteome-based approaches used to study self-incompatibility (GSI and SSI), recent developments in the field of incompatibility research with emphasis on SSI and future prospects of using proteomic approaches to study self-incompatibility.

  4. Complementary DNA libraries: an overview.

    PubMed

    Ying, Shao-Yao

    2004-07-01

    The generation of complete and full-length cDNA libraries for potential functional assays of specific gene sequences is essential for most molecules in biotechnology and biomedical research. The field of cDNA library generation has changed rapidly in the past 10 yr. This review presents an overview of the method available for the basic information of generating cDNA libraries, including the definition of the cDNA library, different kinds of cDNA libraries, difference between methods for cDNA library generation using conventional approaches and a novel strategy, and the quality of cDNA libraries. It is anticipated that the high-quality cDNA libraries so generated would facilitate studies involving genechips and the microarray, differential display, subtractive hybridization, gene cloning, and peptide library generation.

  5. Evaluation of Different Normalization and Analysis Procedures for Illumina Gene Expression Microarray Data Involving Small Changes

    PubMed Central

    Johnstone, Daniel M.; Riveros, Carlos; Heidari, Moones; Graham, Ross M.; Trinder, Debbie; Berretta, Regina; Olynyk, John K.; Scott, Rodney J.; Moscato, Pablo; Milward, Elizabeth A.

    2013-01-01

    While Illumina microarrays can be used successfully for detecting small gene expression changes due to their high degree of technical replicability, there is little information on how different normalization and differential expression analysis strategies affect outcomes. To evaluate this, we assessed concordance across gene lists generated by applying different combinations of normalization strategy and analytical approach to two Illumina datasets with modest expression changes. In addition to using traditional statistical approaches, we also tested an approach based on combinatorial optimization. We found that the choice of both normalization strategy and analytical approach considerably affected outcomes, in some cases leading to substantial differences in gene lists and subsequent pathway analysis results. Our findings suggest that important biological phenomena may be overlooked when there is a routine practice of using only one approach to investigate all microarray datasets. Analytical artefacts of this kind are likely to be especially relevant for datasets involving small fold changes, where inherent technical variation—if not adequately minimized by effective normalization—may overshadow true biological variation. This report provides some basic guidelines for optimizing outcomes when working with Illumina datasets involving small expression changes. PMID:27605185
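
    The pipeline-concordance comparison described above can be illustrated with a small sketch on synthetic data (not the study's datasets or code): two analysis pipelines, with and without quantile normalization, each rank genes by a two-sample t statistic, and agreement between the resulting gene lists is measured by Jaccard overlap. All numbers here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantile_normalize(x):
    # x: genes x samples; force every sample to share one reference distribution
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)
    mean_sorted = np.sort(x, axis=0).mean(axis=1)
    return mean_sorted[ranks]

def de_genes(x, groups, top=100):
    # rank genes by absolute two-sample t statistic, return the top set
    a, b = x[:, groups == 0], x[:, groups == 1]
    t = (a.mean(1) - b.mean(1)) / np.sqrt(
        a.var(1, ddof=1) / a.shape[1] + b.var(1, ddof=1) / b.shape[1])
    return set(np.argsort(-np.abs(t))[:top])

# simulated data: 2000 genes, 6 vs 6 samples, modest shifts in 100 genes,
# plus sample-level technical offsets that normalization should remove
x = rng.normal(8, 1, size=(2000, 12))
x[:100, 6:] += 0.5
x += rng.normal(0, 0.3, size=(1, 12))
groups = np.array([0] * 6 + [1] * 6)

raw = de_genes(x, groups)
qn = de_genes(quantile_normalize(x), groups)
jaccard = len(raw & qn) / len(raw | qn)
print(f"Jaccard overlap between pipelines: {jaccard:.2f}")
```

An overlap well below 1 for such a simulation mirrors the report's point that normalization choices can reshape gene lists when fold changes are small.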

  6. Machine Learning Approaches in Cardiovascular Imaging.

    PubMed

    Henglin, Mir; Stein, Gillian; Hushcha, Pavel V; Snoek, Jasper; Wiltschko, Alexander B; Cheng, Susan

    2017-10-01

    Cardiovascular imaging technologies continue to increase in their capacity to capture and store large quantities of data. Modern computational methods, developed in the field of machine learning, offer new approaches to leveraging the growing volume of imaging data available for analyses. Machine learning methods can now address data-related problems ranging from simple analytic queries of existing measurement data to the more complex challenges involved in analyzing raw images. To date, machine learning has been used in 2 broad and highly interconnected areas: automation of tasks that might otherwise be performed by a human and generation of clinically important new knowledge. Most cardiovascular imaging studies have focused on task-oriented problems, but more studies involving algorithms aimed at generating new clinical insights are emerging. Continued expansion in the size and dimensionality of cardiovascular imaging databases is driving strong interest in applying powerful deep learning methods, in particular, to analyze these data. Overall, the most effective approaches will require an investment in the resources needed to appropriately prepare such large data sets for analyses. Notwithstanding current technical and logistical challenges, machine learning and especially deep learning methods have much to offer and will substantially impact the future practice and science of cardiovascular imaging. © 2017 American Heart Association, Inc.

  7. A comprehensive dwelling unit choice model accommodating psychological constructs within a search strategy for consideration set formation.

    DOT National Transportation Integrated Search

    2015-12-01

    This study adopts a dwelling unit level of analysis and considers a probabilistic choice set generation approach for residential choice modeling. In doing so, we accommodate the fact that housing choices involve both characteristics of the dwelling u...

  8. Sequencing the Genome of the Heirloom Watermelon Cultivar Charleston Gray

    USDA-ARS?s Scientific Manuscript database

    The genome of the watermelon cultivar Charleston Gray, a major heirloom which has been used in breeding programs of many watermelon cultivars, was sequenced. Our strategy involved a hybrid approach using the Illumina and 454/Titanium next-generation sequencing technologies. For Illumina, shotgun g...

  9. INNOVATIVE PRACTICES FOR TREATING WASTE STREAMS CONTAINING HEAVY METALS: A WASTE MINIMIZATION APPROACH

    EPA Science Inventory

    Innovative practices for treating waste streams containing heavy metals often involve technologies or systems that either reduce the amount of waste generated or recover reusable resources. With the land disposal of metal treatment residuals becoming less of an accepted waste man...

  10. Fine-Tuning Neural Patient Question Retrieval Model with Generative Adversarial Networks.

    PubMed

    Tang, Guoyu; Ni, Yuan; Wang, Keqiang; Yong, Qin

    2018-01-01

    Online patient question and answering (Q&A) systems attract an increasing number of users in China. Patients post their questions and wait for doctors' responses. To avoid this lag time and to reduce the workload on doctors, a better method is to automatically retrieve a semantically equivalent question from the archive. We present a Generative Adversarial Network (GAN) based approach to automatically retrieve patient questions. We apply supervised deep learning approaches to determine the similarity between patient questions. A GAN framework is then used to fine-tune the pre-trained deep learning models. The experimental results show that fine-tuning with a GAN can improve retrieval performance.

  11. Current views on HIV-1 latency, persistence, and cure.

    PubMed

    Melkova, Zora; Shankaran, Prakash; Madlenakova, Michaela; Bodor, Josef

    2017-01-01

    HIV-1 infection cannot be cured as it persists in latently infected cells that are targeted neither by the immune system nor by available therapeutic approaches. Consequently, a lifelong therapy suppressing only the actively replicating virus is necessary. The latent reservoir has been defined and characterized in various experimental models and in human patients, allowing research and development of approaches targeting individual steps critical for HIV-1 latency establishment, maintenance, and reactivation. However, additional mechanisms and processes driving the remaining low-level HIV-1 replication in the presence of the suppressive therapy still remain to be identified and targeted. Current approaches toward an HIV-1 cure notably include attempts to reactivate and purge HIV latently infected cells (the so-called "shock and kill" strategy), as well as approaches involving gene therapy and/or gene editing and stem cell transplantation aimed at generating cells resistant to HIV-1. This review summarizes current views and concepts underlying different approaches aiming at a functional or sterilizing cure of HIV-1 infection.

  12. Declarative Business Process Modelling and the Generation of ERP Systems

    NASA Astrophysics Data System (ADS)

    Schultz-Møller, Nicholas Poul; Hølmer, Christian; Hansen, Michael R.

    We present an approach to the construction of Enterprise Resource Planning (ERP) Systems, which is based on the Resources, Events and Agents (REA) ontology. This framework deals with processes involving exchange and flow of resources in a declarative, graphical manner, describing what the major entities are rather than how they engage in computations. We show how to develop a domain-specific language on the basis of REA, and a tool that can automatically generate running web applications. A main contribution is a proof-of-concept showing that business-domain experts can generate their own applications without worrying about implementation details.

  13. Securing TCP/IP and Dial-up Access to Administrative Data.

    ERIC Educational Resources Information Center

    Conrad, L. Dean

    1992-01-01

    This article describes Arizona State University's solution to security risk inherent in general access systems such as TCP/IP (Transmission Control Protocol/INTERNET Protocol). Advantages and disadvantages of various options are compared, and the process of selecting a log-on authentication approach involving generation of a different password at…

  14. Reframing Science Learning and Teaching: A Communities of Practice Approach

    ERIC Educational Resources Information Center

    Sansone, Anna

    2018-01-01

    Next Generation Science Standards encourage science instruction that offers not only opportunities for inquiry but also the diverse social and cognitive processes involved in scientific thinking and communication. This article gives an introduction to Lave and Wenger's (1991) communities of practice framework as a potential way of viewing…

  15. Writing Our Lived Experience: Beyond the (Pale) Hermeneutic?

    ERIC Educational Resources Information Center

    Geelan, David R.; Taylor, Peter C.

    2001-01-01

    Makes a case for an alternative epistemology of research based on the hermeneutic-phenomenology of Max van Manen (1990). This interpretive approach to understanding the nature of a social phenomenon involves the researcher in making explicit the meaning of a particular lived experience and generating a pedagogical thoughtfulness in readers.…

  16. Competing With Ronald McDonald, Cap'n Crunch and the Pepsi Generation.

    ERIC Educational Resources Information Center

    Kamholtz, J. Dennis; Wood, Bill

    1982-01-01

    A new approach to elementary health education involves the use of a series of health-related games. The games address a variety of issues including nutrition, substance abuse, and dental health education. The story "Floss is the Boss" is used as an example. (JN)

  17. Development of High-Tech Skills.

    ERIC Educational Resources Information Center

    Theuerkauf, Walter E.

    High tech systems not only generate new structures in the production process, but also involve profound changes in job organization, which in turn imply that job qualifications must be modified. In view of the changes within engineering systems and the relevant technologies, it seems expedient to choose a curricular approach based on the concepts…

  18. Greener Biomimetic Approach to the Synthesis of Nanomaterials using Antioxidants and their Sustainable Applications

    EPA Science Inventory

    The presentation summarizes our sustainable synthetic activity for the preparation of nanoparticles involving benign alternatives which reduces or eliminates the use and generation of hazardous substances. Vitamins B1, B2, C, and tea and wine polyphenols which function both as r...

  19. Kinetic Monte Carlo Method for Rule-based Modeling of Biochemical Networks

    PubMed Central

    Yang, Jin; Monine, Michael I.; Faeder, James R.; Hlavacek, William S.

    2009-01-01

    We present a kinetic Monte Carlo method for simulating chemical transformations specified by reaction rules, which can be viewed as generators of chemical reactions, or equivalently, definitions of reaction classes. A rule identifies the molecular components involved in a transformation, how these components change, conditions that affect whether a transformation occurs, and a rate law. The computational cost of the method, unlike conventional simulation approaches, is independent of the number of possible reactions, which need not be specified in advance or explicitly generated in a simulation. To demonstrate the method, we apply it to study the kinetics of multivalent ligand-receptor interactions. We expect the method will be useful for studying cellular signaling systems and other physical systems involving aggregation phenomena. PMID:18851068
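
    A minimal sketch of the idea of rule-based, network-free simulation: propensities are computed per rule from current counts of matching sites and bonds (here a single reversible ligand-receptor binding rule), and a Gillespie step picks which rule fires, without ever enumerating the full reaction network. The rate constants and counts below are illustrative, not taken from the paper.

```python
import math
import random

random.seed(0)
k_on, k_off = 0.01, 1.0
free_L, free_R, bonds = 200, 100, 0     # site and bond counts, tracked directly

t, t_end = 0.0, 5.0
while t < t_end:
    a_on = k_on * free_L * free_R       # propensity of the binding rule
    a_off = k_off * bonds               # propensity of the unbinding rule
    a_tot = a_on + a_off
    if a_tot == 0:
        break
    t += -math.log(random.random()) / a_tot   # Gillespie waiting time
    if random.random() * a_tot < a_on:        # fire binding rule
        free_L -= 1; free_R -= 1; bonds += 1
    else:                                     # fire unbinding rule
        free_L += 1; free_R += 1; bonds -= 1
print("bonds at t_end:", bonds)
```

Because propensities are evaluated from counts at each step, the cost per event does not depend on how many distinct reactions the rules could in principle generate.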

  20. Comparison of a rational vs. high throughput approach for rapid salt screening and selection.

    PubMed

    Collman, Benjamin M; Miller, Jonathan M; Seadeek, Christopher; Stambek, Julie A; Blackburn, Anthony C

    2013-01-01

    In recent years, high throughput (HT) screening has become the most widely used approach for early phase salt screening and selection in a drug discovery/development setting. The purpose of this study was to compare a rational approach for salt screening and selection to those results previously generated using a HT approach. The rational approach involved a much smaller number of initial trials (one salt synthesis attempt per counterion) that were selected based on a few strategic solubility determinations of the free form combined with a theoretical analysis of the ideal solvent solubility conditions for salt formation. Salt screening results for sertraline, tamoxifen, and trazodone using the rational approach were compared to those previously generated by HT screening. The rational approach produced similar results to HT screening, including identification of the commercially chosen salt forms, but with a fraction of the crystallization attempts. Moreover, the rational approach provided enough solid from the very initial crystallization of a salt for more thorough and reliable solid-state characterization and thus rapid decision-making. The crystallization techniques used in the rational approach mimic larger-scale process crystallization, allowing smoother technical transfer of the selected salt to the process chemist.

  1. Light, sound, chemistry… action: state of the art optical methods for animal imaging.

    PubMed

    Ripoll, Jorge; Ntziachristos, Vasilis

    2011-01-01

    During recent years, macroscopic optical methods have been promoted from backstage to main actors in biological imaging. Many possible forms of energy conversion have been explored that involve light, including fluorescence emission, sound generated through absorption, and bioluminescence, that is, light generated through a chemical reaction. These physicochemical approaches for contrast generation have resulted in optical imaging methods that come with potent performance characteristics over simple epi-illumination optical imaging approaches of the past, and can play a central role in imaging applications in vivo as it pertains to modern biological and drug discovery, pre-clinical imaging and clinical applications. This review focuses on state of the art optical and opto-acoustic (photo-acoustic) imaging methods and discusses key performance characteristics that convert optical imaging from a qualitative modality to a powerful high-resolution and quantitative volumetric interrogation tool for operation through several millimeters of tissue depth. © 2011 Elsevier Ltd. All rights reserved.

  2. Involvement of family members in life with type 2 diabetes: Six interconnected problem domains of significance for family health identity and healthcare authenticity

    PubMed Central

    Grabowski, Dan; Andersen, Tue Helms; Varming, Annemarie; Ommundsen, Christine; Willaing, Ingrid

    2017-01-01

    Objectives: Family involvement plays a key role in diabetes management. Problems and challenges related to type 2 diabetes often affect the whole family, and relatives are at increased risk of developing diabetes themselves. We highlight these issues in our objectives: (1) to uncover specific family problems associated with mutual involvement in life with type 2 diabetes and (2) to analytically look at ways of approaching these problems in healthcare settings. Methods: Qualitative data were gathered in participatory problem assessment workshops. The data were analysed in three rounds using radical hermeneutics. Results: Problems were categorized in six domains: knowledge, communication, support, everyday life, roles and worries. The final cross-analysis focusing on the link between family identity and healthcare authenticity provided information on how the six domains can be approached in healthcare settings. Conclusion: The study generated important knowledge about problems associated with family involvement in life with type 2 diabetes and about how family involvement can be supported in healthcare practice. PMID:28839943

  3. Application of physics engines in virtual worlds

    NASA Astrophysics Data System (ADS)

    Norman, Mark; Taylor, Tim

    2002-03-01

    Dynamic virtual worlds potentially can provide a much richer and more enjoyable experience than static ones. To realize such worlds, three approaches are commonly used. The first of these, and still widely applied, involves importing traditional animations from a modeling system such as 3D Studio Max. This approach is therefore limited to predefined animation scripts or combinations/blends thereof. The second approach involves the integration of some specific-purpose simulation code, such as car dynamics, and is thus generally limited to one (class of) application(s). The third approach involves the use of general-purpose physics engines, which promise to enable a range of compelling dynamic virtual worlds and to considerably speed up development. By far the largest market today for real-time simulation is computer games, with revenues exceeding those of the movie industry. Traditionally, the simulation is produced by game developers in-house for specific titles. However, off-the-shelf middleware physics engines are now available for use in games and related domains. In this paper, we report on our experiences of using middleware physics engines to create a virtual world as an interactive experience, and an advanced scenario where artificial life techniques generate controllers for physically modeled characters.
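
    The core update a general-purpose physics engine performs each frame can be sketched as semi-implicit Euler integration plus a simple collision response; the masses, rates, and restitution value below are arbitrary illustrative choices, not tied to any particular engine.

```python
# minimal sketch: one vertical degree of freedom, fixed 60 Hz timestep,
# ground-plane collision with a coefficient of restitution
g, dt, restitution = -9.81, 1 / 60, 0.8
y, vy = 5.0, 0.0                # drop a body from 5 m at rest
for _ in range(600):            # simulate 10 seconds of frames
    vy += g * dt                # integrate velocity first (semi-implicit Euler)
    y += vy * dt                # then position, using the updated velocity
    if y < 0.0:                 # collision detection against the ground
        y = 0.0                 # collision response: project out of the plane
        vy = -vy * restitution  # reflect velocity, losing energy each bounce
print(round(y, 3), round(vy, 3))
```

Real engines generalize exactly this loop to many rigid bodies, constraints, and contact solvers, which is why middleware can be reused across applications.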

  4. Random mutagenesis by error-prone pol plasmid replication in Escherichia coli.

    PubMed

    Alexander, David L; Lilly, Joshua; Hernandez, Jaime; Romsdahl, Jillian; Troll, Christopher J; Camps, Manel

    2014-01-01

    Directed evolution is an approach that mimics natural evolution in the laboratory with the goal of modifying existing enzymatic activities or of generating new ones. The identification of mutants with desired properties involves the generation of genetic diversity coupled with a functional selection or screen. Genetic diversity can be generated using PCR or using in vivo methods such as chemical mutagenesis or error-prone replication of the desired sequence in a mutator strain. In vivo mutagenesis methods facilitate iterative selection because they do not require cloning, but generally produce a low mutation density with mutations not restricted to specific genes or areas within a gene. For this reason, this approach is typically used to generate new biochemical properties when large numbers of mutants can be screened or selected. Here we describe protocols for an advanced in vivo mutagenesis method that is based on error-prone replication of a ColE1 plasmid bearing the gene of interest. Compared to other in vivo mutagenesis methods, this plasmid-targeted approach allows increased mutation loads and facilitates iterative selection approaches. We also describe the mutation spectrum for this mutagenesis methodology in detail, and, using cycle 3 GFP as a target for mutagenesis, we illustrate the phenotypic diversity that can be generated using our method. In sum, error-prone Pol I replication is a mutagenesis method that is ideally suited for the evolution of new biochemical activities when a functional selection is available.

  5. Systematic development and implementation of interventions to OPtimise Health Literacy and Access (Ophelia).

    PubMed

    Beauchamp, Alison; Batterham, Roy W; Dodson, Sarity; Astbury, Brad; Elsworth, Gerald R; McPhee, Crystal; Jacobson, Jeanine; Buchbinder, Rachelle; Osborne, Richard H

    2017-03-03

    The need for healthcare strengthening to enhance equity is critical, requiring systematic approaches that focus on those experiencing lesser access and outcomes. This project developed and tested the Ophelia (OPtimising HEalth LIteracy and Access) approach for co-design of interventions to improve health literacy and equity of access. Eight principles guided this development: Outcomes focused; Equity driven, Needs diagnosis, Co-design, Driven by local wisdom, Sustainable, Responsive and Systematically applied. We report the application of the Ophelia process where proof-of-concept was defined as successful application of the principles. Nine sites were briefed on the aims of the project around health literacy, co-design and quality improvement. The sites were rural/metropolitan, small/large hospitals, community health centres or municipalities. Each site identified their own priorities for improvement; collected health literacy data using the Health Literacy Questionnaire (HLQ) within the identified priority groups; engaged staff in co-design workshops to generate ideas for improvement; developed program-logic models; and implemented their projects using Plan-Do-Study-Act (PDSA) cycles. Evaluation included assessment of impacts on organisations, practitioners and service users, and whether the principles were applied. Sites undertook co-design workshops involving discussion of service user needs informed by HLQ (n = 813) and interview data. Sites generated between 21 and 78 intervention ideas and then planned their selected interventions through program-logic models. Sites successfully implemented interventions and refined them progressively with PDSA cycles. Interventions generally involved one of four pathways: development of clinician skills and resources for health literacy, engagement of community volunteers to disseminate health promotion messages, direct impact on consumers' health literacy, and redesign of existing services. 
Evidence of application of the principles was found in all sites. The Ophelia approach guided identification of health literacy issues at each participating site and the development and implementation of locally appropriate solutions. The eight principles provided a framework that allowed flexible application of the Ophelia approach and generation of a diverse set of interventions. Changes were observed at organisational, staff, and community member levels. The Ophelia approach can be used to generate health service improvements that enhance health outcomes and address inequity of access to healthcare.

  6. Transition Metal Free Multicomponent approach to Stereo-enriched Cyclopentyl-isoxazoles via C-C Bond Cleavage.

    PubMed

    Kaliappan, Krishna Pillai; Subramanian, Parthasarathi

    2018-06-19

    An efficient multicomponent reaction leading to the synthesis of stereo-enriched cyclopentyl-isoxazoles from camphor-derived α-oxime, alkynes and MeOH is reported. Our method involves a series of cascade transformations: in situ generation of an I(III) catalyst, which catalyzes the addition of MeOH to a sterically hindered ketone, oxime oxidation, and α-hydroxyiminium ion rearrangement to generate a nitrile oxide in situ, which upon [3+2] cycloaddition with alkynes delivers regioselective products. The reaction is highly selective for the syn-oxime. This multicomponent approach has also been extended to the synthesis of a novel glycoconjugate, camphoric ester-isoxazole C-galactoside. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Knowledge-based zonal grid generation for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Andrews, Alison E.

    1988-01-01

    Automation of flow field zoning in two dimensions is an important step towards reducing the difficulty of three-dimensional grid generation in computational fluid dynamics. Using a knowledge-based approach makes sense, but problems arise which are caused by aspects of zoning involving perception, lack of expert consensus, and design processes. These obstacles are overcome by means of a simple shape and configuration language, a tunable zoning archetype, and a method of assembling plans from selected, predefined subplans. A demonstration system for knowledge-based two-dimensional flow field zoning has been successfully implemented and tested on representative aerodynamic configurations. The results show that this approach can produce flow field zonings that are acceptable to experts with differing evaluation criteria.

  8. Identifying Effective Design Approaches to Allocate Genotypes in Two-Phase Designs: A Case Study in Pelargonium zonale

    PubMed Central

    Molenaar, Heike; Boehm, Robert; Piepho, Hans-Peter

    2018-01-01

    Robust phenotypic data allow adequate statistical analysis and are crucial for any breeding purpose. Such data is obtained from experiments laid out to best control local variation. Additionally, experiments frequently involve two phases, each contributing environmental sources of variation. For example, in a former experiment we conducted to evaluate production related traits in Pelargonium zonale, there were two consecutive phases, each performed in a different greenhouse. Phase one involved the propagation of the breeding strains to obtain the stem cutting count, and phase two involved the assessment of root formation. The evaluation of the former study raised questions regarding options for improving the experimental layout: (i) Is there a disadvantage to using exactly the same design in both phases? (ii) Instead of generating a separate layout for each phase, can the design be optimized across both phases, such that the mean variance of a pair-wise treatment difference (MVD) can be decreased? To answer these questions, alternative approaches were explored to generate two-phase designs either in phase-wise order (Option 1) or across phases (Option 2). In Option 1 we considered the scenarios (i) using in both phases the same experimental design and (ii) randomizing each phase separately. In Option 2, we considered the scenarios (iii) generating a single design with eight replicates and splitting these among the two phases, (iv) separating the block structure across phases by dummy coding, and (v) design generation with optimal alignment of block units in the two phases. In both options, we considered the same or different block structures in each phase. The designs were evaluated by the MVD obtained by the intra-block analysis and the joint inter-block–intra-block analysis. 
The smallest MVD was most frequently obtained for designs generated across phases rather than for each phase separately, in particular when both phases of the design were separated with a single pseudo-level. The joint optimization ensured that treatment concurrences were equally balanced across pairs, one of the prerequisites for an efficient design. The proposed alternative approaches can be implemented with any model-based design packages with facilities to formulate linear models for treatment and block structures. PMID:29354145
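
    The MVD criterion used to compare candidate designs can be sketched for a toy case (four treatments in six blocks of two; this layout is illustrative, not the Pelargonium experiment): under a fixed-effects block model, the variance of each pairwise treatment difference follows from the pseudo-inverse of the information matrix, and a balanced allocation should yield a smaller MVD than an unbalanced one.

```python
from itertools import combinations

import numpy as np

def mvd(treatments, blocks):
    # mean variance of a pairwise treatment difference (sigma^2 = 1)
    # under the model y = mean + treatment + block + error
    t_lev, b_lev = sorted(set(treatments)), sorted(set(blocks))
    n = len(treatments)
    Xt = np.array([[tr == t for t in t_lev] for tr in treatments], float)
    Xb = np.array([[bl == b for b in b_lev] for bl in blocks], float)
    X = np.hstack([np.ones((n, 1)), Xt, Xb])
    C = np.linalg.pinv(X.T @ X)     # gives variances of estimable contrasts
    vs = []
    for i, j in combinations(range(len(t_lev)), 2):
        c = np.zeros(X.shape[1]); c[1 + i], c[1 + j] = 1.0, -1.0
        vs.append(c @ C @ c)        # Var(tau_i - tau_j)
    return float(np.mean(vs))

# six blocks of size two, four treatments, three replicates each
balanced   = [(1, 2), (3, 4), (1, 3), (2, 4), (1, 4), (2, 3)]
unbalanced = [(1, 2), (1, 2), (3, 4), (3, 4), (1, 3), (2, 4)]

def flatten(design):
    trts = [t for blk in design for t in blk]
    blks = [i for i, blk in enumerate(design) for _ in blk]
    return trts, blks

print(mvd(*flatten(balanced)), mvd(*flatten(unbalanced)))
```

The balanced design concurs every treatment pair equally often, which is why its MVD comes out smaller; this is the "treatment concurrences equally balanced across pairs" property noted above.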

  9. Computer Games Development Experience and Appreciative Learning Approach for Creative Process Enhancement

    ERIC Educational Resources Information Center

    Leng, Eow Yee; Ali, Wan Zah bte Wan; Mahmud, Rosnaini bt.; Baki, Roselan

    2010-01-01

    Nurturing children into thinking creatively needs to take account of what interests them. Therefore, this study engaged students in computer games development, as it corresponded with the young generation's habits and interests, to enhance the creative process experienced by students. It involved 69…

  10. Why Traditional Expository Teaching-Learning Approaches May Founder? An Experimental Examination of Neural Networks in Biology Learning

    ERIC Educational Resources Information Center

    Lee, Jun-Ki; Kwon, Yong-Ju

    2011-01-01

    Using functional magnetic resonance imaging (fMRI), this study investigates and discusses neurological explanations for, and the educational implications of, the neural network activations involved in hypothesis-generating and hypothesis-understanding for biology education. Two sets of task paradigms about biological phenomena were designed:…

  11. The Process of Developing Theories-in-Action with OELEs: A Qualitative Study.

    ERIC Educational Resources Information Center

    Land, Susan M.; Hannafin, Michael J.

    Open-ended learning environments (OELEs) like microworlds have been touted as one approach for blending learning theory and emerging technology to support the building of student-centered understanding. The learning process involves developing a theory-in-action--an intuitive theory that is generated and changed by learners as they reflect upon…

  12. Learning-by-Doing as an Approach to Teaching Social Entrepreneurship

    ERIC Educational Resources Information Center

    Chang, Jane; Benamraoui, Abdelhafid; Rieple, Alison

    2014-01-01

    Many studies have explored the use of learning-by-doing in higher education, but few have applied this to social entrepreneurship contexts and applications: this paper addresses this gap in the literature. Our programme involved students working with different stakeholders in an interactive learning environment to generate real revenue for social…

  13. Generating and Analyzing Visual Representations of Conic Sections with the Use of Technological Tools

    ERIC Educational Resources Information Center

    Santos-Trigo, Manuel; Espinosa-Perez, Hugo; Reyes-Rodriguez, Aaron

    2006-01-01

    Technological tools have the potential to offer students the possibility to represent information and relationships embedded in problems and concepts in ways that involve numerical, algebraic, geometric, and visual approaches. In this paper, the authors present and discuss an example in which an initial representation of a mathematical object…

  14. Rules in School. Strategies for Teachers Series.

    ERIC Educational Resources Information Center

    Brady, Kathryn; Forton, Mary Beth; Porter, Deborah; Wood, Chip

    This book offers an approach for helping K-8 students become invested in creating and living by classroom rules. It provides techniques for: helping students articulate their hopes and dreams for school; involving students in generating classroom rules that grow out of their hopes and dreams; modeling, practicing, and role playing the rules; using…

  15. Nanoscale imaging with table-top coherent extreme ultraviolet source based on high harmonic generation

    NASA Astrophysics Data System (ADS)

    Ba Dinh, Khuong; Le, Hoang Vu; Hannaford, Peter; Van Dao, Lap

    2017-08-01

    A table-top coherent diffractive imaging experiment on a sample with biological-like characteristics using a focused narrow-bandwidth high harmonic source around 30 nm is performed. An approach involving a beam stop and a new reconstruction algorithm to enhance the quality of the reconstructed image is described.
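
    A minimal sketch of the iterative reconstruction idea behind coherent diffractive imaging (plain error-reduction with a known support on a synthetic object, not the authors' beam-stop-aware algorithm): the estimate alternates between enforcing the measured Fourier magnitudes and enforcing the known object extent.

```python
import numpy as np

rng = np.random.default_rng(2)

# toy object and its far-field diffraction magnitudes (the "measured" data)
obj = np.zeros((32, 32)); obj[12:20, 12:20] = 1.0
support = obj > 0                       # known object extent
mag = np.abs(np.fft.fft2(obj))

# error-reduction phase retrieval: alternate the two constraint projections
x = rng.random((32, 32))
for _ in range(200):
    F = np.fft.fft2(x)
    F = mag * np.exp(1j * np.angle(F))  # impose measured magnitudes
    x = np.fft.ifft2(F).real
    x[~support] = 0.0                   # impose the support constraint
err = np.linalg.norm(x - obj) / np.linalg.norm(obj)
print(f"relative error: {err:.3f}")
```

A beam stop, as in the experiment above, would leave some low-frequency magnitudes unmeasured; algorithms then let those Fourier components float instead of constraining them, which this simple sketch does not model.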

  16. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    As fracture mechanics material testing evolves, the governing test standards continue to be refined to better reflect the latest understanding of the physics of the fracture processes involved. The traditional format of ASTM fracture testing standards, utilizing equations expressed directly in the text of the standard to assess the experimental result, is self-limiting in the complexity that can be reasonably captured. The use of automated analysis techniques to draw upon a rich, detailed solution database for assessing fracture mechanics tests provides a foundation for a new approach to testing standards that enables routine users to obtain highly reliable assessments of tests involving complex, non-linear fracture behavior. Herein, the case for automating the analysis of tests of surface cracks in tension in the elastic-plastic regime is utilized as an example of how such a database can be generated and implemented for use in the ASTM standards framework. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  17. A neurophysiological approach to tinnitus: clinical implications.

    PubMed

    Jastreboff, P J; Hazell, J W

    1993-02-01

    This paper presents a neurophysiological approach to tinnitus and discusses its clinical implications. A hypothesis of discordant damage of inner and outer hair cells systems in tinnitus generation is outlined. A recent animal model has facilitated the investigation of the mechanisms of tinnitus and has been further refined to allow for the measurement of tinnitus pitch and loudness. The analysis of the processes involved in tinnitus detection postulates the involvement of an abnormal increase of gain within the auditory system. Moreover, it provides a basis for treating patients with hyperacusis, which we consider to be a pre-tinnitus state. Analysis of the process of tinnitus perception allows for the possibility of facilitating the process of tinnitus habituation for the purpose of its alleviation. Combining theoretical analysis with clinical findings has resulted in the creation of a multidisciplinary Tinnitus Centre. The foundation of the Centre focuses on two goals: the clinical goal is to remove tinnitus perception from the patient's consciousness, while directing research toward finding a mechanism-based method for the suppression of tinnitus generators and processes responsible for enhancement of tinnitus-related neuronal activity.

  18. Interprofessional communication and medical error: a reframing of research questions and approaches.

    PubMed

    Varpio, Lara; Hall, Pippa; Lingard, Lorelei; Schryer, Catherine F

    2008-10-01

    Progress toward understanding the links between interprofessional communication and issues of medical error has been slow. Recent research proposes that this delay may result from overlooking the complexities involved in interprofessional care. Medical education initiatives in this domain tend to simplify the complexities of team membership fluidity, rotation, and use of communication tools. A new theoretically informed research approach is required to take into account these complexities. To generate such an approach, we review two theories from the social sciences: Activity Theory and Knotworking. Using these perspectives, we propose that research into interprofessional communication and medical error can develop better understandings of (1) how and why medical errors are generated and (2) how and why gaps in team defenses occur. Such complexities will have to be investigated if students and practicing clinicians are to be adequately prepared to work safely in interprofessional teams.

  19. The generation of CD8+ T-cell population specific for vaccinia virus epitope involved in the antiviral protection against ectromelia virus challenge.

    PubMed

    Gierynska, Malgorzata; Szulc-Dabrowska, Lidia; Dzieciatkowski, Tomasz; Golke, Anna; Schollenberger, Ada

    2015-12-01

    Eradication of smallpox has led to cessation of vaccination programs. This has rendered the human population increasingly susceptible not only to variola virus infection but also to infections with other representatives of Poxviridae family that cause zoonotic variola-like diseases. Thus, new approaches for designing improved vaccine against smallpox are required. Discovering that orthopoxviruses, e.g. variola virus, vaccinia virus, ectromelia virus, share common immunodominant antigen, may result in the development of such a vaccine. In our study, the generation of antigen-specific CD8(+) T cells in mice during the acute and memory phase of the immune response was induced using the vaccinia virus immunodominant TSYKFESV epitope and CpG oligodeoxynucleotides as adjuvants. The role of the generated TSYKFESV-specific CD8(+) T cells was evaluated in mice during ectromelia virus infection using systemic and mucosal model. Moreover, the involvement of dendritic cells subsets in the adaptive immune response stimulation was assessed. Our results indicate that the TSYKFESV epitope/TLR9 agonist approach, delivered systemically or mucosally, generated strong CD8(+) T-cell response when measured 10 days after immunization. Furthermore, the TSYKFESV-specific cell population remained functionally active 2 months post-immunization, and gave cross-protection in virally challenged mice, even though the numbers of detectable antigen-specific T cells decreased. © FEMS 2015. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. TCR-engineered, customized, antitumor T cells for cancer immunotherapy: advantages and limitations.

    PubMed

    Chhabra, Arvind

    2011-01-05

    The clinical outcome of the traditional adoptive cancer immunotherapy approaches involving the administration of donor-derived immune effectors, expanded ex vivo, has not met expectations. This could be attributed, in part, to the lack of sufficient high-avidity antitumor T-cell precursors in most cancer patients, poor immunogenicity of cancer cells, and the technological limitations to generate a sufficiently large number of tumor antigen-specific T cells. In addition, the host immune regulatory mechanisms and immune homeostasis mechanisms, such as activation-induced cell death (AICD), could further limit the clinical efficacy of the adoptively administered antitumor T cells. Since generation of a sufficiently large number of potent antitumor immune effectors for adoptive administration is critical for the clinical success of this approach, recent advances towards generating customized donor-specific antitumor-effector T cells by engrafting human peripheral blood-derived T cells with a tumor-associated antigen-specific transgenic T-cell receptor (TCR) are quite interesting. This manuscript provides a brief overview of the TCR engineering-based cancer immunotherapy approach, its advantages, and the current limitations.

  1. Creating tomorrow's leaders today: the Emerging Nurse Leaders Program of the Texas Nurses Association.

    PubMed

    Sportsman, Susan; Wieck, Lynn; Yoder-Wise, Patricia S; Light, Kathleen M; Jordan, Clair

    2010-06-01

    The Texas Nurses Association initiated an Emerging Nurse Leaders Program as an approach to engaging new nurses in the leadership of the professional association. This article explains the program's origin, the commitment of the Texas Nurses Association to this process, the implementation of the plan, and the discussions that launched a new way of connecting leaders across generations. Further, it is an approach that any professional organization can use to encourage the involvement of new leaders.

  2. Mapping soil types from multispectral scanner data.

    NASA Technical Reports Server (NTRS)

    Kristof, S. J.; Zachary, A. L.

    1971-01-01

    Multispectral remote sensing and computer-implemented pattern recognition techniques were used for automatic 'mapping' of soil types. This approach involves subjective selection of a set of reference samples from a gray-level display of spectral variations which was generated by a computer. Each resolution element is then classified using a maximum likelihood ratio. Output is a computer printout on which the researcher assigns a different symbol to each class. Four soil test areas in Indiana were experimentally examined using this approach, and partially successful results were obtained.
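    The maximum-likelihood classification step described in this record can be sketched in a few lines. The following is an illustrative reconstruction, not the authors' implementation; it assumes per-class Gaussian statistics with independent spectral bands (diagonal covariance), and all function and class names are hypothetical.

    ```python
    import math

    def train_classes(reference):
        """reference: {class_name: list of spectral vectors} -> per-band mean/variance."""
        params = {}
        for name, vectors in reference.items():
            bands = list(zip(*vectors))
            means = [sum(b) / len(b) for b in bands]
            variances = [sum((x - m) ** 2 for x in b) / len(b) + 1e-9
                         for b, m in zip(bands, means)]
            params[name] = (means, variances)
        return params

    def classify(pixel, params):
        """Assign the class whose Gaussian log-likelihood of the pixel is highest."""
        best_name, best_ll = None, float("-inf")
        for name, (means, variances) in params.items():
            ll = sum(-0.5 * (math.log(2 * math.pi * v) + (x - m) ** 2 / v)
                     for x, m, v in zip(pixel, means, variances))
            if ll > best_ll:
                best_name, best_ll = name, ll
        return best_name
    ```

    Each resolution element is assigned the label that maximizes the class-conditional likelihood, which for equal priors is equivalent to the likelihood-ratio decision rule the abstract mentions.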

  3. Hybrid generative-discriminative approach to age-invariant face recognition

    NASA Astrophysics Data System (ADS)

    Sajid, Muhammad; Shafique, Tamoor

    2018-03-01

    Age-invariant face recognition is still a challenging research problem due to the complex aging process involving types of facial tissues, skin, fat, muscles, and bones. Most of the related studies that have addressed the aging problem are focused on generative representation (aging simulation) or discriminative representation (feature-based approaches). Designing an appropriate hybrid approach taking into account both the generative and discriminative representations for age-invariant face recognition remains an open problem. We perform a hybrid matching to achieve robustness to aging variations. This approach automatically segments the eyes, nose-bridge, and mouth regions, which are relatively less sensitive to aging variations compared with the rest of the facial regions that are age-sensitive. The aging variations of age-sensitive facial parts are compensated using a demographic-aware generative model based on a bridged denoising autoencoder. The age-insensitive facial parts are represented by pixel average vector-based local binary patterns. Deep convolutional neural networks are used to extract relative features of age-sensitive and age-insensitive facial parts. Finally, the feature vectors of age-sensitive and age-insensitive facial parts are fused to achieve the recognition results. Extensive experimental results on the morphological face database II (MORPH II), the face and gesture recognition network (FG-NET), and the verification subset of the cross-age celebrity dataset (CACD-VS) demonstrate the effectiveness of the proposed method for age-invariant face recognition.

  4. Requirement Generation for Space Infrastructure Systems

    NASA Astrophysics Data System (ADS)

    Hempsell, M.

    Despite heavy investment, in the half-century period between 1970 and 2020 there will be almost no progress in the capability provided by the space infrastructure. It is argued that this is due to a failure during the requirement generation phase of the infrastructure's elements, a failure that is primarily due to following the accepted good practice of involving stakeholders while establishing a mission-based set of technical requirements. This argument is supported by both a consideration of the history of the requirement generation phase of past space infrastructure projects, in particular the Space Shuttle, and an analysis of the interactions of the stakeholders during this phase. Traditional stakeholder involvement only works well in mature infrastructures where investment aims to make minor improvements, whereas space activity is still in the early experimental stages and is open to major new initiatives that aim to radically change the way we work in space. A new approach to requirement generation is proposed, which is more appropriate to these current circumstances. This uses a methodology centred on the basic functions the system is intended to perform rather than its expected missions.

  5. Manufacture of electrical and magnetic graded and anisotropic materials for novel manipulations of microwaves.

    PubMed

    Grant, P S; Castles, F; Lei, Q; Wang, Y; Janurudin, J M; Isakov, D; Speller, S; Dancer, C; Grovenor, C R M

    2015-08-28

    Spatial transformations (ST) provide a design framework to generate a required spatial distribution of electrical and magnetic properties of materials to effect manipulations of electromagnetic waves. To obtain the electromagnetic properties required by these designs, the most common materials approach has involved periodic arrays of metal-containing subwavelength elements. While aspects of ST theory have been confirmed using these structures, they are often disadvantaged by narrowband operation, high losses and difficulties in implementation. An all-dielectric approach involves weaker interactions with applied fields, but may offer more flexibility for practical implementation. This paper investigates manufacturing approaches to produce composite materials that may be conveniently arranged spatially, according to ST-based designs. A key aim is to highlight the limitations and possibilities of various manufacturing approaches, to constrain designs to those that may be achievable. The article focuses on polymer-based nano- and microcomposites in which interactions with microwaves are achieved by loading the polymers with high-permittivity and high-permeability particles, and manufacturing approaches based on spray deposition, extrusion, casting and additive manufacture.

  6. Manufacture of electrical and magnetic graded and anisotropic materials for novel manipulations of microwaves

    PubMed Central

    Grant, P. S.; Castles, F.; Lei, Q.; Wang, Y.; Janurudin, J. M.; Isakov, D.; Speller, S.; Dancer, C.; Grovenor, C. R. M.

    2015-01-01

    Spatial transformations (ST) provide a design framework to generate a required spatial distribution of electrical and magnetic properties of materials to effect manipulations of electromagnetic waves. To obtain the electromagnetic properties required by these designs, the most common materials approach has involved periodic arrays of metal-containing subwavelength elements. While aspects of ST theory have been confirmed using these structures, they are often disadvantaged by narrowband operation, high losses and difficulties in implementation. An all-dielectric approach involves weaker interactions with applied fields, but may offer more flexibility for practical implementation. This paper investigates manufacturing approaches to produce composite materials that may be conveniently arranged spatially, according to ST-based designs. A key aim is to highlight the limitations and possibilities of various manufacturing approaches, to constrain designs to those that may be achievable. The article focuses on polymer-based nano- and microcomposites in which interactions with microwaves are achieved by loading the polymers with high-permittivity and high-permeability particles, and manufacturing approaches based on spray deposition, extrusion, casting and additive manufacture. PMID:26217051

  7. Computational physical oceanography -- A comprehensive approach based on generalized CFD/grid techniques for planetary scale simulations of oceanic flows. Final report, September 1, 1995--August 31, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beddhu, M.; Jiang, M.Y.; Whitfield, D.L.

    The original intention for this work was to impart the technology that was developed in the field of computational aeronautics to the field of computational physical oceanography. This technology transfer involved grid generation techniques and solution procedures to solve the governing equations over the grids thus generated. Specifically, boundary fitting non-orthogonal grids would be generated over a sphere taking into account the topography of the ocean floor and the topography of the continents. The solution methodology to be employed involved the application of an upwind, finite volume discretization procedure that uses higher order numerical fluxes at the cell faces to discretize the governing equations and an implicit Newton relaxation technique to solve the discretized equations. This report summarizes the efforts put forth during the past three years to achieve these goals and indicates the future direction of this work as it is still an ongoing effort.
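    For illustration only, a first-order upwind finite-volume update for 1-D linear advection with periodic boundaries conveys the flavor of the discretization this report describes. The report's actual solver uses higher-order fluxes on curvilinear grids and implicit Newton relaxation, which this sketch does not attempt; the function name is ours.

    ```python
    def upwind_step(u, a, dt, dx):
        """One explicit step of u_t + a*u_x = 0 with first-order upwind flux (a > 0).

        Python's negative indexing makes u[i - 1] wrap around, giving periodic
        boundaries for free. Stability requires the CFL number a*dt/dx <= 1.
        """
        c = a * dt / dx
        return [u[i] - c * (u[i] - u[i - 1]) for i in range(len(u))]
    ```

    With a CFL number of exactly 1 the scheme shifts the profile one cell per step; for any CFL number, the total "mass" sum(u) is conserved because the flux differences telescope around the periodic domain.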

  8. Rostral and caudal prefrontal contribution to creativity: a meta-analysis of functional imaging data

    PubMed Central

    Gonen-Yaacovi, Gil; de Souza, Leonardo Cruz; Levy, Richard; Urbanski, Marika; Josse, Goulven; Volle, Emmanuelle

    2013-01-01

    Creativity is of central importance for human civilization, yet its neurocognitive bases are poorly understood. The aim of the present study was to integrate existing functional imaging data by using the meta-analysis approach. We reviewed 34 functional imaging studies that reported activation foci during tasks assumed to engage creative thinking in healthy adults. A coordinate-based meta-analysis using Activation Likelihood Estimation (ALE) first showed a set of predominantly left-hemispheric regions shared by the various creativity tasks examined. These regions included the caudal lateral prefrontal cortex (PFC), the medial and lateral rostral PFC, and the inferior parietal and posterior temporal cortices. Further analyses showed that tasks involving the combination of remote information (combination tasks) activated more anterior areas of the lateral PFC than tasks involving the free generation of unusual responses (unusual generation tasks), although both types of tasks shared caudal prefrontal areas. In addition, verbal and non-verbal tasks involved the same regions in the left caudal prefrontal, temporal, and parietal areas, but also distinct domain-oriented areas. Taken together, these findings suggest that several frontal and parieto-temporal regions may support cognitive processes shared by diverse creativity tasks, and that some regions may be specialized for distinct types of processes. In particular, the lateral PFC appeared to be organized along a rostro-caudal axis, with rostral regions involved in combining ideas creatively and more posterior regions involved in freely generating novel ideas. PMID:23966927

  9. Extending the Framework of Generativity Theory Through Research: A Qualitative Study

    PubMed Central

    Rubinstein, Robert L.; Girling, Laura M.; de Medeiros, Kate; Brazda, Michael; Hannum, Susan

    2015-01-01

    Purpose of the study: Based on ethnographic interviews, we discuss three ideas we believe will expand knowledge of older informants’ thoughts about and representations of generativity. We adapt the notion of “dividuality” as developed in cultural anthropology to reframe ideas on generativity. The term dividuality refers to a condition of interpersonal or intergenerational connectedness, as distinct from individuality. We also extend previous definitions of generativity by identifying both objects of generative action and temporal and relational frameworks for generative action. Design: We define 4 foci of generativity (people, groups, things, and activities) and 4 spheres of generativity (historical, familial, individual, and relational) based in American culture and with which older informants could easily identify. The approach outlined here also discusses a form of generativity oriented to the past in which relationships with persons in senior generations form a kind of generative action since they are involved in caring for the origins of the self and hence of future generative acts. These 3 elements of a new framework will allow researchers to pose critical questions about generativity among older adults. Such questions include (a) How is the self, as culturally constituted, involved in generative action? and (b) What are the types of generativity within the context of American culture and how are they spoken about? Each of the above points is directly addressed in the data we present below. Methods: We defined these domains through extended ethnographic interviews with 200 older women. Results and implications: The article addresses some new ways of thinking about generativity as a construct, which may be useful in understanding the cultural personhood of older Americans. PMID:24704718

  10. Extending the Framework of Generativity Theory Through Research: A Qualitative Study.

    PubMed

    Rubinstein, Robert L; Girling, Laura M; de Medeiros, Kate; Brazda, Michael; Hannum, Susan

    2015-08-01

    Based on ethnographic interviews, we discuss three ideas we believe will expand knowledge of older informants' thoughts about and representations of generativity. We adapt the notion of "dividuality" as developed in cultural anthropology to reframe ideas on generativity. The term dividuality refers to a condition of interpersonal or intergenerational connectedness, as distinct from individuality. We also extend previous definitions of generativity by identifying both objects of generative action and temporal and relational frameworks for generative action. We define 4 foci of generativity (people, groups, things, and activities) and 4 spheres of generativity (historical, familial, individual, and relational) based in American culture and with which older informants could easily identify. The approach outlined here also discusses a form of generativity oriented to the past in which relationships with persons in senior generations form a kind of generative action since they are involved in caring for the origins of the self and hence of future generative acts. These 3 elements of a new framework will allow researchers to pose critical questions about generativity among older adults. Such questions include (a) How is the self, as culturally constituted, involved in generative action? and (b) What are the types of generativity within the context of American culture and how are they spoken about? Each of the above points is directly addressed in the data we present below. We defined these domains through extended ethnographic interviews with 200 older women. The article addresses some new ways of thinking about generativity as a construct, which may be useful in understanding the cultural personhood of older Americans. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Implementing the Health Promoting School in Denmark: A Case Study

    ERIC Educational Resources Information Center

    Nordin, Lone Lindegaard

    2016-01-01

    Purpose: The purpose of this paper is to provide insight into teachers' practice in implementing school-based health promotion. Design/methodology/approach: This qualitative research was designed as a multiple case study. The study involved five schools, 233 pupils in the age 12-16 and 23 teachers. The primary data generation method were focus…

  12. Models, Their Application, and Scientific Anticipation: Ludwig Boltzmann's Work as Tacit Knowing

    ERIC Educational Resources Information Center

    Schmitt, Richard Henry

    2011-01-01

    Ludwig Boltzmann's work in theoretical physics exhibits an approach to the construction of theory that he transmitted to the succeeding generation by example. It involved the construction of clear models, allowed more than one, and was not based solely on the existing facts, with the intent of examining and criticizing the assumptions that made…

  13. Resource Contention Management in Parallel Systems

    DTIC Science & Technology

    1989-04-01

    technical competence include communications, command and control, battle management, information processing, surveillance sensors, intelligence data ...two-simulation approach since they require only a single simulation run. More importantly, since they involve only observed data, they may also be...we use the original, unobservable RAC of Section 2 and handle unobservable transitions by generating artificial events, when required, using a random

  14. Biology Teachers' Conceptions of the Diversity of Life and the Historical Development of Evolutionary Concepts

    ERIC Educational Resources Information Center

    da Silva, Paloma Rodrigues; de Andrade, Mariana A. Bologna Soares; de Andrade Caldeira, Ana Maria

    2015-01-01

    Biology is a science that involves study of the diversity of living organisms. This diversity has always generated questions and has motivated cultures to seek plausible explanations for the differences and similarities between types of organisms. In biology teaching, these issues are addressed by adopting an evolutionary approach. The aim of this…

  15. Program Code Generator for Cardiac Electrophysiology Simulation with Automatic PDE Boundary Condition Handling

    PubMed Central

    Punzalan, Florencio Rusty; Kunieda, Yoshitoshi; Amano, Akira

    2015-01-01

    Clinical and experimental studies involving human hearts can have certain limitations. Methods such as computer simulations can be an important alternative or supplemental tool. Physiological simulation at the tissue or organ level typically involves the handling of partial differential equations (PDEs). Boundary conditions and distributed parameters, such as those used in pharmacokinetics simulation, add to the complexity of the PDE solution. These factors can tailor PDE solutions and their corresponding program code to specific problems. Boundary condition and parameter changes in the customized code are usually prone to errors and time-consuming. We propose a general approach for handling PDEs and boundary conditions in computational models using a replacement scheme for discretization. This study is an extension of a program generator that we introduced in a previous publication. The program generator can generate code for multi-cell simulations of cardiac electrophysiology. Improvements to the system allow it to handle simultaneous equations in the biological function model as well as implicit PDE numerical schemes. The replacement scheme involves substituting all partial differential terms with numerical solution equations. Once the model and boundary equations are discretized with the numerical solution scheme, instances of the equations are generated to undergo dependency analysis. The result of the dependency analysis is then used to generate the program code. The resulting program code is in the Java or C programming language. To validate the automatic handling of boundary conditions in the program code generator, we generated simulation code using the FHN, Luo-Rudy 1, and Hund-Rudy cell models and ran cell-to-cell coupling and action potential propagation simulations. One of the simulations is based on a published experiment, and simulation results are compared with the experimental data. We conclude that the proposed program code generator can be used to generate code for physiological simulations and provides a tool for studying cardiac electrophysiology. PMID:26356082
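    As a minimal hand-written analogue of what such a generator emits, the sketch below discretizes a 1-D diffusive PDE term with the boundary condition folded directly into the stencil. This is our own illustration under stated assumptions (no-flux boundaries, explicit Euler time stepping), not the authors' generated code; the function names are hypothetical.

    ```python
    def laplacian_1d(u, dx):
        """Flux-form central-difference Laplacian with no-flux (Neumann) boundaries.

        Boundary faces carry zero flux, so the discrete total sum(u) is
        conserved exactly, mimicking a sealed tissue boundary.
        """
        n = len(u)
        flux = [0.0] * (n + 1)          # flux[i] across the face left of cell i
        for i in range(1, n):
            flux[i] = (u[i] - u[i - 1]) / dx
        return [(flux[i + 1] - flux[i]) / dx for i in range(n)]

    def diffuse(u, steps, dt, dx, D):
        """Explicit Euler stepping of u_t = D * u_xx (stable for D*dt/dx**2 <= 0.5)."""
        for _ in range(steps):
            lap = laplacian_1d(u, dx)
            u = [ui + dt * D * li for ui, li in zip(u, lap)]
        return u
    ```

    Writing the boundary condition into the flux array is the "replacement" idea in miniature: the boundary equation is substituted into the discretized stencil before any code is emitted, so no special-case branches survive in the simulation loop.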

  16. Combined phosphorescence-holographic approach for singlet oxygen detection in biological media

    NASA Astrophysics Data System (ADS)

    Semenova, I. V.; Belashov, A. V.; Beltukova, D. M.; Petrov, N. V.; Vasyutinskii, O. S.

    2015-06-01

    The paper presents a novel combined approach aimed to detect and monitor singlet oxygen molecules in biological specimens by means of the simultaneous recording and monitoring of their deactivation dynamics in the two complementary channels: radiative and nonradiative. The approach involves both the direct registration of phosphorescence at the wavelength of about 1270 nm caused by radiative relaxation of excited singlet oxygen molecules and holographic recording of thermal disturbances in the medium produced by their nonradiative relaxation. The data provides a complete set of information on singlet oxygen location and dynamics in the medium. The approach was validated in the case study of photosensitized generation of singlet oxygen in onion cell structures.

  17. NextGen Future Safety Assessment Game

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Gheorghe, Adrian; Jones, Sharon Monica

    2010-01-01

    The successful implementation of the next generation infrastructure systems requires solid understanding of their technical, social, political and economic aspects along with their interactions. The lack of historical data that relate to the long-term planning of complex systems introduces unique challenges for decision makers and involved stakeholders which in turn result in unsustainable systems. Also, the need to understand the infrastructure at the societal level and capture the interaction between multiple stakeholders becomes important. This paper proposes a methodology in order to develop a holistic approach aiming to provide an alternative subject-matter expert (SME) elicitation and data collection method for future sociotechnical systems. The methodology is adapted to Next Generation Air Transportation System (NextGen) decision making environment in order to demonstrate the benefits of this holistic approach.

  18. NextGen Future Safety Assessment Game

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Gheorghe, Adrian; Jones, Sharon Monica

    2011-01-01

    The successful implementation of the next generation infrastructure systems requires solid understanding of their technical, social, political and economic aspects along with their interactions. The lack of historical data that relate to the long-term planning of complex systems introduces unique challenges for decision makers and involved stakeholders which in turn result in unsustainable systems. Also, the need to understand the infrastructure at the societal level and capture the interaction between multiple stakeholders becomes important. This paper proposes a methodology in order to develop a holistic approach aiming to provide an alternative subject-matter expert (SME) elicitation and data collection method for future sociotechnical systems. The methodology is adapted to Next Generation Air Transportation System (NextGen) decision making environment in order to demonstrate the benefits of this holistic approach.

  19. Next Generation Sequencing Approach in a Prenatal Case of Cardio-Facio-Cutaneus Syndrome.

    PubMed

    Mucciolo, Mafalda; Dello Russo, Claudio; D'Emidio, Laura; Mesoraca, Alvaro; Giorlandino, Claudio

    2016-06-16

    Cardiofaciocutaneous syndrome (CFCS) belongs to a group of developmental disorders due to defects in the Ras/Mitogen-Activated Protein Kinase (RAS/MAPK) signaling pathway named RASopathies. While postnatal presentation of these disorders is well known, the prenatal and neonatal characteristics are less recognized. Noonan syndrome, Costello syndrome, and CFCS diagnosis should be considered in pregnancies with a normal karyotype and in the case of ultrasound findings such as increased nuchal translucency, polyhydramnios, macrosomia and cardiac defect. Because all the RASopathies share similar clinical features, their molecular characterization is complex, time consuming and expensive. Here we report a case of CFCS prenatally diagnosed through Next Generation Prenatal Diagnosis (NGPD), a new targeted approach that allows us to concurrently investigate all the genes involved in the RASopathies.

  20. Next Generation Sequencing Approach in a Prenatal Case of Cardio-Facio-Cutaneus Syndrome

    PubMed Central

    Mucciolo, Mafalda; Dello Russo, Claudio; D’Emidio, Laura; Mesoraca, Alvaro; Giorlandino, Claudio

    2016-01-01

    Cardiofaciocutaneous syndrome (CFCS) belongs to a group of developmental disorders due to defects in the Ras/Mitogen-Activated Protein Kinase (RAS/MAPK) signaling pathway named RASopathies. While postnatal presentation of these disorders is well known, the prenatal and neonatal characteristics are less recognized. Noonan syndrome, Costello syndrome, and CFCS diagnosis should be considered in pregnancies with a normal karyotype and in the case of ultrasound findings such as increased nuchal translucency, polyhydramnios, macrosomia and cardiac defect. Because all the RASopathies share similar clinical features, their molecular characterization is complex, time consuming and expensive. Here we report a case of CFCS prenatally diagnosed through Next Generation Prenatal Diagnosis (NGPD), a new targeted approach that allows us to concurrently investigate all the genes involved in the RASopathies. PMID:27322245

  1. To create or to recall? Neural mechanisms underlying the generation of creative new ideas

    PubMed Central

    Benedek, Mathias; Jauk, Emanuel; Fink, Andreas; Koschutnig, Karl; Reishofer, Gernot; Ebner, Franz; Neubauer, Aljoscha C.

    2014-01-01

    This fMRI study investigated brain activation during creative idea generation using a novel approach allowing spontaneous self-paced generation and expression of ideas. Specifically, we addressed the fundamental question of what brain processes are relevant for the generation of genuinely new creative ideas, in contrast to the mere recollection of old ideas from memory. In general, creative idea generation (i.e., divergent thinking) was associated with extended activations in the left prefrontal cortex and the right medial temporal lobe, and with deactivation of the right temporoparietal junction. The generation of new ideas, as opposed to the retrieval of old ideas, was associated with stronger activation in the left inferior parietal cortex which is known to be involved in mental simulation, imagining, and future thought. Moreover, brain activation in the orbital part of the inferior frontal gyrus was found to increase as a function of the creativity (i.e., originality and appropriateness) of ideas pointing to the role of executive processes for overcoming dominant but uncreative responses. We conclude that the process of idea generation can be generally understood as a state of focused internally-directed attention involving controlled semantic retrieval. Moreover, left inferior parietal cortex and left prefrontal regions may subserve the flexible integration of previous knowledge for the construction of new and creative ideas. PMID:24269573

  2. Deciphering the Structural Requirements of Nucleoside Bisubstrate Analogues for Inhibition of MbtA in Mycobacterium tuberculosis: A FB-QSAR Study and Combinatorial Library Generation for Identifying Potential Hits.

    PubMed

    Maganti, Lakshmi; Das, Sanjit Kumar; Mascarenhas, Nahren Manuel; Ghoshal, Nanda

    2011-10-01

    Tuberculosis infections resistant to conventional drug therapy have re-emerged and steadily risen over the last decade. Inhibitors of the aryl acid adenylating enzyme known as MbtA, involved in siderophore biosynthesis in Mycobacterium tuberculosis, are being explored as potential antitubercular agents. The ability to identify fragments that interact with a biological target is a key step in fragment based drug design (FBDD). To expand the boundaries of the quantitative structure-activity relationship (QSAR) paradigm, we have proposed a fragment-based QSAR methodology, referred to herein as FB-QSAR, for deciphering the structural requirements of a series of nucleoside bisubstrate analogs for inhibition of MbtA, a key enzyme in the siderophore biosynthetic pathway. For the development of FB-QSAR models, statistical techniques such as stepwise multiple linear regression (SMLR), genetic function approximation (GFA) and GFA spline were used. The predictive ability of the generated models was validated using different statistical metrics, and similarity-based coverage estimation was carried out to define applicability boundaries. To aid the creation of novel antituberculosis compounds, a bioisosteric database was enumerated using a combichem approach, enabling mining in a lead-like chemical space. The generated library was screened using an integrated in-silico approach and potential hits were identified. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. A cost-effective high-throughput metabarcoding approach powerful enough to genotype ~44 000 year-old rodent remains from Northern Africa.

    PubMed

    Guimaraes, S; Pruvost, M; Daligault, J; Stoetzel, E; Bennett, E A; Côté, N M-L; Nicolas, V; Lalis, A; Denys, C; Geigl, E-M; Grange, T

    2017-05-01

    We present a cost-effective metabarcoding approach, aMPlex Torrent, which relies on an improved multiplex PCR adapted to highly degraded DNA, combining barcoding and next-generation sequencing to simultaneously analyse many heterogeneous samples. We demonstrate the strength of these improvements by generating a phylochronology through the genotyping of ancient rodent remains from a Moroccan cave whose stratigraphy covers the last 120 000 years. Rodents are important for epidemiology, agronomy and ecological investigations and can act as bioindicators for human- and/or climate-induced environmental changes. Efficient and reliable genotyping of ancient rodent remains has the potential to deliver valuable phylogenetic and paleoecological information. The analysis of multiple ancient skeletal remains of very small size with poor DNA preservation, however, requires a sensitive high-throughput method to generate sufficient data. We show this approach to be particularly well suited to accessing this otherwise difficult taxonomic and genetic resource. As a highly scalable, lower cost and less labour-intensive alternative to targeted sequence capture approaches, we propose the aMPlex Torrent strategy to be a useful tool for the genetic analysis of multiple degraded samples in studies involving ecology, archaeology, conservation and evolutionary biology. © 2016 John Wiley & Sons Ltd.

  4. Comparisons of node-based and element-based approaches of assigning bone material properties onto subject-specific finite element models.

    PubMed

    Chen, G; Wu, F Y; Liu, Z C; Yang, K; Cui, F

    2015-08-01

    Subject-specific finite element (FE) models can be generated from computed tomography (CT) datasets of a bone. A key step is assigning material properties automatically onto finite element models, which remains a great challenge. This paper proposes a node-based assignment approach and also compares it with the element-based approach in the literature. Both approaches were implemented using ABAQUS. The assignment procedure is divided into two steps: generating the data file of the image intensity of a bone in a MATLAB program and reading the data file into ABAQUS via user subroutines. The node-based approach assigns the material properties to each node of the finite element mesh, while the element-based approach assigns the material properties directly to each integration point of an element. Both approaches are independent of the element type. A number of FE meshes are tested and both give accurate solutions; comparatively, the node-based approach involves less programming effort. The node-based approach is also independent of the analysis type; it has been tested on the nonlinear analysis of a Sawbone femur. The node-based approach substantially improves the level of automation of the assignment procedure of bone material properties. It is the simplest and most powerful approach that is applicable to many types of analyses and elements. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
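    The two-step procedure described above (exporting image intensities, then mapping them to material properties at each node) can be sketched outside ABAQUS. The following Python sketch illustrates a node-based assignment: each mesh node samples the CT volume with a nearest-voxel lookup and the intensity is converted to a Young's modulus. The HU-to-density and density-to-modulus coefficients, and all function names, are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

def intensity_to_modulus(hu, a=0.0, b=0.0008, c=6850.0, d=1.49):
    """Map CT intensity (Hounsfield units) to Young's modulus (MPa).

    Two-step empirical mapping: HU -> apparent density (g/cm^3) via a
    linear calibration, then density -> modulus via a power law.
    Coefficients are illustrative placeholders only.
    """
    rho = a + b * hu                       # linear HU-to-density calibration
    return c * np.maximum(rho, 0.0) ** d   # power-law density-to-modulus

def assign_node_properties(node_coords, ct_volume, voxel_size):
    """Node-based assignment: sample the CT volume at each mesh node
    (nearest-voxel lookup) and convert the intensity to a modulus."""
    idx = np.clip(np.round(node_coords / voxel_size).astype(int),
                  0, np.array(ct_volume.shape) - 1)
    hu = ct_volume[idx[:, 0], idx[:, 1], idx[:, 2]]
    return intensity_to_modulus(hu)

# toy 4x4x4 CT volume (uniform intensity) and three mesh nodes
ct = np.full((4, 4, 4), 1000.0)
nodes = np.array([[0.5, 0.5, 0.5], [1.0, 1.0, 1.0], [2.9, 2.9, 2.9]])
E = assign_node_properties(nodes, ct, voxel_size=1.0)
print(E.shape)  # -> (3,)
```

    An element-based variant would instead sample the volume at each element's integration points; the lookup code is the same, only the query coordinates differ.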

  5. Dispersed solar thermal generation employing parabolic dish-electric transport with field modulated generator systems

    NASA Technical Reports Server (NTRS)

    Ramakumar, R.; Bahrami, K.

    1981-01-01

    This paper discusses the application of field modulated generator systems (FMGS) to dispersed solar-thermal-electric generation from a parabolic dish field with electric transport. Each solar generation unit is rated at 15 kWe and the power generated by an array of such units is electrically collected for insertion into an existing utility grid. Such an approach appears to be most suitable when the heat engine rotational speeds are high (greater than 6000 r/min) and, in particular, if they are operated in the variable speed mode and if utility-grade a.c. is required for direct insertion into the grid without an intermediate electric energy storage and reconversion system. Predictions of overall efficiencies based on conservative efficiency figures for the FMGS are in the range of 25 per cent and should be encouraging to those involved in the development of cost-effective dispersed solar thermal power systems.

  6. Comparison of the different approaches to generate holograms from data acquired with a Kinect sensor

    NASA Astrophysics Data System (ADS)

    Kang, Ji-Hoon; Leportier, Thibault; Ju, Byeong-Kwon; Song, Jin Dong; Lee, Kwang-Hoon; Park, Min-Chul

    2017-05-01

    Data of real scenes acquired in real-time with a Kinect sensor can be processed with different approaches to generate a hologram. 3D models can be generated from a point cloud or a mesh representation. The advantage of the point cloud approach is that the computation process is well established, since it involves only diffraction and propagation of point sources between parallel planes. On the other hand, the mesh representation makes it possible to reduce the number of elements necessary to represent the object. Then, even though the computation time for the contribution of a single element increases compared to a simple point, the total computation time can be reduced significantly. However, the algorithm is more complex since propagation of elemental polygons between non-parallel planes must be implemented. Finally, since a depth map of the scene is acquired at the same time as the intensity image, a depth layer approach can also be adopted. This technique is appropriate for fast computation since propagation of an optical wavefront from one plane to another can be handled efficiently with the fast Fourier transform. Fast computation with the depth layer approach is convenient for real-time applications, but the point cloud method is more appropriate when high resolution is needed. In this study, since the Kinect can be used to obtain both a point cloud and a depth map, we examine the different approaches that can be adopted for hologram computation and compare their performance.
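    The depth-layer technique mentioned above propagates each depth slice to the hologram plane with one FFT pair. Below is a minimal numpy sketch of this idea using the angular spectrum method; the grid size, pixel pitch, wavelength, and function names are illustrative assumptions rather than the authors' implementation, and evanescent components are simply clamped for brevity.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dz, dx):
    """Propagate a complex optical field a distance dz between parallel
    planes using the angular spectrum method (one FFT pair)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)            # spatial frequencies (1/m)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * dz)                # free-space transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)

def depth_layer_hologram(layers, depths, wavelength, dx):
    """Sum the propagated wavefronts of each depth layer at the hologram
    plane. `layers` are real amplitude images from the depth map split;
    `depths` are their distances to the hologram plane (m)."""
    holo = np.zeros_like(layers[0], dtype=complex)
    for amp, z in zip(layers, depths):
        holo += angular_spectrum_propagate(amp.astype(complex),
                                           wavelength, z, dx)
    return holo

# toy example: two 64x64 layers at different depths
layers = [np.ones((64, 64)), np.zeros((64, 64))]
layers[1][32, 32] = 1.0
holo = depth_layer_hologram(layers, depths=[0.01, 0.02],
                            wavelength=633e-9, dx=10e-6)
print(holo.shape)  # -> (64, 64)
```

    The cost is one FFT pair per layer regardless of scene content, which is why the depth-layer route suits real-time use, whereas point-cloud summation scales with the number of scene points.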

  7. The Trojan female technique: a novel, effective and humane approach for pest population control.

    PubMed

    Gemmell, Neil J; Jalilzadeh, Aidin; Didham, Raphael K; Soboleva, Tanya; Tompkins, Daniel M

    2013-12-22

    Humankind's ongoing battle with pest species spans millennia. Pests cause or carry disease, damage or consume food crops and other resources, and drive global environmental change. Conventional approaches to pest management usually involve lethal control, but such approaches are costly, of varying efficiency and often have ethical issues. Thus, pest management via control of reproductive output is increasingly considered an optimal solution. One of the most successful such 'fertility control' strategies developed to date is the sterile male technique (SMT), in which large numbers of sterile males are released into a population each generation. However, this approach is time-consuming, labour-intensive and costly. We use mathematical models to test a new twist on the SMT, using maternally inherited mitochondrial (mtDNA) mutations that affect male, but not female reproductive fitness. 'Trojan females' carrying such mutations, and their female descendants, produce 'sterile-male'-equivalents under natural conditions over multiple generations. We find that the Trojan female technique (TFT) has the potential to be a novel humane approach for pest control. Single large releases and relatively few small repeat releases of Trojan females both provided effective and persistent control within relatively few generations. Although greatest efficacy was predicted for high-turnover species, the additive nature of multiple releases made the TFT applicable to the full range of life histories modelled. The extensive conservation of mtDNA among eukaryotes suggests this approach could have broad utility for pest control.
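    The population-suppression logic of the TFT can be illustrated with a toy discrete-generation model: Trojan daughters inherit the mtDNA mutation, Trojan sons are effectively sterile, and female mating success is limited by the fraction of fertile males. This is a deliberately minimal sketch under stated assumptions, not the authors' model; all parameter values and names are illustrative.

```python
def tft_generation(wild_f, trojan_f, fecundity=2.0, survival=0.5):
    """One discrete generation of a minimal Trojan-female model.

    Illustrative assumptions: each mated female leaves
    fecundity * survival daughters and the same number of sons;
    sons of Trojan females are sterile; the probability a female
    mates equals the fraction of males that are fertile.
    """
    fertile_males = wild_f * fecundity * survival    # sons of wild females
    sterile_males = trojan_f * fecundity * survival  # sons of Trojan females
    total_males = fertile_males + sterile_males
    mated = min(1.0, fertile_males / total_males) if total_males else 0.0
    next_wild = wild_f * mated * fecundity * survival
    next_trojan = trojan_f * mated * fecundity * survival
    return next_wild, next_trojan

# single large release of Trojan females into a stable population
wild, trojan = 1000.0, 1000.0
for _ in range(10):
    wild, trojan = tft_generation(wild, trojan)
print(wild)  # -> 0.9765625 (collapse from 1000 in ten generations)
```

    With no release (trojan_f = 0) the same parameters give exact replacement, so the decline above is driven entirely by the sterile-male-equivalent effect persisting across generations.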

  8. An Efficient, Simple, and Noninvasive Procedure for Genotyping Aquatic and Nonaquatic Laboratory Animals.

    PubMed

    Okada, Morihiro; Miller, Thomas C; Roediger, Julia; Shi, Yun-Bo; Schech, Joseph Mat

    2017-09-01

    Various animal models are indispensable in biomedical research. Increasing awareness and regulations have prompted the adoption of more humane approaches in the use of laboratory animals. With the development of easier and faster methodologies to generate genetically altered animals, convenient and humane methods to genotype these animals are important for research involving such animals. Here, we report skin swabbing as a simple and noninvasive method for extracting genomic DNA from mice and frogs for genotyping. We show that this method is highly reliable and suitable for both immature and adult animals. Our method offers a simpler and more humane approach for genotyping vertebrate animals.

  9. Dependence on collision energy of the stereodynamical properties of the 18O + 32O2 exchange reaction

    NASA Astrophysics Data System (ADS)

    Privat, E.; Guillon, G.; Honvault, P.

    2018-06-01

    We report a quantum stereodynamical study of the 18O + 16O16O(v = 0, j = 1) → 18O16O(v‧ = 0, j‧) + 16O oxygen exchange reaction at four different collision energies. We calculated the polarisation moments and generated stereodynamical portraits related to the key vectors involved in this collision process. Ozone complex-forming approaches of reactants are then deduced. The results indicate that different approaches are possible but strongly depend on the collision energy and other parameters of the collision. We also conclude that the reaction globally tends to favour a perpendicular approach with increasing energy.

  10. Decoding the Heart through Next Generation Sequencing Approaches.

    PubMed

    Pawlak, Michal; Niescierowicz, Katarzyna; Winata, Cecilia Lanny

    2018-06-07

    Vertebrate organs develop through a complex process which involves interaction between multiple signaling pathways at the molecular, cell, and tissue levels. Heart development is an example of such a complex process which, when disrupted, results in congenital heart disease (CHD). This complexity necessitates a holistic approach which allows the visualization of genome-wide interaction networks, as opposed to assessment of limited subsets of factors. Genomics offers a powerful solution to address the problem of biological complexity by enabling the observation of molecular processes at a genome-wide scale. The emergence of next generation sequencing (NGS) technology has facilitated the expansion of genomics, increasing its output capacity and applicability in various biological disciplines. The application of NGS in various aspects of heart biology has resulted in new discoveries, generating novel insights into this field of study. Here we review the contributions of NGS technology to the understanding of heart development and its disruption, reflected in CHD, and discuss how emerging NGS-based methodologies can contribute to the further understanding of heart repair.

  11. Generation of Functional Thyroid Tissue Using 3D-Based Culture of Embryonic Stem Cells.

    PubMed

    Antonica, Francesco; Kasprzyk, Dominika Figini; Schiavo, Andrea Alex; Romitti, Mírian; Costagliola, Sabine

    2017-01-01

    During the last decade three-dimensional (3D) cultures of pluripotent stem cells have been intensively used to understand morphogenesis and molecular signaling important for the embryonic development of many tissues. In addition, pluripotent stem cells have been shown to be a valid tool for the in vitro modeling of several congenital or chronic human diseases, opening new possibilities to study their physiopathology without using animal models. Even more interestingly, 3D culture has proved to be a powerful and versatile tool to successfully generate functional tissues ex vivo. Using similar approaches, we here describe a protocol for the generation of functional thyroid tissue using mouse embryonic stem cells and give all the details and references for its characterization and analysis both in vitro and in vivo. This model is a valid approach to study the expression and the function of genes involved in the correct morphogenesis of thyroid gland, to elucidate the mechanisms of production and secretion of thyroid hormones and to test anti-thyroid drugs.

  12. Mobilizing cross-sector community partnerships to address the needs of criminal justice-involved older adults: a framework for action.

    PubMed

    Metzger, Lia; Ahalt, Cyrus; Kushel, Margot; Riker, Alissa; Williams, Brie

    2017-09-11

    Purpose The rapidly increasing number of older adults cycling through local criminal justice systems (jails, probation, and parole) suggests a need for greater collaboration among a diverse group of local stakeholders, including professionals from healthcare delivery, public health, and criminal justice, and directly affected individuals, their families, and advocates. The purpose of this paper is to develop a framework that local communities can use to understand and begin to address the needs of criminal justice-involved older adults. Design/methodology/approach The framework comprised five steps: solicit input from community stakeholders to identify pressing challenges facing criminal justice-involved older adults; conduct needs assessments of criminal justice-involved older adults and the professionals working with them; implement quick-response interventions based on the needs assessments; share findings with community stakeholders and generate public feedback; and engage an interdisciplinary group to develop an action plan to optimize services. Findings A five-step framework for creating an interdisciplinary community response is an effective approach to action planning and broad stakeholder engagement on behalf of older adults cycling through the criminal justice system. Originality/value This study proposes the Criminal Justice Involved Older Adults in Need of Treatment Initiative Framework for establishing an interdisciplinary community response to the growing population of medically and socially vulnerable criminal justice-involved older adults.

  13. Sustainable development: concept, value and practice.

    PubMed

    Barrow, C J

    1995-11-01

    The author discusses the concept of sustainable development (SD) and explores the effectiveness of implementation strategies. Approaches to implementing sustainable development include 1) "a stocktaking approach" that involves regional and national environmental audits, resource accounting, and national environmental action plans; and 2) "changes in people's attitudes." Each approach reinforces the other. Eden examined the International Chamber of Commerce reactions to the 1987 Brundtland Report and found that business generally favored SD over no-growth environmentalism. SD occurs as a process with a variety of routes that most often involve technology that improves upon traditional methods or protects from the destructive effects of modernization. SD assures that environmental quality is maintained, and economic and social development enhances resources and the environment. SD allows for the best quality of life for people. SD assures that future generations do not have reduced options. SD prevents or avoids major natural catastrophes. The requirements are corrective treatment of root causes of nonsustainability and a shift away from consumption-oriented life styles. Trade-offs must be made. Politicians and planners must use a longer planning perspective. There must be transition to smaller population numbers. Resource conflicts must be resolved. Pollution must be reduced and resources must not be wasted. Local resources should be used for agriculture, industry, and power generation. There should be a transition to a more equitable sharing of resources. The author identifies 12 other requirements. Progress thus far is disappointing and not demonstrably evident.

  14. Involving Older People in the Design, Development, and Delivery of an Innovative Module on Aging for Undergraduate Students

    ERIC Educational Resources Information Center

    Tullo, Ellen; Greaves, Laura; Wakeling, Luisa

    2016-01-01

    As the number of older people in society increases, gaining an awareness of the needs of an aging population is important for university students from all academic backgrounds. Using a multidisciplinary approach to aging, we developed a new teaching module (NU-AGE [Newcastle University Aging Generations Education]) aimed at students enrolled in…

  15. Small Impacts on Mars: Atmospheric Effects

    NASA Technical Reports Server (NTRS)

    Greeley, Ronald; Nemtchinov, Ivan V.

    2002-01-01

    The objectives of this investigation were to study the interaction of the atmosphere with the surface of Mars through the impact of small objects that would generate dust and set the dust into motion in the atmosphere. The approach involved numerical simulations of impacts and experiments under controlled conditions. Attachment: Atmospheric disturbances and radiation impulses caused by large-meteoroid impact in the surface of Mars.

  16. Thai Automatic Speech Recognition

    DTIC Science & Technology

    2005-01-01

    used in an external DARPA evaluation involving medical scenarios between an American Doctor and a naïve monolingual Thai patient. 2. Thai Language... dictionary generation more challenging, and (3) the lack of word segmentation, which calls for automatic segmentation approaches to make n-gram language...requires a dictionary and provides various segmentation algorithms to automatically select suitable segmentations. Here we used a maximal matching

  17. Closed-Loop Rehabilitation of Age-Related Cognitive Disorders

    PubMed Central

    Mishra, Jyoti; Gazzaley, Adam

    2015-01-01

    Cognitive deficits are common in older adults, as a result of both the natural aging process and neurodegenerative disease. Although medical advancements have successfully prolonged the human lifespan, the challenge of remediating cognitive aging remains. The authors discuss the current state of cognitive therapeutic interventions and then present the need for development and validation of more powerful neurocognitive therapeutics. They propose that the next generation of interventions be implemented as closed-loop systems that target specific neural processing deficits, incorporate quantitative feedback to the individual and clinician, and are personalized to the individual’s neurocognitive capacities using real-time performance-adaptive algorithms. This approach should be multimodal and seamlessly integrate other treatment approaches, including neurofeedback and transcranial electrical stimulation. This novel approach will involve the generation of software that engages the individual in an immersive and enjoyable game-based interface, integrated with advanced biosensing hardware, to maximally harness plasticity and assure adherence. Introducing such next-generation closed-loop neurocognitive therapeutics into the mainstream of our mental health care system will require the combined efforts of clinicians, neuroscientists, bioengineers, software game developers, and industry and policy makers working together to meet the challenges and opportunities of translational neuroscience in the 21st century. PMID:25520029

  18. Gaining a Competitive Edge through Action Design Research

    NASA Astrophysics Data System (ADS)

    Alexa, L.; Alexa, M.; Avasilcăi, S.

    2016-08-01

    The current business environment is characterized by increased competition and a highly innovative approach to creating products and services that better respond to customers' needs and expectations. In this specific context, research approaches need to be more flexible and business oriented, and so throughout the paper we have used a method that combines design research and action research, named Action Design Research, which generates prescriptive design knowledge through building and evaluating IT artifacts in an organizational setting [1]. Following the Action Design Research stages and principles (problem identification; building, intervention and evaluation; reflection and learning; and formalization of learning), the research team developed an online instrument used to actively involve the consumer in the product development process, in order to generate better consumer insight regarding their needs and desires and to design and/or adjust the product accordingly. The customer engagement IT tool created and tested by using Action Design Research, E-PICUS, has been developed within the framework of the research project "E-solutions for innovation through customer pro-active involvement in value creation to increase organisational competitiveness (E-PICUS)", PN-II-PT-PCCA-2013-4-1811, currently in progress.

  19. Theoretical Estimation of Thermal Effects in Drilling of Woven Carbon Fiber Composite

    PubMed Central

    Díaz-Álvarez, José; Olmedo, Alvaro; Santiuste, Carlos; Miguélez, María Henar

    2014-01-01

    Carbon Fiber Reinforced Polymer (CFRP) composites are extensively used in structural applications due to their attractive properties. Although the components are usually made near net shape, machining processes are needed to achieve dimensional tolerance and assembly requirements. Drilling is a common operation required for further mechanical joining of the components. CFRPs are vulnerable to processing-induced damage, mainly delamination, fiber pull-out, and thermal degradation, with drilling-induced defects being one of the main causes of component rejection during manufacturing processes. Despite the importance of analyzing the thermal phenomena involved in the machining of composites, only a few authors have focused their attention on this problem, most of them using an experimental approach. The temperature of the workpiece can affect the surface quality of the component, and its measurement during processing is difficult. The estimation of the amount of heat generated during drilling is important; however, numerical modeling of drilling processes involves a high computational cost. This paper presents a combined approach to thermal analysis of composite drilling, using both an analytical estimation of the heat generated during drilling and numerical modeling of heat propagation. Promising results for indirect detection of the risk of thermal damage, through the measurement of thrust force and cutting torque, are obtained. PMID:28788685

  20. An efficient approach to BAC based assembly of complex genomes.

    PubMed

    Visendi, Paul; Berkman, Paul J; Hayashi, Satomi; Golicz, Agnieszka A; Bayer, Philipp E; Ruperao, Pradeep; Hurgobin, Bhavna; Montenegro, Juan; Chan, Chon-Kit Kenneth; Staňková, Helena; Batley, Jacqueline; Šimková, Hana; Doležel, Jaroslav; Edwards, David

    2016-01-01

    There has been an exponential growth in the number of genome sequencing projects since the introduction of next generation DNA sequencing technologies. Genome projects have increasingly involved assembly of whole genome data which produces inferior assemblies compared to traditional Sanger sequencing of genomic fragments cloned into bacterial artificial chromosomes (BACs). While whole genome shotgun sequencing using next generation sequencing (NGS) is relatively fast and inexpensive, this method is extremely challenging for highly complex genomes, where polyploidy or high repeat content confounds accurate assembly, or where a highly accurate 'gold' reference is required. Several attempts have been made to improve genome sequencing approaches by incorporating NGS methods, to variable success. We present the application of a novel BAC sequencing approach which combines indexed pools of BACs, Illumina paired read sequencing, a sequence assembler specifically designed for complex BAC assembly, and a custom bioinformatics pipeline. We demonstrate this method by sequencing and assembling BAC cloned fragments from bread wheat and sugarcane genomes. We demonstrate that our assembly approach is accurate, robust, cost effective and scalable, with applications for complete genome sequencing in large and complex genomes.

  1. Pearls and pitfalls in genetic studies of migraine.

    PubMed

    Eising, Else; de Vries, Boukje; Ferrari, Michel D; Terwindt, Gisela M; van den Maagdenberg, Arn M J M

    2013-06-01

    Migraine is a prevalent neurovascular brain disorder with a strong genetic component, and different methodological approaches have been implemented to identify the genes involved. This review focuses on pearls and pitfalls of these approaches and genetic findings in migraine. Common forms of migraine (i.e. migraine with and without aura) are thought to have a polygenic make-up, whereas rare familial hemiplegic migraine (FHM) presents with a monogenic pattern of inheritance. Until a few years ago only studies in FHM yielded causal genes, which were identified by a classical linkage analysis approach. Functional analyses of FHM gene mutations in cellular and transgenic animal models suggest abnormal glutamatergic neurotransmission as a possible key disease mechanism. Recently, a number of genes were discovered for the common forms of migraine using a genome-wide association (GWA) approach, which sheds first light on the pathophysiological mechanisms involved. Novel technological strategies such as next-generation sequencing, which can be implemented in future genetic migraine research, may aid the identification of novel FHM genes and promote the search for the missing heritability of common migraine.

  2. The anarchic hand syndrome and utilization behavior: a window onto agentive self-awareness.

    PubMed

    Pacherie, Elisabeth

    2007-01-01

    Two main approaches can be discerned in the literature on agentive self-awareness: a top-down approach, according to which agentive self-awareness is fundamentally holistic in nature and involves the operations of a central-systems narrator, and a bottom-up approach that sees agentive self-awareness as produced by low-level processes grounded in the very machinery responsible for motor production and control. Neither approach is entirely satisfactory if taken in isolation; however, the question of whether their combination would yield a full account of agentive self-awareness remains very much open. In this paper, I contrast two disorders affecting the control of voluntary action: the anarchic hand syndrome and utilization behavior. Although in both conditions patients fail to inhibit actions that are elicited by objects in the environment but inappropriate with respect to the wider context, these actions are experienced in radically different ways by the two groups of patients. I discuss how top-down and bottom-up processes involved in the generation of agentive self-awareness would have to be related in order to account for these differences.

  3. Enhancement in production of recombinant two-chain Insulin Glargine by over-expression of Kex2 protease in Pichia pastoris.

    PubMed

    Sreenivas, Suma; Krishnaiah, Sateesh M; Govindappa, Nagaraja; Basavaraju, Yogesh; Kanojia, Komal; Mallikarjun, Niveditha; Natarajan, Jayaprakash; Chatterjee, Amarnath; Sastry, Kedarnath N

    2015-01-01

    Glargine is an analog of insulin currently produced by recombinant DNA technology using two different hosts, namely Escherichia coli and Pichia pastoris. Production from E. coli involves the steps of extraction of inclusion bodies by cell lysis, refolding, proteolytic cleavage and purification. In P. pastoris, a single-chain precursor with appropriate disulfide bonding is secreted into the medium. Downstream processing currently involves the use of trypsin, which converts the precursor into the two-chain final product. The use of trypsin in the process generates additional impurities due to the presence of Lys and Arg residues in the Glargine molecule. In this study, we describe an alternate approach involving over-expression of the endogenous Kex2 proprotein convertase, taking advantage of the dibasic amino acid sequence (Arg-Arg) at the end of the B-chain of Glargine. KEX2 gene over-expression in Pichia was accomplished by using promoters of varying strengths to ensure production of greater levels of fully functional two-chain Glargine product, confirmed by HPLC and mass analysis. In conclusion, this new production process involving Kex2 protease over-expression improves downstream process efficiency, reduces the levels of impurities generated and decreases the use of raw materials.

  4. Holistic systems biology approaches to molecular mechanisms of human helper T cell differentiation to functionally distinct subsets.

    PubMed

    Chen, Z; Lönnberg, T; Lahesmaa, R

    2013-08-01

    Current knowledge of helper T cell differentiation largely relies on data generated from mouse studies. To develop therapeutical strategies combating human diseases, understanding the molecular mechanisms how human naïve T cells differentiate to functionally distinct T helper (Th) subsets as well as studies on human differentiated Th cell subsets is particularly valuable. Systems biology approaches provide a holistic view of the processes of T helper differentiation, enable discovery of new factors and pathways involved and generation of new hypotheses to be tested to improve our understanding of human Th cell differentiation and immune-mediated diseases. Here, we summarize studies where high-throughput systems biology approaches have been exploited to human primary T cells. These studies reveal new factors and signalling pathways influencing T cell differentiation towards distinct subsets, important for immune regulation. Such information provides new insights into T cell biology and into targeting immune system for therapeutic interventions. © 2013 John Wiley & Sons Ltd.

  5. Nanostructured thermites based on iodine pentoxide for bio agent defeat systems.

    NASA Astrophysics Data System (ADS)

    Hobosyan, Mkhitar; Kazansky, Alexander; Martirosyan, Karen

    2011-10-01

    The risk of bioterrorist events involving the intentional airborne release of contagious agents has led to the development of new bio agent defeat technologies for both indoor and outdoor use. Novel approaches to defeating harmful biological agents have generated a strong demand for new active materials. The preferred solution is to neutralize the biological agents within the immediate target area using aerosolized biocidal substances released in situ by highly energetic reactions. By using nano-thermite reactions based on I2O5/Al nanoparticles, with energy release of up to 25 kJ/cc, we intend to generate a large quantity of vaporized iodine for spatial deposition onto harmful bacteria to destroy them. In this report, the effect of the reaction products on the growth and survival of Escherichia coli (E. coli) expressing GFP (Green Fluorescent Protein) was investigated. Moreover, we developed an approach to increase the sensitivity of the detection. The study has shown that the I2O5/Al nanosystem is extremely effective at disinfecting harmful biological agents such as E. coli bacteria in seconds.

  6. Developing creativity and problem-solving skills of engineering students: a comparison of web- and pen-and-paper-based approaches

    NASA Astrophysics Data System (ADS)

    Valentine, Andrew; Belski, Iouri; Hamilton, Margaret

    2017-11-01

    Problem-solving is a key engineering skill, yet is an area in which engineering graduates underperform. This paper investigates the potential of using web-based tools to teach students problem-solving techniques without the need to make use of class time. An idea generation experiment involving 90 students was designed. Students were surveyed about their study habits and reported they use electronic-based materials more than paper-based materials while studying, suggesting students may engage with web-based tools. Students then generated solutions to a problem task using either a paper-based template or an equivalent web interface. Students who used the web-based approach performed as well as students who used the paper-based approach, suggesting the technique can be successfully adopted and taught online. Web-based tools may therefore be adopted as supplementary material in a range of engineering courses as a way to increase students' options for enhancing problem-solving skills.

  7. Generating demand and community support for sexual and reproductive health services for young people: A review of the Literature and Programs.

    PubMed

    Kesterton, Amy J; Cabral de Mello, Meena

    2010-09-24

    This review investigates the effectiveness of interventions aimed at generating demand for and use of sexual and reproductive health (SRH) services by young people, and interventions aimed at generating wider community support for their use. Reports and publications were found in the peer-reviewed and grey literature through academic search engines; web searches; the bibliographies of known conference proceedings and papers; and consultation with experts. The studies were reviewed against a set of inclusion criteria, and those that met them were explored in more depth. The evidence base for interventions aimed at both generating demand and community support for SRH services for young people was found to be under-developed, and many available studies do not provide strong evidence. However, the potential of several methods to increase youth uptake has been demonstrated; these include the linking of school education programs with youth-friendly services, life skills approaches, and social marketing and franchising. There is also evidence that the involvement of key community gatekeepers such as parents and religious leaders is vital to generating wider community support. In general, a combined multi-component approach seems most promising, with several success stories to build on. Many areas for further research have been highlighted, and there is a great need for more rigorous evaluation of programmes in this area. In particular, further evaluation of individual components within a multi-component approach is needed to elucidate the most effective interventions.

  8. Engineering kidney cells: reprogramming and directed differentiation to renal tissues.

    PubMed

    Kaminski, Michael M; Tosic, Jelena; Pichler, Roman; Arnold, Sebastian J; Lienkamp, Soeren S

    2017-07-01

    Growing knowledge of how cell identity is determined at the molecular level has enabled the generation of diverse tissue types, including renal cells from pluripotent or somatic cells. Recently, several in vitro protocols involving either directed differentiation or transcription-factor-based reprogramming to kidney cells have been established. Embryonic stem cells or induced pluripotent stem cells can be guided towards a kidney fate by exposing them to combinations of growth factors or small molecules. Here, renal development is recapitulated in vitro resulting in kidney cells or organoids that show striking similarities to mammalian embryonic nephrons. In addition, culture conditions are also defined that allow the expansion of renal progenitor cells in vitro. Another route towards the generation of kidney cells is direct reprogramming. Key transcription factors are used to directly impose renal cell identity on somatic cells, thus circumventing the pluripotent stage. This complementary approach to stem-cell-based differentiation has been demonstrated to generate renal tubule cells and nephron progenitors. In-vitro-generated renal cells offer new opportunities for modelling inherited and acquired renal diseases on a patient-specific genetic background. These cells represent a potential source for developing novel models for kidney diseases, drug screening and nephrotoxicity testing and might represent the first steps towards kidney cell replacement therapies. In this review, we summarize current approaches for the generation of renal cells in vitro and discuss the advantages of each approach and their potential applications.

  9. Generating demand and community support for sexual and reproductive health services for young people: A review of the Literature and Programs

    PubMed Central

    2010-01-01

    Background This review investigates the effectiveness of interventions aimed at generating demand for and use of sexual and reproductive health (SRH) services by young people, and interventions aimed at generating wider community support for their use. Methods Reports and publications were found in the peer-reviewed and grey literature through academic search engines; web searches; the bibliographies of known conference proceedings and papers; and consultation with experts. The studies were reviewed against a set of inclusion criteria, and those that met them were explored in more depth. Results The evidence base for interventions aimed at both generating demand and community support for SRH services for young people was found to be under-developed, and many available studies do not provide strong evidence. However, the potential of several methods to increase youth uptake has been demonstrated; these include the linking of school education programs with youth-friendly services, life skills approaches, and social marketing and franchising. There is also evidence that the involvement of key community gatekeepers such as parents and religious leaders is vital to generating wider community support. In general, a combined multi-component approach seems most promising, with several success stories to build on. Conclusions Many areas for further research have been highlighted, and there is a great need for more rigorous evaluation of programmes in this area. In particular, further evaluation of individual components within a multi-component approach is needed to elucidate the most effective interventions. PMID:20863411

  10. An eco-compatible strategy for the diversity-oriented synthesis of macrocycles exploiting carbohydrate-derived building blocks.

    PubMed

    Maurya, Sushil K; Rana, Rohit

    2017-01-01

    An efficient, eco-compatible diversity-oriented synthesis (DOS) approach for the generation of a library of sugar-embedded macrocyclic compounds of various ring sizes containing a 1,2,3-triazole moiety has been developed. This concise strategy involves the iterative use of readily available sugar-derived alkyne/azide-alkene building blocks coupled through the copper-catalyzed azide-alkyne cycloaddition (CuAAC) reaction, followed by pairing of the linear cycloadduct under greener reaction conditions. Eco-compatibility, mild reaction conditions, greener solvents, easy purification and the avoidance of hazardous and toxic solvents are advantages of this protocol for accessing this important structural class. The diversity of the macrocycles synthesized (13 in total) using a set of standard reaction protocols demonstrates the potential of the new eco-compatible approach for macrocyclic library generation.

  11. Engaging Medical Students in Research: Reaching out to the Next Generation of Physician-Scientists

    PubMed Central

    Cluver, Jeffrey S.; Book, Sarah W.; Brady, Kathleen T.; Thornley, Nicola; Back, Sudie E.

    2013-01-01

    Objective The authors describe a multifaceted educational training approach aimed at increasing medical student involvement in psychiatric research. Method A description of the initiative is provided, including the rationale and expected impact of each component. Results Medical student involvement in research projects has increased steadily since implementation. This applies to summer research projects as well as elective research rotations for senior medical students. Furthermore, a substantial proportion of students who participate in research continue to engage in research activities following completion of the program (e.g., through additional research participation, conference presentations). Conclusion A proactive and well-organized approach to encouraging medical student participation in research can increase the number of students who choose to engage in research and may ultimately help increase the number of physician-scientists. PMID:24913099

  12. Bioinformatics and expressional analysis of cDNA clones from floral buds

    NASA Astrophysics Data System (ADS)

    Pawełkowicz, Magdalena Ewa; Skarzyńska, Agnieszka; Cebula, Justyna; Hincha, Dirck; Ziąbska, Karolina; Pląder, Wojciech; Przybecki, Zbigniew

    2017-08-01

    The application of genomic approaches may serve as an initial step in understanding the complexity of the biochemical networks and cellular processes responsible for the regulation and execution of many developmental tasks. The molecular mechanism of sex expression in cucumber is still not elucidated. A study of differential expression was conducted to identify genes involved in sex determination and floral organ morphogenesis. Herein, we present the generation of expressed sequence tags (ESTs) obtained by differential hybridization (DH) and a subtraction technique (cDNA-DSC), and their characteristic features such as molecular function, involvement in biological processes, expression, and mapping position on the genome.

  13. The feasibility of an efficient drug design method with high-performance computers.

    PubMed

    Yamashita, Takefumi; Ueda, Akihiko; Mitsui, Takashi; Tomonaga, Atsushi; Matsumoto, Shunji; Kodama, Tatsuhiko; Fujitani, Hideaki

    2015-01-01

    In this study, we propose a supercomputer-assisted drug design approach involving all-atom molecular dynamics (MD)-based binding free energy prediction after the traditional design/selection step. Because this prediction is more accurate than the empirical binding affinity scoring of the traditional approach, the compounds selected by the MD-based prediction should be better drug candidates. We discuss the applicability of the new approach using two examples. Although the MD-based binding free energy prediction has a huge computational cost, it is feasible with the latest 10-petaflop-scale computers. The supercomputer-assisted drug design approach also involves two important feedback procedures. The first feedback runs from the MD-based binding free energy prediction step to the drug design step: while experimental feedback usually provides binding affinities for tens of compounds at a time, the supercomputer allows us to obtain the binding free energies of hundreds of compounds simultaneously. Because the number of calculated binding free energies is sufficiently large, the compounds can be classified into different categories whose properties will aid in the design of the next generation of drug candidates. The second feedback, which runs from the experiments to the MD simulations, is important for validating the simulation parameters. To demonstrate this, we compare binding free energies calculated with various force fields to the experimental ones. The results indicate that the prediction will not be very successful if we use an inaccurate force field. By improving and validating such simulation parameters, the next prediction can be made more accurate.

  14. Genomics Approaches For Improving Salinity Stress Tolerance in Crop Plants.

    PubMed

    Nongpiur, Ramsong Chantre; Singla-Pareek, Sneh Lata; Pareek, Ashwani

    2016-08-01

    Salinity is one of the major factors reducing crop production worldwide. Plant responses to salinity are highly complex and involve a plethora of genes. Due to this multigenicity, it has been difficult to attain a complete understanding of how plants respond to salinity. Genomics has progressed tremendously over the past decade and has played a crucial role in providing the knowledge necessary for crop improvement. Through genomics, we have been able to identify and characterize the genes involved in the salinity stress response, map out signalling pathways and ultimately utilize this information for improving the salinity tolerance of existing crops. The use of new tools, such as gene pyramiding, in genetic engineering and marker-assisted breeding has tremendously enhanced our ability to generate stress-tolerant crops. Genome editing technologies such as zinc finger nucleases, TALENs and CRISPR/Cas9 also provide newer and faster avenues for plant biologists to generate precisely engineered crops.

  15. Preparation of hydrophobic organic aerogels

    DOEpatents

    Baumann, Theodore F.; Satcher, Jr., Joe H.; Gash, Alexander E.

    2007-11-06

    Synthetic methods for the preparation of hydrophobic organic aerogels. One method involves the sol-gel polymerization of 1,3-dimethoxybenzene or 1,3,5-trimethoxybenzene with formaldehyde in non-aqueous solvents. Using a procedure analogous to the preparation of resorcinol-formaldehyde (RF) aerogels, this approach generates wet gels that can be dried using either supercritical solvent extraction, to generate the new organic aerogels, or air drying, to produce a xerogel. Other methods involve the sol-gel polymerization of 1,3,5-trihydroxybenzene (phloroglucinol) or 1,3-dihydroxybenzene (resorcinol) and various aldehydes in non-aqueous solvents. These methods use a procedure analogous to the one-step base- and two-step base/acid-catalyzed polycondensation of phloroglucinol and formaldehyde, but the base catalyst used is triethylamine. These methods can be applied to a variety of other sol-gel precursors and solvent systems. These hydrophobic organic aerogels have numerous potential applications in the fields of material absorbers and water-proof insulation.

  16. Preparation of hydrophobic organic aerogels

    DOEpatents

    Baumann, Theodore F.; Satcher, Jr., Joe H.; Gash, Alexander E.

    2004-10-19

    Synthetic methods for the preparation of hydrophobic organic aerogels. One method involves the sol-gel polymerization of 1,3-dimethoxybenzene or 1,3,5-trimethoxybenzene with formaldehyde in non-aqueous solvents. Using a procedure analogous to the preparation of resorcinol-formaldehyde (RF) aerogels, this approach generates wet gels that can be dried using either supercritical solvent extraction, to generate the new organic aerogels, or air drying, to produce a xerogel. Other methods involve the sol-gel polymerization of 1,3,5-trihydroxybenzene (phloroglucinol) or 1,3-dihydroxybenzene (resorcinol) and various aldehydes in non-aqueous solvents. These methods use a procedure analogous to the one-step base- and two-step base/acid-catalyzed polycondensation of phloroglucinol and formaldehyde, but the base catalyst used is triethylamine. These methods can be applied to a variety of other sol-gel precursors and solvent systems. These hydrophobic organic aerogels have numerous potential applications in the fields of material absorbers and water-proof insulation.

  17. Cross-talk between Rho and Rac GTPases drives deterministic exploration of cellular shape space and morphological heterogeneity.

    PubMed

    Sailem, Heba; Bousgouni, Vicky; Cooper, Sam; Bakal, Chris

    2014-01-22

    One goal of cell biology is to understand how cells adopt different shapes in response to varying environmental and cellular conditions. Achieving a comprehensive understanding of the relationship between cell shape and environment requires a systems-level understanding of the signalling networks that respond to external cues and regulate the cytoskeleton. Classical biochemical and genetic approaches have identified thousands of individual components that contribute to cell shape, but it remains difficult to predict how cell shape is generated by the activity of these components using bottom-up approaches because of the complex nature of their interactions in space and time. Here, we describe the regulation of cellular shape by signalling systems using a top-down approach. We first exploit the shape diversity generated by systematic RNAi screening and comprehensively define the shape space a migratory cell explores. We suggest a simple Boolean model involving the activation of Rac and Rho GTPases in two compartments to explain the basis for all cell shapes in the dataset. Critically, we also generate a probabilistic graphical model to show how cells explore this space in a deterministic, rather than a stochastic, fashion. We validate the predictions made by our model using live-cell imaging. Our work explains how cross-talk between Rho and Rac can generate different cell shapes, and thus morphological heterogeneity, in genetically identical populations.
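    As a rough illustration of how a two-compartment Boolean model can carve out a small shape space, the sketch below assumes a simple mutual-inhibition rule between Rac and Rho within each compartment (front and back) and enumerates the fixed points of a synchronous update. The update rule and state encoding are illustrative assumptions, not the model published in the paper.

    ```python
    from itertools import product

    # Toy Boolean model: Rac and Rho activity in two compartments.
    # State tuple: (rac_front, rho_front, rac_back, rho_back).

    def step(state):
        """Synchronous update under assumed mutual inhibition: within each
        compartment a GTPase stays active only if its antagonist is off."""
        rac_f, rho_f, rac_b, rho_b = state
        return (rac_f and not rho_f,
                rho_f and not rac_f,
                rac_b and not rho_b,
                rho_b and not rac_b)

    # Enumerate all 16 states and keep the fixed points, i.e. candidate
    # stable activity patterns (loosely, candidate "shapes").
    fixed_points = [s for s in product([False, True], repeat=4)
                    if step(s) == s]
    print(len(fixed_points))  # 9: three stable patterns per compartment
    ```

    Each compartment independently settles into one of three patterns (both off, Rac only, Rho only), so the toy model yields nine stable combinations; the point is only to show how compartmentalized cross-talk multiplies a small per-compartment repertoire into a larger shape space.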

  18. Constraining continuous rainfall simulations for derived design flood estimation

    NASA Astrophysics Data System (ADS)

    Woldemeskel, F. M.; Sharma, A.; Mehrotra, R.; Westra, S.

    2016-11-01

    Stochastic rainfall generation is important for a range of hydrologic and water resources applications. Stochastic rainfall can be generated using a number of models; however, preserving relevant attributes of the observed rainfall, including rainfall occurrence, variability and the magnitude of extremes, continues to be difficult. This paper develops an approach to constrain stochastically generated rainfall with the aim of preserving the intensity-duration-frequency (IFD) relationships of the observed data. Two main steps are involved. First, the generated annual maximum rainfall is corrected recursively by matching the generated intensity-frequency relationships to the target (observed) relationships. Second, the remaining (non-annual-maximum) rainfall is rescaled such that the mass balance of the generated rain before and after scaling is maintained. The recursive correction is performed at selected storm durations to minimise the dependence between annual maximum values of higher and lower durations for the same year. This ensures that the resulting sequences remain true to the observed rainfall as well as represent the design extremes that may have been developed separately and are needed for compliance reasons. The method is tested on simulated 6 min rainfall series across five Australian stations with different climatic characteristics. The results suggest that the annual maxima and the IFD relationships are well reproduced after constraining the simulated rainfall. While our presentation focuses on the representation of design rainfall attributes (IFDs), the proposed approach can also easily be extended to constrain other attributes of the generated rainfall, providing an effective platform for post-processing of stochastic rainfall generators.
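    The two steps described above can be sketched for a single storm duration as follows. This is a simplified assumption-laden illustration: the published method corrects recursively across several durations, whereas here the simulated annual maxima are simply quantile-matched to observed maxima and the remaining rainfall in each year is rescaled to keep the yearly total unchanged.

    ```python
    import numpy as np

    def constrain(yearly_series, observed_maxima):
        """yearly_series: list of 1-D arrays, one per year of sub-annual rain.
        observed_maxima: target annual maxima (one value per year).
        Returns constrained copies preserving each year's rainfall total."""
        sim_max = np.array([y.max() for y in yearly_series])
        ranks = sim_max.argsort().argsort()      # rank of each year's maximum
        # Step 1 target: match simulated maxima to observed quantiles by rank.
        targets = np.sort(np.asarray(observed_maxima, dtype=float))[ranks]
        out = []
        for y, t in zip(yearly_series, targets):
            y = np.asarray(y, dtype=float).copy()
            i = int(y.argmax())
            old_max, total = y[i], y.sum()
            y[i] = t                              # step 1: correct the maximum
            rest = total - old_max
            if rest > 0:                          # step 2: rescale the rest so
                scale = (total - t) / rest        # the yearly mass balance holds
                mask = np.ones(y.size, dtype=bool)
                mask[i] = False
                y[mask] *= scale                  # (assumes t <= yearly total)
            out.append(y)
        return out
    ```

    For example, `constrain([np.array([1.0, 2.0, 10.0])], [8.0])` lowers that year's maximum from 10 to 8 and inflates the other values so the annual total of 13 is preserved.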

  19. Public involvement in health priority setting: future challenges for policy, research and society.

    PubMed

    Hunter, David James; Kieslich, Katharina; Littlejohns, Peter; Staniszewska, Sophie; Tumilty, Emma; Weale, Albert; Williams, Iestyn

    2016-08-15

    Purpose - The purpose of this paper is to reflect on the findings of this special issue and to discuss the future challenges for policy, research and society. The findings suggest that challenges emerge as a result of legitimacy deficits of both consensus and contestatory modes of public involvement in health priority setting. Design/methodology/approach - The paper draws on the discussions and findings presented in this special issue. It seeks to bring the country experiences and case studies together to draw conclusions for policy, research and society. Findings - At least two recurring themes emerge. An underlying theme is the importance, but also the challenge, of establishing legitimacy in health priority setting. The country experiences suggest that we understand very little about the conditions under which representative, or authentic, participation generates legitimacy and under which it will be regarded as insufficient. A second observation is that public participation takes a variety of forms that depend on the opportunity structures in a given national context. Given this variety, the conceptualization of public participation needs to be expanded to account for its many forms. Originality/value - The paper concludes that the challenges of public involvement are closely linked to the question of how legitimate processes and decisions can be generated in priority setting. This suggests that future research must focus more narrowly on the conditions under which legitimacy is generated in order to expand our understanding of public involvement in health prioritization.

  20. Exact nonlinear command generation and tracking for robot manipulators and spacecraft slewing maneuvers

    NASA Technical Reports Server (NTRS)

    Dywer, T. A. W., III; Lee, G. K. F.

    1984-01-01

    In connection with the current interest in agile spacecraft maneuvers, it has become necessary to consider the nonlinear coupling effects of multiaxial rotation in the treatment of command generation and tracking problems. Multiaxial maneuvers will be required in military missions involving a fast acquisition of moving targets in space. In addition, such maneuvers are also needed for the efficient operation of robot manipulators. Attention is given to details regarding the direct nonlinear command generation and tracking, an approach which has been successfully applied to the design of control systems for V/STOL aircraft, linearizing transformations for spacecraft controlled with external thrusters, the case of flexible spacecraft dynamics, examples from robot dynamics, and problems of implementation and testing.

  1. Developing Creativity and Problem-Solving Skills of Engineering Students: A Comparison of Web- and Pen-and-Paper-Based Approaches

    ERIC Educational Resources Information Center

    Valentine, Andrew; Belski, Iouri; Hamilton, Margaret

    2017-01-01

    Problem-solving is a key engineering skill, yet is an area in which engineering graduates underperform. This paper investigates the potential of using web-based tools to teach students problem-solving techniques without the need to make use of class time. An idea generation experiment involving 90 students was designed. Students were surveyed…

  2. Dual C-H functionalization of N-aryl amines: synthesis of polycyclic amines via an oxidative Povarov approach.

    PubMed

    Min, Chang; Sanchawala, Abbas; Seidel, Daniel

    2014-05-16

    Iminium ions generated in situ via copper(I) bromide catalyzed oxidation of N-aryl amines readily undergo [4 + 2] cycloadditions with a range of dienophiles. This method involves the functionalization of both a C(sp3)-H and a C(sp2)-H bond and enables the rapid construction of polycyclic amines under relatively mild conditions.

  3. Environmental life cycle assessment of methanol and electricity co-production system based on coal gasification technology.

    PubMed

    Śliwińska, Anna; Burchart-Korol, Dorota; Smoliński, Adam

    2017-01-01

    This paper presents a life cycle assessment (LCA) of greenhouse gas emissions generated by a methanol and electricity co-production system based on coal gasification technology. The analysis focuses on polygeneration technologies from which two products are produced, and thus issues related to the allocation procedure for LCA are addressed in this paper. Two LCA methods were used: a 'system expansion' method, based on the 'avoided burdens' and 'direct system enlargement' approaches, and an 'allocation' method involving proportional partitioning based on physical relationships in the technological process. Cause-effect relationships in the analysed production process were identified, allowing for the identification of allocation factors. The 'system expansion' method involved expanding the analysis to include five additional variants of electricity production technologies in Poland (alternative technologies). This method revealed the environmental consequences of implementing the analysed technologies. It was found that an LCA of polygeneration technologies based on the 'system expansion' method generated a more complete source of information on environmental consequences than the 'allocation' method. The analysis shows that the choice of alternative technologies used to generate the LCA results is crucial. Life cycle assessment was performed for the analysed, reference and variant alternative technologies. A comparative analysis was performed between the analysed technology of methanol and electricity co-production from coal gasification and a reference technology of methanol production from the natural gas reforming process. Copyright © 2016 Elsevier B.V. All rights reserved.
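    The 'allocation' method partitions the total burden between co-products in proportion to some physical quantity. The sketch below uses energy content as that quantity and invented numbers; the paper itself derives its allocation factors from cause-effect relationships in the process, so both the basis and the figures here are purely illustrative assumptions.

    ```python
    # Toy LCA allocation: split a total emission burden between two
    # co-products in proportion to a physical quantity (energy content).

    def allocate(total_emissions, energy_by_product):
        """Proportional partitioning of total_emissions across products,
        weighted by each product's share of the total energy output."""
        total_energy = sum(energy_by_product.values())
        return {product: total_emissions * energy / total_energy
                for product, energy in energy_by_product.items()}

    # Hypothetical figures: 100 kt CO2-eq split between methanol (60 PJ)
    # and electricity (40 PJ) outputs.
    shares = allocate(100.0, {"methanol": 60.0, "electricity": 40.0})
    print(shares)  # {'methanol': 60.0, 'electricity': 40.0}
    ```

    By contrast, 'system expansion' avoids partitioning altogether: the system boundary is enlarged (or credits are subtracted) so that the co-produced electricity is compared against an alternative generation technology, which is why the choice of alternative technology drives the result.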

  4. A Multiplexed Amplicon Approach for Detecting Gene Fusions by Next-Generation Sequencing.

    PubMed

    Beadling, Carol; Wald, Abigail I; Warrick, Andrea; Neff, Tanaya L; Zhong, Shan; Nikiforov, Yuri E; Corless, Christopher L; Nikiforova, Marina N

    2016-03-01

    Chromosomal rearrangements that result in oncogenic gene fusions are clinically important drivers of many cancer types. Rapid and sensitive methods are therefore needed to detect a broad range of gene fusions in clinical specimens that are often of limited quantity and quality. We describe a next-generation sequencing approach that uses a multiplex PCR-based amplicon panel to interrogate fusion transcripts involving 19 driver genes and 94 partners implicated in solid tumors. The panel also includes control assays that evaluate the 3'/5' expression ratios of 12 oncogenic kinases, which can be used to infer gene fusion events when the partner is unknown or not included on the panel. There was good concordance between the solid tumor fusion gene panel and other methods, including fluorescence in situ hybridization, real-time PCR, Sanger sequencing, and other next-generation sequencing panels, with 40 specimens known to harbor gene fusions correctly identified. No specific fusion reads were observed in 59 fusion-negative specimens. The 3'/5' expression ratio was informative for fusions involving ALK, RET, and NTRK1 but not for BRAF or ROS1 fusions. However, among 37 ALK or RET fusion-negative specimens, four exhibited elevated 3'/5' expression ratios, indicating that fusions predicted solely by 3'/5' read ratios require confirmatory testing. Copyright © 2016 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
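    The 3'/5' control-assay idea rests on a simple observation: a fusion that retains a kinase's 3' portion drives its expression, so read counts 3' of the breakpoint exceed those 5' of it. A minimal sketch of that ratio check follows; the pseudocount and threshold are illustrative assumptions, not the panel's validated cutoffs, and (as the abstract notes) a positive flag would still need confirmatory testing.

    ```python
    # Sketch of a 3'/5' expression-ratio check for inferring kinase fusions
    # from amplicon read counts. Pseudocount and threshold are assumed
    # values for illustration only.

    def three_five_ratio(reads_3p, reads_5p, pseudocount=1.0):
        """Ratio of reads mapping 3' vs 5' of the putative breakpoint,
        with a pseudocount to guard against division by zero."""
        return (reads_3p + pseudocount) / (reads_5p + pseudocount)

    def flag_possible_fusion(reads_3p, reads_5p, threshold=5.0):
        """Flag a kinase whose 3'/5' ratio exceeds the (assumed) threshold,
        suggesting a fusion with an unknown or off-panel partner."""
        return three_five_ratio(reads_3p, reads_5p) > threshold

    print(flag_possible_fusion(1200, 80))   # True  (ratio well above 5)
    print(flag_possible_fusion(150, 140))   # False (balanced expression)
    ```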

  5. The rise of genomic profiling in ovarian cancer

    PubMed Central

    Previs, Rebecca A.; Sood, Anil K.; Mills, Gordon B.; Westin, Shannon N.

    2017-01-01

    Introduction Next-generation sequencing and advances in ‘omics technology have rapidly increased our understanding of the molecular landscape of epithelial ovarian cancers. Areas covered Once characterized only by histologic appearance and clinical behavior, the different ovarian cancer subtypes are now understood in terms of many of the molecular phenotypes that underlie them. While the current approach to treatment involves standard cytotoxic therapies after cytoreductive surgery for all ovarian cancers regardless of histologic or molecular characteristics, focus has shifted beyond a ‘one size fits all’ approach to ovarian cancer. Expert commentary Genomic profiling offers potentially ‘actionable’ opportunities for the development of targeted therapies and a more individualized approach to treatment, with concomitant improved outcomes and decreased toxicity. PMID:27828713

  6. Review of controlled fusion research using laser heating.

    NASA Technical Reports Server (NTRS)

    Hertzberg, A.

    1973-01-01

    Development of methods for generating high laser pulse energy has stimulated research leading to new ideas for practical controlled thermonuclear fusion machines. A review is presented of some important efforts in progress, and two different approaches have been selected as examples for discussion. One involves the concept of very short pulse lasers with power output tailored, in time, to obtain a nearly isentropic compression of a deuterium-tritium pellet to very high densities and temperatures. A second approach utilizing long wavelength, long pulse, efficient gas lasers to heat a column of plasma contained in a solenoidal field is also discussed. The working requirements of the laser and various magnetic field geometries of this approach are described.

  7. Dynamics of polymers: A mean-field theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredrickson, Glenn H. (Materials Research Laboratory and Department of Materials, University of California, Santa Barbara, California 93106)

    2014-02-28

    We derive a general mean-field theory of inhomogeneous polymer dynamics; a theory whose form has been speculated and widely applied, but not heretofore derived. Our approach involves a functional integral representation of a Martin-Siggia-Rose (MSR) type description of the exact many-chain dynamics. A saddle point approximation to the generating functional, involving conditions where the MSR action is stationary with respect to a collective density field ρ and a conjugate MSR response field ϕ, produces the desired dynamical mean-field theory. Besides clarifying the proper structure of mean-field theory out of equilibrium, our results have implications for numerical studies of polymer dynamics involving hybrid particle-field simulation techniques such as the single-chain-in-mean-field method.
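    In schematic notation, the construction described above can be summarized as follows, where Z is the generating functional and S[ρ, ϕ] the MSR action; this is a generic saddle-point statement consistent with the abstract's wording, not the paper's detailed derivation:

    ```latex
    % Generating functional and mean-field (saddle-point) conditions:
    % the action is stationary with respect to the density field rho
    % and the conjugate response field phi.
    Z = \int \mathcal{D}\rho \, \mathcal{D}\phi \; e^{-S[\rho,\phi]},
    \qquad
    \left.\frac{\delta S}{\delta \rho}\right|_{\rho^*,\phi^*} = 0,
    \qquad
    \left.\frac{\delta S}{\delta \phi}\right|_{\rho^*,\phi^*} = 0 .
    ```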

  8. To Recycle or Not to Recycle? An Intergenerational Approach to Nuclear Fuel Cycles

    PubMed Central

    Kloosterman, Jan Leen

    2007-01-01

    This paper approaches the choice between the open and closed nuclear fuel cycles as a matter of intergenerational justice, by revealing the value conflicts in the production of nuclear energy. The closed fuel cycle improves sustainability in terms of the supply certainty of uranium and involves less long-term radiological risk and fewer proliferation concerns. However, it compromises short-term public health, safety and security, due to the separation of plutonium. The trade-offs in nuclear energy are reducible to a chief trade-off between the present and the future. To what extent should we take care of the nuclear waste we produce, and to what extent should we accept additional risks to the present generation in order to diminish the exposure of future generations to those risks? The advocates of the open fuel cycle should explain why they are willing to transfer all the risks for a very long period of time (200,000 years) to future generations. Likewise, supporters of the closed fuel cycle should underpin their acceptance of additional risks to the present generation and make the actual reduction of risk to the future plausible. PMID:18075732

  9. To recycle or not to recycle? An intergenerational approach to nuclear fuel cycles.

    PubMed

    Taebi, Behnam; Kloosterman, Jan Leen

    2008-06-01

    This paper approaches the choice between the open and closed nuclear fuel cycles as a matter of intergenerational justice, by revealing the value conflicts in the production of nuclear energy. The closed fuel cycle improves sustainability in terms of the supply certainty of uranium and involves fewer long-term radiological risks and proliferation concerns. However, it compromises short-term public health, safety, and security, due to the separation of plutonium. The trade-offs in nuclear energy are reducible to a chief trade-off between the present and the future. To what extent should we take care of our produced nuclear waste, and to what extent should we accept additional risks to the present generation in order to diminish the exposure of future generations to those risks? The advocates of the open fuel cycle should explain why they are willing to transfer all the risks for a very long period of time (200,000 years) to future generations. In addition, supporters of the closed fuel cycle should underpin their acceptance of additional risks to the present generation and make the actual reduction of risk to the future plausible.

  10. 'I'm not an outsider, I'm his mother!' A phenomenological enquiry into carer experiences of exclusion from acute psychiatric settings.

    PubMed

    Wilkinson, Claire; McAndrew, Sue

    2008-12-01

    Contemporary standards and policies advocate carer involvement in planning, implementing, and evaluating mental health services. Critics have questioned why such standards and policies fail to move from rhetoric to reality, a criticism that applies particularly to carer involvement within acute psychiatric settings. As there is only limited UK research on this topic, this interpretive phenomenological study was undertaken to explore the perceived level of involvement from the perspective of carers of service users who were admitted to acute inpatient settings within the previous 2 years. Interviews were conducted with four individuals who cared for a loved one with a mental illness. The interview analysis was influenced by Van Manen, whose interpretive approach seeks to generate a deeper understanding of the phenomenon under study. Four main themes emerged: powerlessness, feeling isolated, needing to be recognized and valued, and a desire for partnership. The findings reflect the views expressed by carers in other studies, identifying that while carers seek to work in partnership with health-care professionals, at a clinical level they often feel excluded. The study concludes by discussing ways of improving and promoting carer involvement and advocating a partnership in care approach within acute psychiatry.

  11. Preclinical evaluation of the PI3K/Akt/mTOR pathway in animal models of multiple sclerosis

    PubMed Central

    Mammana, Santa; Bramanti, Placido; Mazzon, Emanuela; Cavalli, Eugenio; Basile, Maria Sofia; Fagone, Paolo; Petralia, Maria Cristina; McCubrey, James Andrew; Nicoletti, Ferdinando; Mangano, Katia

    2018-01-01

    The PI3K/AKT/mTOR pathway is an intracellular signalling pathway that regulates cell activation, proliferation, metabolism and apoptosis. An increasing body of data suggests that alterations in the PI3K/AKT/mTOR pathway may result in an enhanced susceptibility to autoimmunity. Multiple sclerosis (MS) is one of the most common chronic inflammatory diseases of the central nervous system, leading to demyelination and neurodegeneration. In the current study, we first evaluated in silico the involvement of the mTOR network in the generation and progression of MS and in oligodendrocyte function, making use of currently available whole-genome transcriptomic data. The data generated in silico were then subjected to an ex vivo evaluation. To this aim, the involvement of mTOR was validated on a well-known animal model of MS and in vitro on Th17 cells. Our data indicate that there is a significant involvement of the mTOR network in the etiopathogenesis of MS and that rapamycin treatment may represent a useful therapeutic approach in this clinical setting. On the other hand, our data showed that a significant involvement of the mTOR network could be observed only in the early phases of oligodendrocyte maturation, but not in the maturation of adult oligodendrocytes or in remyelination following demyelinating injury. Overall, our study suggests that although targeting the PI3K/mTOR pathway may not be a useful therapeutic approach to promote remyelination in MS patients, it can be exploited to exert immunomodulation, preventing/delaying relapses, and to slow down the progression of disability. PMID:29492193

  12. Urban land use: Remote sensing of ground-basin permeability

    NASA Technical Reports Server (NTRS)

    Tinney, L. R.; Jensen, J. R.; Estes, J. E.

    1975-01-01

    A remote sensing analysis of the amount and type of permeable and impermeable surfaces overlying an urban recharge basin is discussed. An effective methodology for accurately generating this data as input to a safe yield study is detailed and compared to more conventional alternative approaches. The amount of area inventoried, approximately 10 sq. miles, should provide a reliable base against which automatic pattern recognition algorithms, currently under investigation for this task, can be evaluated. If successful, such approaches can significantly reduce the time and effort involved in obtaining permeability data, an important aspect of urban hydrology dynamics.

  13. Low-cost digital image processing at the University of Oklahoma

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.

    1981-01-01

    Computer assisted instruction in remote sensing at the University of Oklahoma involves two separate approaches and is dependent upon initial preprocessing of a LANDSAT computer compatible tape using software developed for an IBM 370/158 computer. In-house generated preprocessing algorithms permit students or researchers to select a subset of a LANDSAT scene for subsequent analysis using either general purpose statistical packages or color graphic image processing software developed for Apple II microcomputers. Procedures for preprocessing the data and image analysis using either of the two approaches for low-cost LANDSAT data processing are described.

  14. Stem cells in retinal regeneration: past, present and future.

    PubMed

    Ramsden, Conor M; Powner, Michael B; Carr, Amanda-Jayne F; Smart, Matthew J K; da Cruz, Lyndon; Coffey, Peter J

    2013-06-01

    Stem cell therapy for retinal disease is under way, and several clinical trials are currently recruiting. These trials use human embryonic, foetal and umbilical cord tissue-derived stem cells and bone marrow-derived stem cells to treat visual disorders such as age-related macular degeneration, Stargardt's disease and retinitis pigmentosa. Over a decade of analysing the developmental cues involved in retinal generation and stem cell biology, coupled with extensive surgical research, have yielded differing cellular approaches to tackle these retinopathies. Here, we review these various stem cell-based approaches for treating retinal diseases and discuss future directions and challenges for the field.

  15. Ganokendra: An Innovative Model for Poverty Alleviation In Bangladesh

    NASA Astrophysics Data System (ADS)

    Alam, Kazi Rafiqul

    2006-05-01

    Ganokendras (people's learning centers) employ a literacy-based approach to alleviating poverty in Bangladesh. They give special attention to empowering rural women, among whom poverty is widespread. The present study reviews the Ganokendra-approach to facilitating increased political and economic awareness and improving community conditions in line with government initiatives for poverty reduction. Many Ganokendras implement programmes geared towards income-generating activities and establish linkages with other service providers, both governmental and non-governmental. As is shown, one particularly successful strategy for facilitating women's economic empowerment involves co-ordinating micro-credit available through other agencies.

  16. Dynamical origin of near- and below-threshold harmonic generation of Cs in an intense mid-infrared laser field.

    PubMed

    Li, Peng-Cheng; Sheu, Yae-Lin; Laughlin, Cecil; Chu, Shih-I

    2015-05-20

    Near- and below-threshold harmonic generation provides a potential approach to generate a vacuum-ultraviolet frequency comb. However, the dynamical origin of these lower harmonics is less understood and largely unexplored. Here we perform an ab initio quantum study of the near- and below-threshold harmonic generation of caesium (Cs) atoms in an intense 3,600-nm mid-infrared laser field. By combining a synchrosqueezing transform of the quantum time-frequency spectrum with an extended semiclassical analysis, the roles of multiphoton and multiple rescattering trajectories in the near- and below-threshold harmonic generation processes are clarified. We find that the multiphoton-dominated trajectories only involve electrons scattered off the higher part of the combined atom-field potential followed by the absorption of many photons in the near- and below-threshold regime. Furthermore, only the near-resonant below-threshold harmonic exhibits phase-locked features. Our results shed light on the dynamical origin of near- and below-threshold harmonic generation.

  17. Integrated genomics and molecular breeding approaches for dissecting the complex quantitative traits in crop plants.

    PubMed

    Kujur, Alice; Saxena, Maneesha S; Bajaj, Deepak; Laxmi; Parida, Swarup K

    2013-12-01

    The enormous population growth, climate change and global warming are now considered major threats to agriculture and the world's food security. To improve the productivity and sustainability of agriculture, the development of high-yielding, durable abiotic and biotic stress-tolerant cultivars and/or climate-resilient crops is essential. Hence, understanding the molecular mechanisms and dissection of complex quantitative yield and stress tolerance traits is the prime objective in current agricultural biotechnology research. In recent years, tremendous progress has been made in plant genomics and molecular breeding research pertaining to conventional and next-generation whole genome, transcriptome and epigenome sequencing efforts, the generation of huge genomic, transcriptomic and epigenomic resources, and the development of modern genomics-assisted breeding approaches in diverse crop genotypes with contrasting yield and abiotic stress tolerance traits. Unfortunately, the detailed molecular mechanisms and gene regulatory networks controlling such complex quantitative traits are not yet well understood in crop plants. Therefore, we propose an integrated strategy involving the enormous and diverse traditional and modern -omics (structural, functional, comparative and epigenomics) approaches/resources and genomics-assisted breeding methods that agricultural biotechnologists can adopt/utilize to dissect and decode the molecular and gene regulatory networks involved in the complex quantitative yield and stress tolerance traits of crop plants. This would provide clues and much needed inputs for rapid selection of novel functionally relevant molecular tags regulating such complex traits to expedite traditional and modern marker-assisted genetic enhancement studies in target crop species for developing high-yielding stress-tolerant varieties.

  18. Approaches to Mitigate the Unwanted Immunogenicity of Therapeutic Proteins during Drug Development.

    PubMed

    Salazar-Fontana, Laura I; Desai, Dharmesh D; Khan, Tarik A; Pillutla, Renuka C; Prior, Sandra; Ramakrishnan, Radha; Schneider, Jennifer; Joseph, Alexandra

    2017-03-01

    All biotherapeutics have the potential to induce an immune response. This immunological response is complex and, in addition to antibody formation, involves T cell activation and innate immune responses that could contribute to adverse effects. Integrated immunogenicity data analysis is crucial to understanding the possible clinical consequences of anti-drug antibody (ADA) responses. Because patient- and product-related factors can influence the immunogenicity of a therapeutic protein, a risk-based approach is recommended and followed by most drug developers to provide insight over the potential harm of unwanted ADA responses. This paper examines mitigation strategies currently implemented by drug developers, as well as novel approaches under investigation. The review describes immunomodulatory regimens used in the clinic to mitigate deleterious ADA responses to replacement therapies for deficiency syndromes, such as hemophilia A and B, and high risk classical infantile Pompe patients (e.g., cyclophosphamide, methotrexate, rituximab); novel in silico and in vitro prediction tools used to select candidates based on their immunogenicity potential (e.g., anti-CD52 antibody primary sequence and IFN beta-1a formulation); in vitro generation of tolerogenic antigen-presenting cells (APCs) to reduce ADA responses to factor VIII and IX in murine models of hemophilia; and selection of novel delivery systems to reduce in vivo ADA responses to highly immunogenic biotherapeutics (e.g., asparaginase). We conclude that mitigation strategies should be considered early in development for biotherapeutics based on our knowledge of existing clinical data for biotherapeutics and the immune response involved in the generation of these ADAs.

  19. Dynamic performance of maximum power point tracking circuits using sinusoidal extremum seeking control for photovoltaic generation

    NASA Astrophysics Data System (ADS)

    Leyva, R.; Artillan, P.; Cabal, C.; Estibals, B.; Alonso, C.

    2011-04-01

    The article studies the dynamic performance of a family of maximum power point tracking circuits used for photovoltaic generation. It revisits the sinusoidal extremum seeking control (ESC) technique, which can be considered a particular subgroup of the Perturb and Observe algorithms. The sinusoidal ESC technique consists of adding a small sinusoidal disturbance to the input and processing the perturbed output to drive the operating point to its maximum. The output processing involves a synchronous multiplication and a filtering stage. The filter instance determines the dynamic performance of an MPPT based on the sinusoidal ESC principle. The approach uses the well-known root-locus method to give insight into the damping degree and settling time of maximum-seeking waveforms. This article shows the transient waveforms for three different filter instances to illustrate the approach. Finally, an experimental prototype corroborates the dynamic analysis.
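The ESC loop the abstract describes (sinusoidal dither, synchronous multiplication, filtering, integration) can be sketched numerically. This is a minimal illustration, not the authors' circuit or analysis: the stand-in power curve `p_of_v`, the dither amplitude `a`, the filter cutoffs and the integrator gain `k` are all hypothetical values chosen only to make the loop visibly converge.

```python
import math

def esc_track(p_of_v, v0=10.0, a=0.2, w=2 * math.pi * 5.0,
              k=10.0, dt=1e-3, steps=20000):
    """Minimal sinusoidal extremum-seeking loop (all parameters hypothetical).

    A small sine dither a*sin(w*t) is added to the operating point v.
    The measured power is mean-stripped (crude high-pass), multiplied by
    the same sine (synchronous demodulation), low-pass filtered, and the
    resulting gradient estimate is integrated to climb toward the maximum.
    """
    v = v0
    p_mean = p_of_v(v0)                        # slow tracker of the power mean
    grad = 0.0                                 # filtered gradient estimate
    a_hp = dt / (dt + 1 / (2 * math.pi * 0.5))  # 0.5 Hz mean tracker
    a_lp = dt / (dt + 1 / (2 * math.pi * 1.0))  # 1 Hz LPF after demodulation
    for n in range(steps):
        s = math.sin(w * n * dt)
        p = p_of_v(v + a * s)                  # perturbed measurement
        p_mean += a_hp * (p - p_mean)
        grad += a_lp * ((p - p_mean) * s - grad)  # demodulate and filter
        v += k * grad * dt                     # integrator drives v uphill
    return v

# Quadratic stand-in for a PV power curve, peaking at v = 17
v_star = esc_track(lambda v: 60.0 - 0.5 * (v - 17.0) ** 2)
```

With these stand-in values the operating point settles near the curve's maximum at v = 17, with a residual ripple on the order of the dither amplitude; the low-pass filter's cutoff is exactly the "filter instance" whose choice the article analyses via the root locus.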

  20. Construction, database integration, and application of an Oenothera EST library.

    PubMed

    Mrácek, Jaroslav; Greiner, Stephan; Cho, Won Kyong; Rauwolf, Uwe; Braun, Martha; Umate, Pavan; Altstätter, Johannes; Stoppel, Rhea; Mlcochová, Lada; Silber, Martina V; Volz, Stefanie M; White, Sarah; Selmeier, Renate; Rudd, Stephen; Herrmann, Reinhold G; Meurer, Jörg

    2006-09-01

    Coevolution of cellular genetic compartments is a fundamental aspect in eukaryotic genome evolution that becomes apparent in serious developmental disturbances after interspecific organelle exchanges. The genus Oenothera represents a unique, at present the only available, resource to study the role of the compartmentalized plant genome in diversification of populations and speciation processes. An integrated approach involving cDNA cloning, EST sequencing, and bioinformatic data mining was chosen using Oenothera elata with the genetic constitution nuclear genome AA with plastome type I. The Gene Ontology system grouped 1621 unique gene products into 17 different functional categories. Application of arrays generated from a selected fraction of ESTs revealed significantly differing expression profiles among closely related Oenothera species possessing the potential to generate fertile and incompatible plastid/nuclear hybrids (hybrid bleaching). Furthermore, the EST library provides a valuable source of PCR-based polymorphic molecular markers that are instrumental for genotyping and molecular mapping approaches.

  1. Direct α-C-H bond functionalization of unprotected cyclic amines

    NASA Astrophysics Data System (ADS)

    Chen, Weijie; Ma, Longle; Paul, Anirudra; Seidel, Daniel

    2018-02-01

    Cyclic amines are ubiquitous core structures of bioactive natural products and pharmaceutical drugs. Although the site-selective abstraction of C-H bonds is an attractive strategy for preparing valuable functionalized amines from their readily available parent heterocycles, this approach has largely been limited to substrates that require protection of the amine nitrogen atom. In addition, most methods rely on transition metals and are incompatible with the presence of amine N-H bonds. Here we introduce a protecting-group-free approach for the α-functionalization of cyclic secondary amines. An operationally simple one-pot procedure generates products via a process that involves intermolecular hydride transfer to generate an imine intermediate that is subsequently captured by a nucleophile, such as an alkyl or aryl lithium compound. Reactions are regioselective and stereospecific and enable the rapid preparation of bioactive amines, as exemplified by the facile synthesis of anabasine and (-)-solenopsin A.

  2. Utilizing the Foreign Body Response to Grow Tissue Engineered Blood Vessels in Vivo.

    PubMed

    Geelhoed, Wouter J; Moroni, Lorenzo; Rotmans, Joris I

    2017-04-01

    It is well known that the number of patients requiring vascular grafts, for use as vessel replacements in cardiovascular disease or as vascular access sites for hemodialysis, is ever increasing. The development of tissue engineered blood vessels (TEBVs) is a promising method to meet this increasing demand for vascular grafts without having to rely on poorly performing synthetic options such as polytetrafluoroethylene (PTFE) or Dacron. The generation of in vivo TEBVs involves utilizing the host reaction to an implanted biomaterial for the generation of completely autologous tissues. Essentially, this approach to the development of TEBVs makes use of the foreign body response to biomaterials for the construction of the entire vascular replacement tissue within the patient's own body. In this review we discuss the method of developing in vivo TEBVs and debate the approaches of several research groups that have implemented this method.

  3. MUSiC - A general search for deviations from monte carlo predictions in CMS

    NASA Astrophysics Data System (ADS)

    Biallass, Philipp A.; CMS Collaboration

    2009-06-01

    A model independent analysis approach in CMS is presented, systematically scanning the data for deviations from the Monte Carlo expectation. Such an analysis can contribute to the understanding of the detector and the tuning of the event generators. Furthermore, due to the minimal theoretical bias this approach is sensitive to a variety of models of new physics, including those not yet thought of. Events are classified into event classes according to their particle content (muons, electrons, photons, jets and missing transverse energy). A broad scan of various distributions is performed, identifying significant deviations from the Monte Carlo simulation. The importance of systematic uncertainties is outlined, which are taken into account rigorously within the algorithm. Possible detector effects and generator issues, as well as models involving Supersymmetry and new heavy gauge bosons are used as an input to the search algorithm.

  4. MUSiC - A Generic Search for Deviations from Monte Carlo Predictions in CMS

    NASA Astrophysics Data System (ADS)

    Hof, Carsten

    2009-05-01

    We present a model independent analysis approach, systematically scanning the data for deviations from the Standard Model Monte Carlo expectation. Such an analysis can contribute to the understanding of the CMS detector and the tuning of the event generators. Furthermore, due to the minimal theoretical bias this approach is sensitive to a variety of models of new physics, including those not yet thought of. Events are classified into event classes according to their particle content (muons, electrons, photons, jets and missing transverse energy). A broad scan of various distributions is performed, identifying significant deviations from the Monte Carlo simulation. We outline the importance of systematic uncertainties, which are taken into account rigorously within the algorithm. Possible detector effects and generator issues, as well as models involving supersymmetry and new heavy gauge bosons have been used as an input to the search algorithm.

  5. Centralized and Decentralized Control for Demand Response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Shuai; Samaan, Nader A.; Diao, Ruisheng

    2011-04-29

    Demand response has been recognized as an essential element of the smart grid. Frequency response, regulation and contingency reserve functions performed traditionally by generation resources are now starting to involve demand side resources. Additional benefits from demand response include peak reduction and load shifting, which will defer new infrastructure investment and improve generator operation efficiency. Technical approaches designed to realize these functionalities can be categorized into centralized control and decentralized control, depending on where the response decision is made. This paper discusses these two control philosophies and compares their relative advantages and disadvantages in terms of delay time, predictability, complexity, and reliability. A distribution system model with detailed household loads and controls is built to demonstrate the characteristics of the two approaches. The conclusion is that the promptness and reliability of decentralized control should be combined with the predictability and simplicity of centralized control to achieve the best performance of the smart grid.

  6. The design and synthesis of new synthetic low molecular weight heparins

    PubMed Central

    Chandarajoti, K.; Liu, J.; Pawlinski, R.

    2016-01-01

    Low molecular weight heparins (LMWH) have remained the most favorable form of heparin in clinics since the 1990s owing to their predictable pharmacokinetic properties. However, LMWH is mainly eliminated through the kidney, which limits its use in renal-impaired patients. In addition, the anticoagulant activity of LMWH is only partially neutralized by protamine. LMWH is obtained from a full-length, highly sulfated polysaccharide harvested from porcine mucosal tissue. The depolymerization involved in LMWH production generates a broad size distribution of LMWH fragments (6-22 sugar residues). This, combined with the various methods used to produce commercial LMWHs, results in variable pharmacological and pharmacokinetic properties. An alternative chemoenzymatic approach offers a method for the synthesis of LMWH that has the potential to overcome the limitations of current LMWHs. This review summarizes the application of a chemoenzymatic approach to generate LMWH and the rationale for development of a synthetic LMWH. PMID:26990516

  7. The design and synthesis of new synthetic low-molecular-weight heparins.

    PubMed

    Chandarajoti, K; Liu, J; Pawlinski, R

    2016-06-01

    Low-molecular-weight heparin (LMWH) has remained the most favorable form of heparin in clinics since the 1990s owing to its predictable pharmacokinetic properties. However, LMWH is mainly eliminated through the kidney, which limits its use in renal-impaired patients. In addition, the anticoagulant activity of LMWH is only partially neutralized by protamine. LMWH is obtained from a full-length, highly sulfated polysaccharide harvested from porcine mucosal tissue. The depolymerization involved in LMWH production generates a broad distribution of LMWH fragments (6-22 sugar residues). This, combined with the various methods used to produce commercial LMWHs, results in variable pharmacological and pharmacokinetic properties. An alternative chemoenzymatic approach offers a method for the synthesis of LMWH that has the potential to overcome the limitations of current LMWHs. This review summarizes the application of a chemoenzymatic approach to generate LMWH and the rationale for development of a synthetic LMWH. © 2016 International Society on Thrombosis and Haemostasis.

  8. A Novel Defect Inspection Method for Semiconductor Wafer Based on Magneto-Optic Imaging

    NASA Astrophysics Data System (ADS)

    Pan, Z.; Chen, L.; Li, W.; Zhang, G.; Wu, P.

    2013-03-01

    Defects in semiconductor wafers may be generated during the manufacturing processes. A novel defect inspection method for semiconductor wafers is presented in this paper. The method is based on magneto-optic imaging, which involves inducing an eddy current into the wafer under test and detecting the magnetic flux associated with the eddy current distribution in the wafer by exploiting the Faraday rotation effect. The magneto-optic image thus generated may contain noise that degrades the overall image quality. Therefore, to remove the unwanted noise present in the magneto-optic image, an image enhancement approach using multi-scale wavelets is presented, along with an image segmentation approach based on the integration of a watershed algorithm and a clustering strategy. The experimental results show that many types of wafer defects, such as holes and scratches, can be detected by the proposed method.

  9. Fast computation of hologram patterns of a 3D object using run-length encoding and novel look-up table methods.

    PubMed

    Kim, Seung-Cheol; Kim, Eun-Soo

    2009-02-20

    In this paper we propose a new approach for fast generation of computer-generated holograms (CGHs) of a 3D object by using the run-length encoding (RLE) and novel look-up table (N-LUT) methods. With the RLE method, spatially redundant data of a 3D object are extracted and regrouped into an N-point redundancy map according to the number of adjacent object points having the same 3D value. Based on this redundancy map, N-point principal fringe patterns (PFPs) are newly calculated by using the 1-point PFP of the N-LUT, and the CGH pattern for the 3D object is generated with these N-point PFPs. In this approach, the number of object points involved in calculating the CGH pattern can be dramatically reduced and, as a result, an increase in computational speed can be obtained. Some experiments with a test 3D object are carried out and the results are compared to those of the conventional methods.
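The redundancy-map step can be illustrated in a few lines. This is only a sketch of the run-length grouping idea, not the authors' N-LUT implementation: the `redundancy_map` helper and the sample scanline data are hypothetical.

```python
from itertools import groupby

def redundancy_map(scanline):
    """Group adjacent object points sharing the same value into runs.

    Returns (start_index, run_length, value) triples, i.e. the 'N-point
    redundancy map' idea for a single scanline of hypothetical object
    data.  Each length-N run can then be rendered with one precomputed
    N-point fringe pattern instead of N separate 1-point lookups.
    """
    runs, i = [], 0
    for value, group in groupby(scanline):
        n = len(list(group))
        runs.append((i, n, value))
        i += n
    return runs

line = [0, 0, 0, 5, 5, 9, 9, 9, 9]
print(redundancy_map(line))   # → [(0, 3, 0), (3, 2, 5), (5, 4, 9)]
```

Here nine object points collapse to three runs, which is the source of the speed-up: the per-point fringe computation is replaced by one table lookup per run.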

  10. A new approach to Catalan numbers using differential equations

    NASA Astrophysics Data System (ADS)

    Kim, D. S.; Kim, T.

    2017-10-01

    In this paper, we introduce two differential equations arising from the generating function of the Catalan numbers which are `inverses' to each other in a certain sense. From these differential equations, we obtain some new and explicit identities for Catalan and higher-order Catalan numbers. In addition, by other means than differential equations, we also derive some interesting identities involving Catalan numbers which are of arithmetic and combinatorial nature.
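As a quick numerical companion to the abstract: the Catalan generating function satisfies C(x) = 1 + x·C(x)², and the equivalent coefficient recurrence C_{n+1} = Σ_{i=0}^{n} C_i·C_{n-i} can be checked directly (this is the standard convolution identity, not the paper's differential-equation route).

```python
def catalan(n_max):
    """First n_max + 1 Catalan numbers via the convolution recurrence
    C_{n+1} = sum_{i=0}^{n} C_i * C_{n-i}, the coefficient identity
    behind the generating function C(x) = 1 + x*C(x)^2."""
    c = [1]
    for n in range(n_max):
        c.append(sum(c[i] * c[n - i] for i in range(n + 1)))
    return c

print(catalan(7))   # → [1, 1, 2, 5, 14, 42, 132, 429]
```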

  11. Second Generation Amphiphilic Poly-Lysine Dendrons Inhibit Glioblastoma Cell Proliferation without Toxicity for Neurons or Astrocytes

    PubMed Central

    Janiszewska, Jolanta; Posadas, Inmaculada; Játiva, Pablo; Bugaj-Zarebska, Marta; Urbanczyk-Lipkowska, Zofia; Ceña, Valentín

    2016-01-01

    Glioblastomas are the most common malignant primary brain tumours in adults and one of the most aggressive and difficult-to-treat cancers. No effective treatment currently exists for this tumour, and new therapeutic approaches are needed for this disease. One possible innovative approach involves the nanoparticle-mediated specific delivery of drugs and/or genetic material to glioblastoma cells where they can provide therapeutic benefits. In the present work, we have synthesised and characterised several second generation amphiphilic polylysine dendrons to be used as siRNA carriers. We have found that, in addition to their siRNA binding properties, these new compounds inhibit the proliferation of two glioblastoma cell lines while being nontoxic for non-tumoural central nervous system cells like neurons and glia, cell types that share the anatomical space with glioblastoma cells during the course of the disease. The selective toxicity of these nanoparticles to glioblastoma cells, as compared to neurons and glial cells, involves mitochondrial depolarisation and reactive oxygen species production. This selective toxicity, together with the ability to complex and release siRNA, suggests that these new polylysine dendrons might offer a scaffold in the development of future nanoparticles designed to restrict the proliferation of glioblastoma cells. PMID:27832093

  12. Trophic and Non-Trophic Interactions in a Biodiversity Experiment Assessed by Next-Generation Sequencing

    PubMed Central

    Tiede, Julia; Wemheuer, Bernd; Traugott, Michael; Daniel, Rolf; Tscharntke, Teja; Ebeling, Anne; Scherber, Christoph

    2016-01-01

    Plant diversity affects species richness and abundance of taxa at higher trophic levels. However, plant diversity effects on omnivores (feeding on multiple trophic levels) and their trophic and non-trophic interactions have not yet been studied because appropriate methods were lacking. A promising approach is the DNA-based analysis of gut contents using next generation sequencing (NGS) technologies. Here, we integrate NGS-based analysis into the framework of a biodiversity experiment where plant taxonomic and functional diversity were manipulated to directly assess environmental interactions involving the omnivorous ground beetle Pterostichus melanarius. Beetle regurgitates were used for NGS-based analysis with universal 18S rDNA primers for eukaryotes. We detected a wide range of taxa with the NGS approach in regurgitates, including organisms representing trophic, phoretic, parasitic, and neutral interactions with P. melanarius. Our findings suggest that the frequency of (i) trophic interactions increased with plant diversity and vegetation cover; (ii) intraguild predation increased with vegetation cover, and (iii) neutral interactions with organisms such as fungi and protists increased with vegetation cover. Experimentally manipulated plant diversity likely affects multitrophic interactions involving omnivorous consumers. Our study therefore shows that trophic and non-trophic interactions can be assessed via NGS to address fundamental questions in biodiversity research. PMID:26859146

  13. A combinatorial approach to angiosperm pollen morphology.

    PubMed

    Mander, Luke

    2016-11-30

    Angiosperms (flowering plants) are strikingly diverse. This is clearly expressed in the morphology of their pollen grains, which are characterized by enormous variety in their shape and patterning. In this paper, I approach angiosperm pollen morphology from the perspective of enumerative combinatorics. This involves generating angiosperm pollen morphotypes by algorithmically combining character states and enumerating the results of these combinations. I use this approach to generate 3 643 200 pollen morphotypes, which I visualize using a parallel-coordinates plot. This represents a raw morphospace. To compare real-world and theoretical morphologies, I map the pollen of 1008 species of Neotropical angiosperms growing on Barro Colorado Island (BCI), Panama, onto this raw morphospace. This highlights that, in addition to their well-documented taxonomic diversity, Neotropical rainforests also represent an enormous reservoir of morphological diversity. Angiosperm pollen morphospace at BCI has been filled mostly by pollen morphotypes that are unique to single plant species. Repetition of pollen morphotypes among higher taxa at BCI reflects both constraint and convergence. This combinatorial approach to morphology addresses the complexity that results from large numbers of discrete character combinations and could be employed in any situation where organismal form can be captured by discrete morphological characters. © 2016 The Author(s).
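The morphotype-generation step the abstract describes (combine one state per character, enumerate the results) is a Cartesian product over character states. The character names and state counts below are hypothetical stand-ins, not Mander's actual scheme, which yields 3 643 200 combinations:

```python
from itertools import product

# Hypothetical pollen character/state scheme; each morphotype is one
# combination of one state per character.
characters = {
    "aperture_number": ["mono", "tri", "poly"],
    "aperture_type":   ["colpate", "porate"],
    "ornamentation":   ["psilate", "reticulate", "echinate", "striate"],
    "shape":           ["spheroidal", "prolate"],
}

states = list(characters.values())
morphotypes = [dict(zip(characters, combo)) for combo in product(*states)]
print(len(morphotypes))   # → 3 * 2 * 4 * 2 = 48 morphotypes
```

Each dict is one theoretical morphotype; mapping observed species onto the enumerated set then shows which region of the raw morphospace is actually occupied, as done for the BCI flora.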

  14. A combinatorial approach to angiosperm pollen morphology

    PubMed Central

    2016-01-01

    Angiosperms (flowering plants) are strikingly diverse. This is clearly expressed in the morphology of their pollen grains, which are characterized by enormous variety in their shape and patterning. In this paper, I approach angiosperm pollen morphology from the perspective of enumerative combinatorics. This involves generating angiosperm pollen morphotypes by algorithmically combining character states and enumerating the results of these combinations. I use this approach to generate 3 643 200 pollen morphotypes, which I visualize using a parallel-coordinates plot. This represents a raw morphospace. To compare real-world and theoretical morphologies, I map the pollen of 1008 species of Neotropical angiosperms growing on Barro Colorado Island (BCI), Panama, onto this raw morphospace. This highlights that, in addition to their well-documented taxonomic diversity, Neotropical rainforests also represent an enormous reservoir of morphological diversity. Angiosperm pollen morphospace at BCI has been filled mostly by pollen morphotypes that are unique to single plant species. Repetition of pollen morphotypes among higher taxa at BCI reflects both constraint and convergence. This combinatorial approach to morphology addresses the complexity that results from large numbers of discrete character combinations and could be employed in any situation where organismal form can be captured by discrete morphological characters. PMID:27881756

  15. Omics/systems biology and cancer cachexia.

    PubMed

    Gallagher, Iain J; Jacobi, Carsten; Tardif, Nicolas; Rooyackers, Olav; Fearon, Kenneth

    2016-06-01

    Cancer cachexia is a complex syndrome generated by interaction between the host and tumour cells, with a background of treatment effects and toxicity. The complexity of the physiological pathways likely involved in cancer cachexia necessitates a holistic view of the relevant biology. Emergent properties are characteristic of complex systems, meaning that the whole is more than the sum of its parts. Recognition of the importance of emergent properties in biology led to the concept of systems biology, wherein a holistic approach is taken to the biology at hand. Systems biology approaches will therefore play an important role in work to uncover key mechanisms with therapeutic potential in cancer cachexia. The 'omics' technologies provide a global view of biological systems. Genomics, transcriptomics, proteomics, lipidomics and metabolomics approaches all have application in the study of cancer cachexia to generate systems-level models of the behaviour of this syndrome. The current work reviews recent applications of these technologies to muscle atrophy in general and cancer cachexia in particular, with a view to progressing towards integration of these approaches to better understand the pathology and potential treatment pathways in cancer cachexia. Copyright © 2016. Published by Elsevier Ltd.

  16. Topical Collection: Climate-change research by early-career hydrogeologists

    NASA Astrophysics Data System (ADS)

    Re, Viviana; Maldaner, Carlos H.; Gurdak, Jason J.; Leblanc, Marc; Resende, Tales Carvalho; Stigter, Tibor Y.

    2018-05-01

    Scientific outreach, international networking, collaboration and adequate courses are needed in both developed and developing countries to enable early-career hydrogeologists to promote long-term multidisciplinary approaches to cope with climate-change issues and emphasize the importance of groundwater in a global strategy for adaptation. One such collaboration has involved the Early Career Hydrogeologists' Network of the International Association of Hydrogeologists (ECHN-IAH) and the UNESCO International Hydrological Programme's (IHP) Groundwater Resources Assessment under the Pressures of Humanity and Climate Changes (GRAPHIC) project. This collaboration seeks to foster the education and involvement of the future generation of water leaders in the debate over groundwater and climate change.

  17. WONOEP appraisal: new genetic approaches to study epilepsy

    PubMed Central

    Rossignol, Elsa; Kobow, Katja; Simonato, Michele; Loeb, Jeffrey A.; Grisar, Thierry; Gilby, Krista L.; Vinet, Jonathan; Kadam, Shilpa D.; Becker, Albert J.

    2014-01-01

    Objective New genetic investigation techniques, including next-generation sequencing, epigenetic profiling, cell lineage mapping, targeted genetic manipulation of specific neuronal cell types, stem cell reprogramming and optogenetic manipulations within epileptic networks are progressively unravelling the mysteries of epileptogenesis and ictogenesis. These techniques have opened new avenues to discover the molecular basis of epileptogenesis and to study the physiological impacts of mutations in epilepsy-associated genes on a multilayer level, from cells to circuits. Methods This manuscript reviews recently published applications of these new genetic technologies in the study of epilepsy, as well as work presented by the authors at the genetic session of the XII Workshop on the Neurobiology of Epilepsy in Quebec, Canada. Results Next-generation sequencing is providing investigators with an unbiased means to assess the molecular causes of sporadic forms of epilepsy and have revealed the complexity and genetic heterogeneity of sporadic epilepsy disorders. To assess the functional impact of mutations in these newly identified genes on specific neuronal cell-types during brain development, new modeling strategies in animals, including conditional genetics in mice and in utero knockdown approaches, are enabling functional validation with exquisite cell-type and temporal specificity. In addition, optogenetics, using cell-type specific Cre recombinase driver lines, is enabling investigators to dissect networks involved in epilepsy. Genetically-encoded cell-type labeling is also providing new means to assess the role of the non-neuronal components of epileptic networks such as glial cells. 
Furthermore, beyond its role in revealing coding variants involved in epileptogenesis, next-generation sequencing can be used to assess the epigenetic modifications that lead to sustained network hyperexcitability in epilepsy, including methylation changes in gene promoters and non-coding RNAs involved in modifying gene expression following seizures. In addition, genetically-based bioluminescent reporters are providing new opportunities to assess neuronal activity and neurotransmitter levels both in vitro and in vivo in the context of epilepsy. Finally, genetically rederived neurons generated from patient iPS cells and genetically-modified zebrafish have become high-throughput means to investigate disease mechanisms and potential new therapies. Significance Genetics has considerably changed the field of epilepsy research and is paving the way for better diagnosis and therapies for patients with epilepsy. PMID:24965021

  18. The relationship between violence and engagement in drug dealing and sex work among street-involved youth.

    PubMed

    Hayashi, Kanna; Daly-Grafstein, Ben; Dong, Huiru; Wood, Evan; Kerr, Thomas; DeBeck, Kora

    2016-06-27

    Street-involved youth are highly vulnerable to violence. While involvement in income-generating activities within illicit drug scenes is recognized as shaping youths' vulnerability to violence, the relative contributions of different income-generating activities remain understudied. We sought to examine the independent effects of drug dealing and sex work on experiencing violence among street-involved youth. Data were derived from a prospective cohort of street-involved youth aged 14-26 who used drugs in Vancouver, British Columbia, between September 2005 and May 2014. Multivariable generalized estimating equations were used to examine the impact of involvement in drug dealing and sex work on experiencing violence. Among 1,152 participants, including 364 (31.6%) women, 740 (64.2%) reported having experienced violence at some point during the study period. In multivariable analysis, involvement in drug dealing but not sex work remained independently associated with experiencing violence among females (adjusted odds ratio [AOR]: 1.43; 95% confidence interval [CI]: 1.08-1.90) and males (AOR: 1.50; 95% CI: 1.25-1.80), while involvement in sex work only was not associated with violence among females (AOR: 1.15; 95% CI: 0.76-1.74) or males (AOR: 1.42; 95% CI: 0.81-2.48). Findings indicate that involvement in drug dealing is a major factor associated with experiencing violence among our sample. In addition to conventional interventions, such as addiction treatment, novel approaches are needed to reduce the risk of violence for drug-using youth who are actively engaged in drug dealing. The potential for low-threshold employment and decriminalization of drug use to mitigate violence warrants further study.
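
    The study's effect sizes come from multivariable GEE models, which are beyond a short sketch, but the kind of odds ratio and Wald-style confidence interval reported above can be illustrated on a hypothetical (invented) 2x2 table:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only (not the study's data)
or_, lo, hi = odds_ratio_ci(120, 180, 80, 200)
print(f"OR={or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

    Note this is the crude (unadjusted) estimate; the adjusted odds ratios in the abstract additionally control for confounders and within-subject correlation.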

  19. The relationship between violence and engagement in drug dealing and sex work among street-involved youth

    PubMed Central

    Hayashi, Kanna; Daly-Grafstein, Ben; Dong, Huiru; Wood, Evan; Kerr, Thomas; DeBeck, Kora

    2016-01-01

    OBJECTIVES Street-involved youth are highly vulnerable to violence. While involvement in income-generating activities within illicit drug scenes is recognized as shaping youths’ vulnerability to violence, the relative contributions of different income-generating activities remain understudied. We sought to examine the independent effects of drug dealing and sex work on experiencing violence among street-involved youth. METHODS Data were derived from a prospective cohort of street-involved youth aged 14–26 who use drugs in Vancouver, Canada, between September 2005 and May 2014. Multivariable generalized estimating equations were used to examine the impact of involvement in drug dealing and sex work on experiencing violence. RESULTS Among 1,152 participants, including 364 (31.6%) women, 740 (64.2%) reported having experienced violence at some point during the study period. In multivariable analysis, involvement in drug dealing but not sex work remained independently associated with experiencing violence among females (adjusted odds ratio [AOR]: 1.43; 95% confidence interval [CI]: 1.08 – 1.90) and males (AOR: 1.50; 95% CI: 1.25 – 1.80), while involvement in sex work only was not associated with violence among females (AOR: 1.15; 95% CI: 0.76 – 1.74) or males (AOR: 1.42; 95% CI: 0.81 – 2.48). CONCLUSION Findings indicate that involvement in drug dealing is a major factor associated with experiencing violence among our sample. In addition to conventional interventions, such as addiction treatment, novel approaches are needed to reduce the risk of violence for drug-using youth who are actively engaged in drug dealing. The potential for low-threshold employment and decriminalization of drug use to mitigate violence warrants further study. PMID:27348116

  20. A latent discriminative model-based approach for classification of imaginary motor tasks from EEG data.

    PubMed

    Saa, Jaime F Delgado; Çetin, Müjdat

    2012-04-01

    We consider the problem of classification of imaginary motor tasks from electroencephalography (EEG) data for brain-computer interfaces (BCIs) and propose a new approach based on hidden conditional random fields (HCRFs). HCRFs are discriminative graphical models that are attractive for this problem because they (1) exploit the temporal structure of EEG; (2) include latent variables that can be used to model different brain states in the signal; and (3) involve learned statistical models matched to the classification task, avoiding some of the limitations of generative models. Our approach involves spatial filtering of the EEG signals and estimation of power spectra based on autoregressive modeling of temporal segments of the EEG signals. Given this time-frequency representation, we select certain frequency bands that are known to be associated with execution of motor tasks. These selected features constitute the data that are fed to the HCRF, parameters of which are learned from training data. Inference algorithms on the HCRFs are used for the classification of motor tasks. We experimentally compare this approach to the best performing methods in BCI competition IV as well as a number of more recent methods and observe that our proposed method yields better classification accuracy.
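
    The feature-extraction stage described above (autoregressive spectral estimation over temporal segments, followed by selection of motor-related frequency bands) can be sketched without the HCRF itself. The following is a minimal illustration, not the authors' implementation: Yule-Walker AR fitting via the Levinson-Durbin recursion on a toy signal, then evaluation of the AR power spectrum at chosen frequencies:

```python
import cmath
import math

def autocorr(x, maxlag):
    """Biased sample autocorrelation up to maxlag."""
    n = len(x)
    mean = sum(x) / n
    xc = [v - mean for v in x]
    return [sum(xc[t] * xc[t + k] for t in range(n - k)) / n
            for k in range(maxlag + 1)]

def yule_walker(x, order):
    """AR prediction coefficients via Levinson-Durbin on the autocorrelation."""
    r = autocorr(x, order)
    a = [0.0] * order
    e = r[0]
    for k in range(order):
        acc = r[k + 1] - sum(a[j] * r[k - j] for j in range(k))
        refl = acc / e
        new_a = a[:]
        new_a[k] = refl
        for j in range(k):
            new_a[j] = a[j] - refl * a[k - 1 - j]
        a = new_a
        e *= (1 - refl * refl)
    return a, e

def ar_power(a, noise_var, f, fs):
    """AR power spectral density (up to a constant) at frequency f (Hz)."""
    z = cmath.exp(-2j * math.pi * f / fs)
    denom = 1 - sum(a[j] * z ** (j + 1) for j in range(len(a)))
    return noise_var / abs(denom) ** 2

# Toy "EEG" segment: dominant 10 Hz rhythm plus a weak 37 Hz component
fs = 100.0
x = [math.sin(2 * math.pi * 10 * t / fs) + 0.05 * math.sin(2 * math.pi * 37 * t / fs)
     for t in range(400)]
a, e = yule_walker(x, order=6)
# Power concentrates near 10 Hz, the kind of band feature fed to the classifier
p10 = ar_power(a, e, 10.0, fs)
p30 = ar_power(a, e, 30.0, fs)
print(p10 > p30)
```

    In the actual pipeline these band powers would be computed per spatially filtered channel and per temporal segment, forming the observation sequence for the HCRF.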

  1. Integrating volcanic hazard data in a systematic approach to develop volcanic hazard maps in the Lesser Antilles

    NASA Astrophysics Data System (ADS)

    Lindsay, Jan M.; Robertson, Richard E. A.

    2018-04-01

    We report on the process of generating the first suite of integrated volcanic hazard zonation maps for the islands of Dominica, Grenada (including Kick 'em Jenny and Ronde/Caille), Nevis, Saba, St. Eustatius, St. Kitts, Saint Lucia and St Vincent in the Lesser Antilles. We developed a systematic approach that accommodated the range in prior knowledge of the volcanoes in the region. A first-order hazard assessment for each island was used to develop one or more scenario(s) of likely future activity, for which scenario-based hazard maps were generated. For the most-likely scenario on each island we also produced a poster-sized integrated volcanic hazard zonation map, which combined the individual hazardous phenomena depicted in the scenario-based hazard maps into integrated hazard zones. We document the philosophy behind the generation of this suite of maps, and the method by which hazard information was combined to create integrated hazard zonation maps, and illustrate our approach through a case study of St. Vincent. We also outline some of the challenges we faced using this approach, and the lessons we have learned by observing how stakeholders have interacted with the maps over the past 10 years. Based on our experience, we recommend that future map makers involve stakeholders in the entire map generation process, especially when making design choices such as type of base map, use of colour and gradational boundaries, and indeed what to depict on the map. We also recommend careful consideration of how to evaluate and depict offshore hazard of island volcanoes, and recommend computer-assisted modelling of all phenomena to generate more realistic hazard footprints. Finally, although our systematic approach to integrating individual hazard data into zones generally worked well, we suggest that a better approach might be to treat the integration of hazards on a case-by-case basis to ensure the final product meets map users' needs. 
We hope that the documentation of our experience might be useful for other map makers when creating new maps or updating existing ones.
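
    The map-integration step, combining per-phenomenon hazard zones into a single integrated zonation, can be caricatured as a cell-wise maximum over hazard layers. The grids and hazard levels below are invented for illustration; the actual integration described in the paper involved expert judgement, not just a maximum:

```python
# Hypothetical hazard levels 1-3 (3 = highest) for three phenomena
# on a tiny 2x3 grid of map cells.
ashfall = [
    [1, 2, 3],
    [1, 2, 2],
]
pyroclastic_flow = [
    [1, 3, 3],
    [1, 1, 2],
]
lahar = [
    [2, 2, 1],
    [3, 2, 1],
]

def integrate(*layers):
    """Integrated hazard zone per cell = maximum level across all layers."""
    rows, cols = len(layers[0]), len(layers[0][0])
    return [[max(layer[r][c] for layer in layers) for c in range(cols)]
            for r in range(rows)]

integrated = integrate(ashfall, pyroclastic_flow, lahar)
print(integrated)  # [[2, 3, 3], [3, 2, 2]]
```

    A cell-wise maximum is conservative by construction; the paper's case-by-case recommendation reflects that real integrated zones may need to weight phenomena differently.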

  2. [Generation Y healthcare students’ expectations: hard skills but also soft skills].

    PubMed

    Engels, Cynthia

    2017-12-01

    Educating Generation Y (born between 1981 and 1999) in healthcare studies raises questions about this generation’s expectations and about which teaching methods to use with them. The study involved third-year occupational therapy students. One hundred and twelve students were consulted in September 2012 and September 2013 about their expectations regarding the courses and the teaching methods on offer. The results highlight, firstly, the importance students place on the usefulness of a course and, secondly, their expectations regarding the teachers’ soft skills. While the link between soft skills and success has been pointed out in many studies, only a few focus on developing teachers’ own soft skills in order to respond to students’ expectations in higher education. This is the topic of this article, whose main interest lies in identifying ways to develop teaching methods adapted to this generation of students.

  3. Vector Potential Generation for Numerical Relativity Simulations

    NASA Astrophysics Data System (ADS)

    Silberman, Zachary; Faber, Joshua; Adams, Thomas; Etienne, Zachariah; Ruchlin, Ian

    2017-01-01

    Many different numerical codes are employed in studies of highly relativistic magnetized accretion flows around black holes. Based on the formalisms each uses, some codes evolve the magnetic field vector B, while others evolve the magnetic vector potential A, the two being related by the curl: B=curl(A). Here, we discuss how to generate vector potentials corresponding to specified magnetic fields on staggered grids, a surprisingly difficult task on finite cubic domains. The code we have developed solves this problem in two ways: a brute-force method, whose scaling is nearly linear in the number of grid cells, and a direct linear algebra approach. We discuss the success both algorithms have in generating smooth vector potential configurations and how both may be extended to more complicated cases involving multiple mesh-refinement levels. NSF ACI-1550436
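
    A quick way to see the inverse problem: for a given B, any A with curl(A) = B will do, and gauge freedom makes the answer non-unique. The sketch below (not the authors' code) checks the standard symmetric-gauge potential for a uniform field against a finite-difference curl:

```python
B0 = 2.0  # strength of a uniform magnetic field B = (0, 0, B0)

def A(x, y, z):
    """Symmetric-gauge vector potential for the uniform field: curl(A) = (0, 0, B0)."""
    return (-0.5 * B0 * y, 0.5 * B0 * x, 0.0)

def curl(F, x, y, z, h=1e-5):
    """Central-difference curl of a vector field F at the point (x, y, z)."""
    def d(comp, axis):
        p = [x, y, z]; m = [x, y, z]
        p[axis] += h; m[axis] -= h
        return (F(*p)[comp] - F(*m)[comp]) / (2 * h)
    return (d(2, 1) - d(1, 2),   # dFz/dy - dFy/dz
            d(0, 2) - d(2, 0),   # dFx/dz - dFz/dx
            d(1, 0) - d(0, 1))   # dFy/dx - dFx/dy

bx, by, bz = curl(A, 0.3, -1.2, 0.7)
print(bx, by, bz)  # approximately (0, 0, B0)
```

    On staggered grids the difficulty discussed above is the reverse direction: reconstructing a smooth A from a specified B on a finite domain, where naive line integrals can leave seams at cell faces.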

  4. Statistical Analysis of Complexity Generators for Cost Estimation

    NASA Technical Reports Server (NTRS)

    Rowell, Ginger Holmes

    1999-01-01

    Predicting the cost of cutting edge new technologies involved with spacecraft hardware can be quite complicated. A new feature of the NASA Air Force Cost Model (NAFCOM), called the Complexity Generator, is being developed to model the complexity factors that drive the cost of space hardware. This parametric approach is also designed to account for the differences in cost, based on factors that are unique to each system and subsystem. The cost driver categories included in this model are weight, inheritance from previous missions, technical complexity, and management factors. This paper explains the Complexity Generator framework, the statistical methods used to select the best model within this framework, and the procedures used to find the region of predictability and the prediction intervals for the cost of a mission.

  5. Involving the public through participatory visual research methods

    PubMed Central

    Lorenz, Laura S.; Kolb, Bettina

    2009-01-01

    Objectives  To show how providing cameras to patients and community residents can be effective at involving the public in generating understanding of consumer, community, and health system problems and strengths. Background  Health‐care institutions and systems may seek to include consumer perspectives on health and health care yet be challenged to involve the most vulnerable sectors, be they persons with disabilities or persons with low socio‐economic status living in societies where a top‐down approach to policy is the norm. Methods  Drawing on study examples using photo‐elicitation and photovoice in Morocco and the United States, the authors explore issues of planning, data analysis, ethical concerns and action related to using participatory visual methods in different cultural and political contexts. Results  Visual data generated by consumers can be surprising and can identify health system problems and strengths omitted from data gathered using other means. Statistical data may convince policy makers of the need to address a problem. Participant visual data may in turn encourage policy maker attention and action. Conclusion  Health system decision making may be improved by having a broader range of data available. Participant‐generated visual data may support data gathered using traditional methods, or provide a reality check when compared with data generated by organizations, researchers and policy makers. The two study examples model innovative ways to surface health and health‐care issues as they relate to consumers’ real lives and engage vulnerable groups in systems change, even in contexts where expressing opinions might be seen as a risky thing to do. PMID:19754690

  6. Feynman-like rules for calculating n-point correlators of the primordial curvature perturbation

    NASA Astrophysics Data System (ADS)

    Valenzuela-Toledo, César A.; Rodríguez, Yeinzon; Beltrán Almeida, Juan P.

    2011-10-01

    A diagrammatic approach to calculate n-point correlators of the primordial curvature perturbation ζ was developed a few years ago following the spirit of the Feynman rules in Quantum Field Theory. The methodology is very useful and time-saving, as it is for the case of the Feynman rules in the particle physics context, but, unfortunately, is not very well known by the cosmology community. In the present work, we extend such an approach in order to include not only scalar field perturbations as the generators of ζ, but also vector field perturbations. The purpose is twofold: first, we would like the diagrammatic approach (which we would call the Feynman-like rules) to become widespread among the cosmology community; second, we intend to give an easy tool to formulate any correlator of ζ for those cases that involve vector field perturbations and that, therefore, may generate prolonged stages of anisotropic expansion and/or important levels of statistical anisotropy. Indeed, the usual way of formulating such correlators, using Wick's theorem, may become very cluttered and time-consuming.
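
    As background for readers, such correlators are typically built from a δN-type expansion of ζ in the field perturbations. In schematic notation (which may differ from the authors'), keeping only scalar perturbations:

```latex
% Schematic deltaN-type expansion of the curvature perturbation
% (vector-field terms \delta A^a enter analogously in the extended formalism)
\zeta = N_a\,\delta\phi^a + \tfrac{1}{2}\,N_{ab}\,\delta\phi^a\,\delta\phi^b + \cdots

% Tree-level two-point correlator obtained by Wick contractions; each
% distinct contraction pattern corresponds to one diagram in the rules
\langle \zeta(\mathbf{k}_1)\,\zeta(\mathbf{k}_2)\rangle
  = N_a N_b\,\langle \delta\phi^a(\mathbf{k}_1)\,\delta\phi^b(\mathbf{k}_2)\rangle
  + \text{higher-order terms in } N_{ab}
```

    The diagrammatic rules replace the bookkeeping of these Wick contractions, which is what becomes unwieldy by hand once vector perturbations and higher n-point functions are included.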

  7. Mid-infrared supercontinuum generation in fluoride fiber amplifiers: current status and future perspectives

    NASA Astrophysics Data System (ADS)

    Gauthier, Jean-Christophe; Robichaud, Louis-Rafaël; Fortin, Vincent; Vallée, Réal; Bernier, Martin

    2018-06-01

    The quest for a compact and efficient broadband laser source able to probe the numerous fundamental molecular absorption lines in the mid-infrared (3-8 µm) for various applications has been going on for more than a decade. While robust commercial fiber-based supercontinuum (SC) systems have started to appear on the market, they still exhibit poor energy conversion into the mid-infrared (typically under 30%) and are generally not producing wavelengths exceeding 4.7 µm. Here, we present an overview of the results obtained from a novel approach to SC generation based on spectral broadening inside of an erbium-doped fluoride fiber amplifier seeded directly at 2.8 µm, allowing mid-infrared conversion efficiencies reaching up to 95% and spectral coverage approaching the transparency limit of ZrF4 (4.2 µm) and InF3 (5.5 µm) fibers. The general concept of the approach and the physical mechanisms involved are presented alongside the various configurations of the system to adjust the output characteristics in terms of spectral coverage and output power for different applications.

  8. The Biophysics of Infection.

    PubMed

    Leake, Mark C

    2016-01-01

    Our understanding of the processes involved in infection has grown enormously in the past decade due in part to emerging methods of biophysics. This new insight has been enabled through advances in interdisciplinary experimental technologies and theoretical methods at the cutting-edge interface of the life and physical sciences. For example, this has involved several state-of-the-art biophysical tools used in conjunction with molecular and cell biology approaches, which enable investigation of infection in living cells. There are also new, emerging interfacial science tools which enable significant improvements to the resolution of quantitative measurements both in space and time. These include single-molecule biophysics methods and super-resolution microscopy approaches. These new technological tools in particular have underpinned much new understanding of dynamic processes of infection at a molecular length scale. There are also many valuable recent advances in theoretical approaches of biophysics which enable advances in predictive modelling to generate new understanding of infection. Here, I discuss these advances, take stock of our knowledge of the biophysics of infection, and consider where future advances may lead.

  9. Multi Objective Optimization Using Genetic Algorithm of a Pneumatic Connector

    NASA Astrophysics Data System (ADS)

    Salaam, HA; Taha, Zahari; Ya, TMYS Tuan

    2018-03-01

    The concept of sustainability was first introduced by Dr Gro Harlem Brundtland in the 1980s, promoting the need to preserve today’s natural environment for the sake of future generations. Based on this concept, John Elkington proposed an approach to measuring sustainability known as the Triple Bottom Line (TBL). Three evaluation criteria are involved in the TBL approach: economics, environmental integrity and social equity. In the manufacturing industry, manufacturing costs measure the long-term economic sustainability of a company. Environmental integrity is a measure of the impact of manufacturing activities on the environment. Social equity is complicated to evaluate; but when the focus is at the production-floor level, production operator health can be considered. In this paper, the TBL approach is applied to the manufacturing of a pneumatic nipple hose. The evaluation criteria used are manufacturing costs, environmental impact, ergonomics impact and the energy used in manufacturing. This study involves multi-objective optimization, using a genetic algorithm, of several possible alternative materials for the manufacture of the pneumatic nipple.
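
    Independent of the specific genetic algorithm used, multi-objective optimization of this kind rests on Pareto dominance across the TBL-derived criteria. A minimal sketch with invented material alternatives and scores (all objectives minimized; not the paper's data):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective (minimization)
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated solutions."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# Hypothetical design alternatives scored on
# (cost, environmental impact, ergonomic risk, energy) -- all to be minimized.
alternatives = {
    "brass":     (4.0, 3.0, 2.0, 5.0),
    "aluminium": (3.0, 2.0, 2.0, 4.0),
    "steel":     (2.5, 4.0, 3.0, 6.0),
    "polymer":   (2.0, 1.5, 4.0, 3.0),
}
front = pareto_front(list(alternatives.values()))
names = [n for n, v in alternatives.items() if v in front]
print(names)  # brass is dominated by aluminium on every objective
```

    A genetic algorithm such as NSGA-II repeatedly applies this dominance test to rank candidate designs while maintaining diversity along the front.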

  10. Model compilation: An approach to automated model derivation

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Baudin, Catherine; Iwasaki, Yumi; Nayak, Pandurang; Tanaka, Kazuo

    1990-01-01

    An approach is introduced to automated model derivation for knowledge based systems. The approach, model compilation, involves procedurally generating the set of domain models used by a knowledge based system. With an implemented example, it is illustrated how this approach can be used to derive models of different precision and abstraction, tailored to different tasks, from a given set of base domain models. In particular, two implemented model compilers are described, each of which takes as input a base model that describes the structure and behavior of a simple electromechanical device, the Reaction Wheel Assembly of NASA's Hubble Space Telescope. The compilers transform this relatively general base model into simple task-specific models for troubleshooting and redesign, respectively, by applying a sequence of model transformations. Each transformation in this sequence produces an increasingly more specialized model. The compilation approach lessens the burden of updating and maintaining consistency among models by enabling their automatic regeneration.
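
    The idea of a compiler as a sequence of model transformations, each producing a more specialized model, can be sketched as follows. The base model, attributes, and transformations here are invented stand-ins, not the actual Reaction Wheel Assembly models:

```python
# Toy base model: components with behavioral and quantitative attributes.
base_model = {
    "components": {"wheel": {"behavior": "rotates", "friction": 0.02},
                   "motor": {"behavior": "drives wheel", "friction": 0.01}},
    "detail": "full",
}

def drop_attribute(attr):
    """Transformation: remove a quantitative attribute from every component."""
    def transform(model):
        return {"components": {n: {k: v for k, v in c.items() if k != attr}
                               for n, c in model["components"].items()},
                "detail": model["detail"]}
    return transform

def mark_detail(level):
    """Transformation: record the abstraction level of the derived model."""
    def transform(model):
        out = dict(model)
        out["detail"] = level
        return out
    return transform

def compile_model(model, transforms):
    """Apply the transformation sequence; each step specializes the model."""
    for t in transforms:
        model = t(model)
    return model

# Derive a qualitative troubleshooting model from the general base model
troubleshooting = compile_model(
    base_model, [drop_attribute("friction"), mark_detail("qualitative")])
print(troubleshooting["detail"])
```

    Because derived models are regenerated from the base model rather than edited by hand, a change to the base model propagates automatically, which is the consistency benefit the abstract describes.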

  11. Agent-Centric Approach for Cybersecurity Decision-Support with Partial Observability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tipireddy, Ramakrishna; Chatterjee, Samrat; Paulson, Patrick R.

    Generating automated cyber resilience policies for real-world settings is a challenging research problem that must account for uncertainties in system state over time and dynamics between attackers and defenders. In addition to understanding attacker and defender motives and tools, and identifying “relevant” system and attack data, it is also critical to develop rigorous mathematical formulations representing the defender’s decision-support problem under uncertainty. Game-theoretic approaches involving cyber resource allocation optimization with Markov decision processes (MDP) have been previously proposed in the literature. Moreover, advancements in reinforcement learning approaches have motivated the development of partially observable stochastic games (POSGs) in various multi-agent problem domains with partial information. Recent advances in cyber-system state space modeling have also generated interest in the potential applicability of POSGs for cybersecurity. However, as is the case in strategic card games such as poker, research challenges using game-theoretic approaches for practical cyber defense applications include: 1) solving for equilibrium and designing efficient algorithms for large-scale, general problems; 2) establishing mathematical guarantees that equilibrium exists; 3) handling the possible existence of multiple equilibria; and 4) exploitation of opponent weaknesses. Inspired by advances in solving strategic card games while acknowledging practical challenges associated with the use of game-theoretic approaches in cyber settings, this paper proposes an agent-centric approach for cybersecurity decision-support with partial system state observability.
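
    As a concrete anchor for the MDP formulation mentioned above, here is a toy two-state defender MDP solved by value iteration; the states, actions, transition probabilities, and rewards are all invented for illustration:

```python
# states: "secure", "compromised"; actions: "monitor", "patch"
P = {  # P[s][a] = list of (next_state, probability)
    "secure":      {"monitor": [("secure", 0.9), ("compromised", 0.1)],
                    "patch":   [("secure", 0.99), ("compromised", 0.01)]},
    "compromised": {"monitor": [("compromised", 1.0)],
                    "patch":   [("secure", 0.7), ("compromised", 0.3)]},
}
R = {"secure":      {"monitor": 1.0, "patch": 0.5},   # patching costs some utility
     "compromised": {"monitor": -2.0, "patch": -1.0}}
gamma = 0.9  # discount factor

# Value iteration: repeatedly apply the Bellman optimality backup
V = {s: 0.0 for s in P}
for _ in range(500):
    V = {s: max(R[s][a] + gamma * sum(p * V[t] for t, p in P[s][a])
                for a in P[s])
         for s in P}

# Greedy policy with respect to the converged values
policy = {s: max(P[s], key=lambda a, s=s: R[s][a] +
                 gamma * sum(p * V[t] for t, p in P[s][a]))
          for s in P}
print(policy)
```

    The partial-observability extension the passage points to replaces the known state s with a belief distribution over states, which is what makes POSG solutions so much harder than this fully observed case.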

  12. Detecting representative data and generating synthetic samples to improve learning accuracy with imbalanced data sets.

    PubMed

    Li, Der-Chiang; Hu, Susan C; Lin, Liang-Sian; Yeh, Chun-Wu

    2017-01-01

    It is difficult for learning models to achieve high classification performance with imbalanced data sets: when one of the classes is much larger than the others, most machine learning and data mining classifiers are overly influenced by the larger classes and ignore the smaller ones. As a result, the classification algorithms often have poor learning performance due to slow convergence in the smaller classes. To balance such data sets, this paper presents a strategy that involves reducing the size of the majority data and generating synthetic samples for the minority data. In the reducing operation, we use the box-and-whisker plot approach to exclude outliers and the Mega-Trend-Diffusion method to find representative data from the majority data. To generate the synthetic samples, we propose a counterintuitive hypothesis to find the distributed shape of the minority data, and then produce samples according to this distribution. Four real datasets were used to examine the performance of the proposed approach. We used paired t-tests to compare the Accuracy, G-mean, and F-measure scores of the proposed data pre-processing (PPDP) method merged with the D3C method (PPDP+D3C) with those of one-sided selection (OSS), the well-known SMOTEBoost (SB) method, the normal distribution-based oversampling (NDO) approach, and the PPDP method alone. The results indicate that the classification performance of the proposed approach is better than that of the above-mentioned methods.
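
    For contrast with the paper's distribution-based generation, a common baseline for synthesizing minority samples is SMOTE-style interpolation between a point and one of its nearest minority-class neighbours. A stdlib-only sketch on invented data (this is the baseline idea, not the proposed PPDP method):

```python
import random

def synthesize(minority, n_new, k=2, seed=0):
    """SMOTE-style oversampling: each synthetic point is interpolated
    between a random minority sample and one of its k nearest
    minority-class neighbours."""
    rng = random.Random(seed)

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    out = []
    for _ in range(n_new):
        base = rng.choice(minority)
        neighbours = sorted((p for p in minority if p is not base),
                            key=lambda p: dist(base, p))[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # position along the segment base -> neighbour
        out.append(tuple(b + gap * (n - b) for b, n in zip(base, nb)))
    return out

# Invented 2-D minority-class points
minority = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1), (1.1, 1.3)]
new_points = synthesize(minority, n_new=6)
print(len(new_points))
```

    Because each synthetic point lies on a segment between two existing minority points, the generated data stay inside the minority region; methods like the paper's instead model the class's distributional shape and can extend beyond the convex hull of the observed samples.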

  13. Need to Knowledge (NtK) Model: an evidence-based framework for generating technological innovations with socio-economic impacts.

    PubMed

    Flagg, Jennifer L; Lane, Joseph P; Lockett, Michelle M

    2013-02-15

    Traditional government policies suggest that upstream investment in scientific research is necessary and sufficient to generate technological innovations. The expected downstream beneficial socio-economic impacts are presumed to occur through non-government market mechanisms. However, there is little quantitative evidence for such a direct and formulaic relationship between public investment at the input end and marketplace benefits at the impact end. Instead, the literature demonstrates that the technological innovation process involves a complex interaction between multiple sectors, methods, and stakeholders. The authors theorize that accomplishing the full process of technological innovation in a deliberate and systematic manner requires an operational-level model encompassing three underlying methods, each designed to generate knowledge outputs in different states: scientific research generates conceptual discoveries; engineering development generates prototype inventions; and industrial production generates commercial innovations. Given the critical roles of engineering and business, the entire innovation process should continuously consider the practical requirements and constraints of the commercial marketplace. The Need to Knowledge (NtK) Model encompasses the activities required to successfully generate innovations, along with associated strategies for effectively communicating knowledge outputs in all three states to the various stakeholders involved. It is intentionally grounded in evidence drawn from academic analysis to facilitate objective and quantitative scrutiny, and industry best practices to enable practical application. The Need to Knowledge (NtK) Model offers a practical, market-oriented approach that avoids the gaps, constraints and inefficiencies inherent in undirected activities and disconnected sectors. 
The NtK Model is a means to realizing increased returns on public investments in those science and technology programs expressly intended to generate beneficial socio-economic impacts.

  14. Need to Knowledge (NtK) Model: an evidence-based framework for generating technological innovations with socio-economic impacts

    PubMed Central

    2013-01-01

Background: Traditional government policies suggest that upstream investment in scientific research is necessary and sufficient to generate technological innovations. The expected downstream beneficial socio-economic impacts are presumed to occur through non-government market mechanisms. However, there is little quantitative evidence for such a direct and formulaic relationship between public investment at the input end and marketplace benefits at the impact end. Instead, the literature demonstrates that the technological innovation process involves a complex interaction between multiple sectors, methods, and stakeholders. Discussion: The authors theorize that accomplishing the full process of technological innovation in a deliberate and systematic manner requires an operational-level model encompassing three underlying methods, each designed to generate knowledge outputs in different states: scientific research generates conceptual discoveries; engineering development generates prototype inventions; and industrial production generates commercial innovations. Given the critical roles of engineering and business, the entire innovation process should continuously consider the practical requirements and constraints of the commercial marketplace. The Need to Knowledge (NtK) Model encompasses the activities required to successfully generate innovations, along with associated strategies for effectively communicating knowledge outputs in all three states to the various stakeholders involved. It is intentionally grounded in evidence drawn from academic analysis to facilitate objective and quantitative scrutiny, and industry best practices to enable practical application. Summary: The Need to Knowledge (NtK) Model offers a practical, market-oriented approach that avoids the gaps, constraints and inefficiencies inherent in undirected activities and disconnected sectors.
The NtK Model is a means to realizing increased returns on public investments in those science and technology programs expressly intended to generate beneficial socio-economic impacts. PMID:23414369

  15. Optimized breeding strategies for multiple trait integration: II. Process efficiency in event pyramiding and trait fixation.

    PubMed

    Peng, Ting; Sun, Xiaochun; Mumm, Rita H

    2014-01-01

Multiple trait integration (MTI) is a multi-step process of converting an elite variety/hybrid for value-added traits (e.g. transgenic events) through backcross breeding. From a breeding standpoint, MTI involves four steps: single event introgression, event pyramiding, trait fixation, and version testing. This study explores the feasibility of marker-aided backcross conversion of a target maize hybrid for 15 transgenic events in light of the overall goal of MTI: recovering equivalent performance in the finished hybrid conversion along with reliable expression of the value-added traits. Building on the results of optimizing single event introgression (Peng et al., Optimized breeding strategies for multiple trait integration: I. Minimizing linkage drag in single event introgression, Mol Breed, 2013), which produced single event conversions of recurrent parents (RPs) with ≤8 cM of residual non-recurrent parent (NRP) germplasm and ~1 cM of NRP germplasm in the 20 cM regions flanking the event, this study focused on optimizing process efficiency in the second and third steps of MTI: event pyramiding and trait fixation. Using computer simulation and probability theory, we aimed to (1) identify an optimal breeding strategy for pyramiding eight events into the female RP and seven into the male RP, and (2) identify optimal breeding strategies for trait fixation to create a 'finished' conversion of each RP homozygous for all events. In addition, next-generation seed needs were taken into account for a practical approach to process efficiency. Building on work by Ishii and Yonezawa (Optimization of the marker-based procedures for pyramiding genes from multiple donor lines: I. Schedule of crossing between the donor lines. Crop Sci 47:537-546, 2007a), a symmetric crossing schedule for event pyramiding was devised for stacking eight (seven) events in a given RP.
Options for trait fixation breeding strategies considered selfing and doubled haploid approaches to achieve homozygosity, as well as seed chipping and tissue sampling approaches to facilitate genotyping. With selfing approaches, two generations of selfing rather than one were utilized for trait fixation (i.e. 'F2 enrichment' as per Bonnett et al. in Strategies for efficient implementation of molecular markers in wheat breeding. Mol Breed 15:75-85, 2005) to eliminate bottlenecking due to the extremely low frequencies of desired genotypes in the population. Efficiency indicators such as the total number of plants grown across generations, total number of marker data points, total number of generations, number of seeds sampled by seed chipping, number of plants requiring tissue sampling, and number of pollinations (i.e. selfing and crossing) were considered in comparisons of breeding strategies. A breeding strategy involving seed chipping and a two-generation selfing approach (SC + SELF) was determined to be the most efficient in terms of time to market and resource requirements. Doubled haploidy may have limited utility in trait fixation for MTI under the defined breeding scenario. This outcome paves the way for optimizing the last step in the MTI process, version testing, which involves hybridization of the female and male RP conversions to create versions of the converted hybrid for performance evaluation and possible commercial release.
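The trait-fixation bottleneck and the rationale for two-generation selfing can be sketched with elementary Mendelian probabilities. The functions below assume n unlinked events, each heterozygous in the plant being selfed — a textbook simplification for illustration, not the study's simulation model:

```python
# Mendelian probability sketch of the trait-fixation bottleneck.
# Assumes n unlinked, independently segregating events, each heterozygous
# in the selfed parent (an illustrative simplification, not the study's
# breeding scenario).

def f2_all_homozygous(n_events: int) -> float:
    """Frequency of F2 plants homozygous for all n events at once: (1/4)^n."""
    return 0.25 ** n_events

def f2_enriched(n_events: int) -> float:
    """'F2 enrichment': frequency of F2 plants carrying at least one copy of
    every event, (3/4)^n; homozygosity is then fixed in a second selfing
    generation rather than demanded in a single step."""
    return 0.75 ** n_events

if __name__ == "__main__":
    for n in (4, 8):
        print(n, f2_all_homozygous(n), f2_enriched(n))
```

For eight events the all-homozygous F2 frequency is 0.25^8 ≈ 1.5 × 10⁻⁵, whereas plants carrying at least one copy of every event occur at 0.75^8 ≈ 0.10, which is why enriching first and fixing homozygosity in a second generation avoids impractically large populations.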

  16. Synthesis and Labeling of RNA In Vitro

    PubMed Central

    Huang, Chao; Yu, Yi-Tao

    2013-01-01

    This unit discusses several methods for generating large amounts of uniformly labeled, end-labeled, and site-specifically labeled RNAs in vitro. The methods involve a number of experimental procedures, including RNA transcription, 5′ dephosphorylation and rephosphorylation, 3′ terminal nucleotide addition (via ligation), site-specific RNase H cleavage directed by 2′-O-methyl RNA-DNA chimeras, and 2-piece splint ligation. The applications of these RNA radiolabeling approaches are also discussed. PMID:23547015

  17. Development and Evaluation of Positioning Systems for Autonomous Vehicle Navigation

    DTIC Science & Technology

    2001-12-01

    generation of autonomous vehicles to utilize NTV technology is built on a commercially-available vehicle built by ASV. The All-Purpose Remote Transport...larger scale, AFRL and CIMAR are involved in the development of a standard approach in the design and specification of autonomous vehicles being...1996. Shi92 Shin, D.H., Sanjiv, S., and Lee, J.J., “Explicit Path Tracking by Autonomous Vehicles ,” Robotica, 10, (1992), 69-87. Ste95

  18. [Cancer immunotherapy. Importance of overcoming immune suppression].

    PubMed

    Malvicini, Mariana; Puchulo, Guillermo; Matar, Pablo; Mazzolini, Guillermo

    2010-01-01

    Increasing evidence indicates that the immune system is involved in the control of tumor progression. Effective antitumor immune response depends on the interaction between several components of the immune system, including antigen-presenting cells and different T cell subsets. However, tumor cells develop a number of mechanisms to escape recognition and elimination by the immune system. In this review we discuss these mechanisms and address possible therapeutic approaches to overcome the immune suppression generated by tumors.

  19. Scenario Generation and Assessment Framework Solution in Support of the Comprehensive Approach

    DTIC Science & Technology

    2010-04-01

    attention, stress, fatigue etc.) and neurofeedback tracking for evaluation in a qualitative manner the real involvement of the trained participants in CAX...Series, Softrade, 2006 (in Bulgarian). [11] Minchev Z., Dukov G., Georgiev S. EEG Spectral Analysis in Serious Gaming: An Ad Hoc Experimental...Nonlinear and linear forecasting of the EEG time series, Biological Cybernetics, 66, 221-259, 1991. [20] Schubert, J., Svenson, P., and Mårtenson, Ch

  20. Aqueous sulfate separation by crystallization of sulfate–water clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Custelcean, Radu; Williams, Neil J.; Seipp, Charles A.

An effective approach to separating sulfates from aqueous solutions is based on the crystallization of extended [SO₄(H₂O)₅²⁻]ₙ sulfate–water clusters with a bis(guanidinium) ligand. The ligand was generated in situ by hydrazone condensation in water, thus avoiding elaborate syntheses, tedious purifications, and organic solvents. Crystallization of sulfate–water clusters represents an alternative to the now established sulfate separation strategies that involve encapsulating the “naked” anion.

  1. Aqueous sulfate separation by crystallization of sulfate–water clusters

    DOE PAGES

    Custelcean, Radu; Williams, Neil J.; Seipp, Charles A.

    2015-08-07

An effective approach to separating sulfates from aqueous solutions is based on the crystallization of extended [SO₄(H₂O)₅²⁻]ₙ sulfate–water clusters with a bis(guanidinium) ligand. The ligand was generated in situ by hydrazone condensation in water, thus avoiding elaborate syntheses, tedious purifications, and organic solvents. Crystallization of sulfate–water clusters represents an alternative to the now established sulfate separation strategies that involve encapsulating the “naked” anion.

  2. Alternate approaches to future electron-positron linear colliders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loew, G.A.

    1998-07-01

The purpose of this article is two-fold: to review the current international status of various design approaches to the next generation of e⁺e⁻ linear colliders, and on the occasion of his 80th birthday, to celebrate Richard B. Neal's many contributions to the field of linear accelerators. As it turns out, combining these two tasks is a rather natural enterprise because of Neal's long professional involvement and insight into many of the problems and options which the international e⁺e⁻ linear collider community is currently studying to achieve a practical design for a future machine.

  3. Protein unfolding under isometric tension-what force can integrins generate, and can it unfold FNIII domains?

    PubMed

    Erickson, Harold P

    2017-02-01

Extracellular matrix fibrils of fibronectin (FN) are highly elastic, and are typically stretched to three to four times their relaxed length. The mechanism of stretching has been controversial, in particular whether it involves tension-induced unfolding of FNIII domains. Recent studies have found that ∼5 pN is the threshold isometric force for unfolding various protein domains. FNIII domains should therefore not be unfolded until the tension approaches 5 pN. Integrins have been reported to generate forces ranging from 1 to >50 pN, but I argue that the studies reporting 1-2 pN are the most convincing. This is not enough to unfold FNIII domains. Even if domains were unfolded, 2 pN would only extend the worm-like chain to about twice the length of the folded domain. Overall I conclude that stretching FN matrix fibrils involves primarily the compact-to-extended conformational change of FN dimers, with minimal contribution from unfolding of FNIII domains. Copyright © 2016 Elsevier Ltd. All rights reserved.
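The worm-like-chain argument above can be checked numerically with the standard Marko-Siggia interpolation formula; the thermal energy and persistence length below are typical literature values assumed for illustration, not parameters taken from this paper:

```python
# Worm-like-chain (Marko-Siggia) force-extension sketch for the argument
# above. Parameter values are typical literature numbers assumed for
# illustration, not taken from the paper.

KT = 4.11   # thermal energy at room temperature, pN*nm
P = 0.4     # persistence length of an unfolded polypeptide, nm (assumed)

def wlc_force(z: float) -> float:
    """Marko-Siggia interpolation: force (pN) at relative extension z = x/L."""
    return (KT / P) * (0.25 / (1.0 - z) ** 2 - 0.25 + z)

def relative_extension(force_pn: float) -> float:
    """Invert wlc_force by bisection on z in (0, 1); wlc_force is monotone."""
    lo, hi = 0.0, 0.999999
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if wlc_force(mid) < force_pn:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    z = relative_extension(2.0)
    print(f"relative extension x/L at 2 pN ~ {z:.2f}")
```

With these assumed parameters the relative extension at 2 pN comes out near 0.12, i.e. a small fraction of the unfolded contour length; the absolute length reached depends on the contour and persistence lengths one assumes for the unfolded domain.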

  4. Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation.

    PubMed

    Das, Rahul Deb; Winter, Stephan

    2016-11-23

Understanding travel behavior is critical for effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers' smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries and for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., a drop in speed. However, these criteria are heuristic, and, thus, existing approaches are subjective and involve significant vagueness and uncertainty in activity transitions in space and time. Segmentation approaches are also not suited to real-time interpretation of open-ended segments, and cannot cope with the frequent gaps in location traces. In order to address these challenges, a novel, state-based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and iterates progressively until a new state is found. The research investigates how an atomic state-based approach can be developed so that it works in real-time, near-real-time and offline modes and under different environmental conditions with their varying quality of sensor traces. The results show that the proposed bottom-up model outperforms existing event-based segmentation models in terms of adaptivity, flexibility, accuracy and richness of information delivery pertinent to automated travel behavior interpretation.
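The atomic-segment idea can be illustrated as follows: fix a small window, label each window with a coarse movement state from its mean speed, and grow a segment while the state persists. The states and speed thresholds below are invented for illustration and are not the authors' model:

```python
# Illustrative bottom-up segmentation: fixed atomic windows, each labelled
# with a coarse movement state from mean speed, merged while the state
# persists. The states and thresholds are invented for illustration; they
# are not the thresholds of the paper's model.

def label_state(mean_speed: float) -> str:
    """Coarse state from mean speed in m/s (illustrative thresholds)."""
    if mean_speed < 1.0:
        return "still"
    if mean_speed < 2.5:
        return "walk"
    return "vehicle"

def segment(speeds, window=5):
    """Split a speed trace into (state, start, end) segments of atomic windows."""
    segments = []
    for start in range(0, len(speeds), window):
        chunk = speeds[start:start + window]
        state = label_state(sum(chunk) / len(chunk))
        if segments and segments[-1][0] == state:
            # same state as the previous atomic segment: extend it
            segments[-1] = (state, segments[-1][1], start + len(chunk))
        else:
            segments.append((state, start, start + len(chunk)))
    return segments

if __name__ == "__main__":
    trace = [0.2] * 10 + [1.8] * 10 + [8.0] * 10  # still -> walk -> vehicle
    print(segment(trace))
```

Because each atomic window is classified on its own, the same loop can run online over an open-ended stream, which is the property the event-based segmentations above lack.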

  5. Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation

    PubMed Central

    Das, Rahul Deb; Winter, Stephan

    2016-01-01

Understanding travel behavior is critical for effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers’ smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries and for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., a drop in speed. However, these criteria are heuristic, and, thus, existing approaches are subjective and involve significant vagueness and uncertainty in activity transitions in space and time. Segmentation approaches are also not suited to real-time interpretation of open-ended segments, and cannot cope with the frequent gaps in location traces. In order to address these challenges, a novel, state-based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and iterates progressively until a new state is found. The research investigates how an atomic state-based approach can be developed so that it works in real-time, near-real-time and offline modes and under different environmental conditions with their varying quality of sensor traces. The results show that the proposed bottom-up model outperforms existing event-based segmentation models in terms of adaptivity, flexibility, accuracy and richness of information delivery pertinent to automated travel behavior interpretation. PMID:27886053

  6. Innovating for cash.

    PubMed

    Andrew, James P; Sirkin, Harold L

    2003-09-01

    Despite companies' almost fanatical worship of innovation, most new products don't generate money. That's because executives don't realize that the approach they take to commercializing a new product is as important as the innovation itself. Different approaches can generate very different levels of profit. Companies tend to favor one of three different innovation approaches, each with its own investment profile, profitability pattern, risk profile, and skill requirements. Most organizations are instinctively integrators: They manage all the steps needed to take a product to market themselves. Organizations can also choose to be orchestrators: They focus on some parts of the commercialization process and depend on partners to manage the rest. And finally, companies can be licensers: They sell or license a new product or idea to another organization that handles the commercialization process. Different innovations require different approaches. Selecting the most suitable approach, the authors' research found, often yields two or three times the profits of the least optimal approach. Yet companies tend to rely only on the mode most familiar to them. Executives would do better to take several different factors into account before deciding which tack to take, including the industry they're trying to enter, the specific characteristics of the innovation, and the risks involved in taking the product to market. By doing so, companies can match the approach to the opportunity and reap the maximum profit. Choosing the wrong approach, like Polaroid did, for example, can lead to the failure of both the product and the company. Optimizing their approaches, as Whirlpool has done, helps ensure that companies' innovations make money.

  7. Examining Involvement as a Critical Factor: Perceptions from First Generation and Non-First Generation College Students

    ERIC Educational Resources Information Center

    Davenport, Mona Yvette

    2010-01-01

    This study tested the perceptions of involvement components (Non-Academic Facility Usage, Intra-Racial Relations, Campus and Charleston Involvement, Faculty Interaction, Academic Facility Usage, Inter-Racial Relations, Cultural Center Usage, and Athletic Facilities Usage) for first generation and non-first generation African American and Hispanic…

  8. Posterior versus Frontal Theta Activity Indexes Approach Motivation during Affective Autobiographical Memories

    PubMed Central

    Walden, Keegan; Pornpattananangkul, Narun; Curlee, Alexandria; McAdams, Dan P.; Nusslock, Robin

    2016-01-01

    Research has recently identified a promising neurophysiological marker of approach motivation involving posterior versus frontal (Pz-Fz) electroencephalographic (EEG) theta activity (PFTA; Wacker, Chavanon, & Stemmler, 2006). Preliminary evidence indicates that PFTA is modulated by dopaminergic activity thought to underlie appetitive tendencies, and that it indexes self-reported Behavioral Approach System (BAS) sensitivity. To date, research has largely relied on resting indices of PFTA and has yet to examine the relationship between PFTA and specific approach-related affective states generated by emotionally salient laboratory tasks. Accordingly, the present study evaluated PFTA both at rest and during an ecologically valid autobiographical memory task in which participants recalled personal life experiences involving a goal-striving, an anxious apprehension, a low-point (i.e., difficult) and a neutral memory while EEG data were recorded. In line with prediction, elevated PFTA was observed during both goal-striving and anxious apprehension autobiographical memories. PFTA was particularly elevated during anxious apprehension memories coded as being high on approach-related tendencies. Elevated PFTA during anxious apprehension is consistent with a growing literature indicating that anxious apprehension is associated with elevated approach and reward-related brain function. Lastly, elevated resting PFTA was positively correlated with self-reported trait anger, a negatively valenced emotion characterized by approach-related tendencies. Results have implications for a) enhancing our understanding of the neurophysiology of approach-related emotions, b) establishing PFTA as an index of appetitive motivational states, and c) clarifying our understanding of the neurophysiology and approach-related tendencies associated with both anxious apprehension and anger. PMID:25245178

  9. Generation of SMURF2 knockout human cells using the CRISPR/Cas9 system.

    PubMed

    Manikoth Ayyathan, Dhanoop; Ilić, Nataša; Gil-Henn, Hava; Blank, Michael

    2017-08-15

    The HECT domain E3 ubiquitin ligase SMURF2 regulates stability of several key protein targets involved in tumorigenesis, cell proliferation, migration, differentiation, and senescence. While altered levels and aberrant cellular distribution of SMURF2 were reported in different types of cancer, its role in tumorigenesis is far from understood. To elucidate the role of SMURF2 in cancer, appropriate human cancer cell models are needed. Here, we describe approaches that can be used to generate human normal and cancer cell strains knocked-out for SMURF2 using the clustered regularly interspaced short palindromic repeats (CRISPR/Cas9) gene-editing technology. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Nonlinear, non-stationary image processing technique for eddy current NDE

    NASA Astrophysics Data System (ADS)

    Yang, Guang; Dib, Gerges; Kim, Jaejoon; Zhang, Lu; Xin, Junjun; Udpa, Lalita

    2012-05-01

Automatic analysis of eddy current (EC) data has facilitated the analysis of the large volumes of data generated in the inspection of steam generator tubes in nuclear power plants. The traditional procedure for analysis of EC data includes data calibration, pre-processing, region of interest (ROI) detection, feature extraction and classification. Accurate ROI detection has been enhanced by pre-processing, which involves reducing noise and other undesirable components as well as enhancing defect indications in the raw measurement. This paper presents the Hilbert-Huang Transform (HHT) for feature extraction and a support vector machine (SVM) for classification. The performance is shown to be significantly better than that of the existing rule-based classification approach used in industry.
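The feature-extraction-plus-classification stage can be sketched on synthetic signals. In the sketch below, coarse FFT-band energies stand in for the paper's Hilbert-Huang features and a nearest-centroid rule stands in for the SVM, so it needs only NumPy; both substitutions are simplifications for illustration, not the paper's method, and the data are synthetic:

```python
# Sketch of feature extraction + classification on synthetic eddy-current-
# like traces. FFT-band energies replace HHT features, and nearest-centroid
# replaces the SVM (both stated simplifications, not the paper's method).
import numpy as np

rng = np.random.default_rng(0)

def make_signal(defect: bool, n=256):
    """Noise trace, plus a localized oscillatory burst when a defect is present."""
    t = np.linspace(0.0, 1.0, n)
    sig = 0.1 * rng.standard_normal(n)
    if defect:
        sig += np.exp(-((t - 0.5) ** 2) / 0.002) * np.sin(2 * np.pi * 40 * t)
    return sig

def features(sig):
    """Energies in 8 coarse frequency bands of the power spectrum."""
    spec = np.abs(np.fft.rfft(sig)) ** 2
    return np.array([b.sum() for b in np.array_split(spec, 8)])

X = np.array([features(make_signal(i % 2 == 1)) for i in range(200)])
y = np.array([i % 2 for i in range(200)])

# Train: one centroid per class on the first 150 traces; classify the rest
# by nearest centroid in feature space.
centroids = np.array([X[:150][y[:150] == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((X[150:, None, :] - centroids) ** 2).sum(axis=2), axis=1)
acc = (pred == y[150:]).mean()
print("held-out accuracy:", acc)
```

With real HHT features, the band energies would be replaced by statistics of the intrinsic mode functions, and the centroid rule by a trained SVM; the pipeline shape (signal → feature vector → classifier) is the same.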

  11. Improving the flow representation in a stochastic programming model for hydropower operations in Chile

    NASA Astrophysics Data System (ADS)

    Morales, Y.; Olivares, M. A.; Vargas, X.

    2015-12-01

This research aims to improve the representation of stochastic water inflows to hydropower plants used in a grid-wide, power production scheduling model in central Chile. The model prescribes the operation of every plant in the system, including hydropower plants located in several basins, and uses stochastic dual dynamic programming (SDDP) with possible inflow scenarios defined from historical records. Each year of record is treated as a sample of weekly inflows to power plants, assuming this intrinsically incorporates spatial and temporal correlations, without any further autocorrelation analysis of the hydrological time series. However, standard good practice suggests the use of synthetic flows instead of raw historical records. The proposed approach generates synthetic inflow scenarios based on hydrological modeling of a few basins in the system and transposition of flows to other basins within so-called homogeneous zones. The hydrologic models use precipitation and temperature as inputs, and therefore this approach requires producing samples of those variables. Development and calibration of these models imply a greater demand of time compared to the purely statistical approach to synthetic flows. The approach also requires consideration of the main water uses in the basins: agriculture and hydroelectricity. Moreover, a geostatistical analysis of the area is performed to generate a map that identifies the relationship between the points where the hydrological information is generated and other points of interest within the power system. Consideration of homogeneous zones reduces the effort required to generate information compared with hydrological modeling of every point of interest. It is important to emphasize that future scenarios are derived through a probabilistic approach that incorporates the features of the hydrological year type (dry, normal or wet), covering the different possibilities in terms of availability of water resources. We present results for the Maule basin in Chile's Central Interconnected System (SIC).
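Transposing flows from a gauged basin to an ungauged point of interest inside a homogeneous zone is commonly done with a drainage-area ratio. The sketch below shows that standard technique; the areas and the unit exponent are illustrative assumptions, not values from the study:

```python
# Minimal sketch of flow transposition within a "homogeneous zone":
# scale the weekly inflow series of a gauged basin to an ungauged point
# of interest by a drainage-area ratio. The areas and the unit exponent
# are illustrative assumptions, not values from the study.

def transpose_flows(gauged_flows, gauged_area_km2, target_area_km2, exponent=1.0):
    """Q_target = Q_gauged * (A_target / A_gauged) ** exponent."""
    ratio = (target_area_km2 / gauged_area_km2) ** exponent
    return [q * ratio for q in gauged_flows]

if __name__ == "__main__":
    weekly = [120.0, 95.0, 80.0]  # m^3/s at the gauged basin
    # half the drainage area -> half the flow (with exponent 1)
    print(transpose_flows(weekly, 1500.0, 750.0))
```

In practice the exponent is often calibrated regionally rather than fixed at 1, which is one reason the homogeneous-zone delineation from the geostatistical analysis matters.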

  12. Factors controlling volume errors through 2D gully erosion assessment: guidelines for optimal survey design

    NASA Astrophysics Data System (ADS)

    Castillo, Carlos; Pérez, Rafael

    2017-04-01

The assessment of gully erosion volumes is essential for the quantification of soil losses derived from this relevant degradation process. Traditionally, 2D and 3D approaches have been applied for this purpose (Casalí et al., 2006). Although innovative 3D approaches have recently been proposed for gully volume quantification, a renewed interest can be found in the literature regarding the useful information that cross-section analysis still provides in gully erosion research. Moreover, the application of methods based on 2D approaches can be the most cost-effective option in many situations, such as preliminary studies with low accuracy requirements or surveys under time or budget constraints. The main aim of this work is to examine the key factors controlling volume error variability in 2D gully assessment by means of a stochastic experiment involving a Monte Carlo analysis over synthetic gully profiles, in order to (1) contribute to a better understanding of the drivers and magnitude of gully erosion 2D-survey uncertainty and (2) provide guidelines for optimal survey designs. Owing to the stochastic properties of error generation in 2D volume assessment, a statistical approach was followed to generate a large and significant set of gully reach configurations in order to evaluate quantitatively the influence of the main factors controlling the uncertainty of the volume assessment.
For this purpose, a simulation algorithm was written in Matlab® code, involving the following stages:
- Generation of synthetic gully area profiles with different degrees of complexity (characterized by the cross-section variability)
- Simulation of field measurements characterised by a survey intensity and the precision of the measurement method
- Quantification of the volume error uncertainty as a function of the key factors
In this communication we will present the relationships between volume error and the studied factors and propose guidelines for 2D field surveys based on the minimal survey densities required to achieve a certain accuracy given the cross-sectional variability of a gully and the measurement method applied. References: Casalí, J., Loizu, J., Campo, M.A., De Santisteban, L.M., Alvarez-Mozos, J., 2006. Accuracy of methods for field assessment of rill and ephemeral gully erosion. Catena 67, 128-138. doi:10.1016/j.catena.2006.03.005
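The three stages above can be sketched in miniature: generate a synthetic cross-sectional-area profile, "measure" it only at a given spacing, estimate the volume from the samples, and collect the relative error over many random profiles. The profile statistics, spacings and trial counts below are illustrative, not the paper's settings (and the sketch is in Python rather than the authors' Matlab®):

```python
# Monte Carlo sketch of 2D survey volume error: a synthetic per-metre
# cross-sectional-area profile is sampled every `spacing` metres, the
# reach volume is estimated from the samples, and the mean relative
# error over many random profiles is returned. All numbers illustrative.
import random

def simulate_error(spacing, length=100, variability=0.5, trials=500, seed=1):
    rng = random.Random(seed)
    errors = []
    for _ in range(trials):
        # synthetic per-metre cross-section areas (m^2): Gaussian around
        # 2.0 with the given variability, floored at 0.1 to stay positive
        areas = [max(0.1, rng.gauss(2.0, variability)) for _ in range(length)]
        true_volume = sum(areas)
        sampled = areas[::spacing]
        est_volume = sum(sampled) / len(sampled) * length
        errors.append(abs(est_volume - true_volume) / true_volume)
    return sum(errors) / len(errors)

if __name__ == "__main__":
    for spacing in (2, 5, 20):
        print(spacing, round(simulate_error(spacing), 4))
```

Even this toy version reproduces the qualitative result the experiment quantifies: the volume error grows as the survey spacing increases relative to the cross-sectional variability.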

  13. Pharmacodynamics and common drug-drug interactions of the third-generation antiepileptic drugs.

    PubMed

    Stefanović, Srđan; Janković, Slobodan M; Novaković, Milan; Milosavljević, Marko; Folić, Marko

    2018-02-01

Anticonvulsants that belong to the third generation are considered 'newer' antiepileptic drugs, including eslicarbazepine acetate, lacosamide, perampanel, brivaracetam, rufinamide and stiripentol. Areas covered: This article reviews the pharmacodynamics (i.e. mechanisms of action) and clinically relevant drug-drug interactions of the third-generation antiepileptic drugs. Expert opinion: Newer antiepileptic drugs have mechanisms of action not shared with the first- and second-generation anticonvulsants, such as inhibition of neurotransmitter release, blockade of receptors for excitatory amino acids, and new modes of sodium channel inactivation. These new mechanisms of action increase the chances of controlling forms of epilepsy resistant to older anticonvulsants. An important advantage of the third-generation anticonvulsants could be the little propensity for interactions with both antiepileptic and other drugs observed until now, making prescribing much easier and safer. However, this may change with new studies specifically designed to discover drug-drug interactions. Although the third-generation antiepileptic drugs have enlarged the therapeutic palette against epilepsy, 20-30% of patients with epilepsy are still treatment-resistant and need a new pharmacological approach. There is a great need to explore all molecular targets that may be directly or indirectly involved in the generation of seizures, so that a number of candidate compounds for even newer anticonvulsants can be generated.

  14. Quasi-isentropic compression using compressed water flow generated by underwater electrical explosion of a wire array

    NASA Astrophysics Data System (ADS)

    Gurovich, V.; Virozub, A.; Rososhek, A.; Bland, S.; Spielman, R. B.; Krasik, Ya. E.

    2018-05-01

    A major experimental research area in material equation-of-state today involves the use of off-Hugoniot measurements rather than shock experiments that give only Hugoniot data. There is a wide range of applications using quasi-isentropic compression of matter including the direct measurement of the complete isentrope of materials in a single experiment and minimizing the heating of flyer plates for high-velocity shock measurements. We propose a novel approach to generating quasi-isentropic compression of matter. Using analytical modeling and hydrodynamic simulations, we show that a working fluid composed of compressed water, generated by an underwater electrical explosion of a planar wire array, might be used to efficiently drive the quasi-isentropic compression of a copper target to pressures ˜2 × 1011 Pa without any complex target designs.

  15. Metagenome changes in the biogas producing community during anaerobic digestion of rice straw.

    PubMed

    Pore, Soham D; Shetty, Deepa; Arora, Preeti; Maheshwari, Sneha; Dhakephalkar, Prashant K

    2016-08-01

The present investigation was undertaken to study microbial community succession in sour and healthy digesters. An Ion Torrent next-generation sequencing (NGS)-based metagenomic approach indicated an abundance of hydrolytic bacteria and the exclusion of methanogens and syntrophic bacteria in the sour digester. Functional gene analysis revealed a higher abundance of enzymes involved in acidogenesis and a lower abundance of enzymes associated with methanogenesis, such as methyl-coenzyme M reductase, F420-dependent reductase and formylmethanofuran dehydrogenase, in the sour digester. An increased abundance of methanogens (Methanomicrobia) and of genes involved in methanogenesis was observed in the restored/healthy digester, highlighting the revival of a pH-sensitive methanogenic community. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Application of the whole-transcriptome shotgun sequencing approach to the study of Philadelphia-positive acute lymphoblastic leukemia

    PubMed Central

    Iacobucci, I; Ferrarini, A; Sazzini, M; Giacomelli, E; Lonetti, A; Xumerle, L; Ferrari, A; Papayannidis, C; Malerba, G; Luiselli, D; Boattini, A; Garagnani, P; Vitale, A; Soverini, S; Pane, F; Baccarani, M; Delledonne, M; Martinelli, G

    2012-01-01

Although the pathogenesis of BCR–ABL1-positive acute lymphoblastic leukemia (ALL) is mainly related to the expression of the BCR–ABL1 fusion transcript, additional cooperating genetic lesions are supposed to be involved in its development and progression. Therefore, in an attempt to investigate the complex landscape of mutations, changes in expression profiles and alternative splicing (AS) events that can be observed in such disease, the leukemia transcriptome of a BCR–ABL1-positive ALL patient at diagnosis and at relapse was sequenced using a whole-transcriptome shotgun sequencing (RNA-Seq) approach. A total of 13.9 and 15.8 million sequence reads were generated from the de novo and relapsed samples, respectively, and aligned to the human genome reference sequence. This led to the identification of five validated missense mutations in genes involved in metabolic processes (DPEP1, TMEM46), transport (MVP), cell cycle regulation (ABL1) and catalytic activity (CTSZ), two of which resulted in acquired relapse variants. In all, 6390 and 4671 putative AS events were also detected, as well as expression levels for 18 315 and 18 795 genes, 28% of which were differentially expressed in the two disease phases. These data demonstrate that RNA-Seq is a suitable approach for identifying a wide spectrum of genetic alterations potentially involved in ALL. PMID:22829256

  17. Multi-Generational Perspectives: How They Interact and Impact Service to Students and Their Families in an Age of Highly-Involved Parents

    ERIC Educational Resources Information Center

    Wawrzusin, Andrea C.

    2013-01-01

    Although there have always been differences in how generations navigate decision-making in higher education, highly involved parents have led to conflicting inter-generational educational expectations. This research study investigated the phenomenon of parental involvement and how the meanings attached to educational expectations vary across generations.…

  18. Deconvolution When Classifying Noisy Data Involving Transformations.

    PubMed

    Carroll, Raymond; Delaigle, Aurore; Hall, Peter

    2012-09-01

    In the present study, we consider the problem of classifying spatial data distorted by a linear transformation or convolution and contaminated by additive random noise. In this setting, we show that classifier performance can be improved if we carefully invert the data before the classifier is applied. However, the inverse transformation is not constructed so as to recover the original signal, and in fact, we show that taking the latter approach is generally inadvisable. We introduce a fully data-driven procedure based on cross-validation, and use several classifiers to illustrate numerical properties of our approach. Theoretical arguments are given in support of our claims. Our procedure is applied to data generated by light detection and ranging (Lidar) technology, where we improve on earlier approaches to classifying aerosols. This article has supplementary materials online.
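    The benefit of a regularized inversion before classification can be sketched in a few lines. This is a minimal illustration, not the authors' data-driven procedure: it uses a fixed ridge (Tikhonov-style) frequency-domain inverse and a nearest-centroid classifier, with the regularization parameter `lam` standing in for the quantity their cross-validation procedure would select.

```python
import numpy as np

def ridge_deconvolve(y, h, lam):
    """Ridge-regularized inversion of a circular convolution in the
    frequency domain; lam damps noise amplification where |H| is small."""
    H = np.fft.fft(h, n=len(y))
    Y = np.fft.fft(y)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft(X))

def classify(sample, centroids):
    """Nearest-centroid classifier applied to the inverted signal."""
    return int(np.argmin([np.linalg.norm(sample - c) for c in centroids]))
```

    Choosing `lam` too small amplifies noise and too large re-blurs the signal, which is why a data-driven choice of the inversion matters more than exact recovery of the original signal.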

  19. Assessing the ground vibrations produced by a heavy vehicle traversing a traffic obstacle.

    PubMed

    Ducarne, Loïc; Ainalis, Daniel; Kouroussis, Georges

    2018-01-15

    Despite advancements in alternative transport networks, road transport remains the dominant mode in many modern and developing countries. The ground-borne motions produced by the passage of a heavy vehicle over a geometric obstacle (e.g. speed hump, train tracks) pose a fundamental problem in transport annoyance in urban areas. In order to predict the ground vibrations generated by the passage of a heavy vehicle over a geometric obstacle, a two-step numerical model is developed. The first step involves simulating the dynamic loads generated by the heavy vehicle using a multibody approach, which includes the tyre-obstacle-ground interaction. The second step involves the simulation of the ground wave propagation using a three-dimensional finite element model. The simulation can be decoupled because of the large difference in stiffness between the vehicle's tyres and the road. First, the two-step model is validated using an experimental case study available in the literature. A sensitivity analysis is then presented, examining the influence of various factors on the generated ground vibrations. Factors investigated include obstacle shape, obstacle dimensions, vehicle speed, and tyre stiffness. The developed model can be used as a tool in the early planning stages to predict the ground vibrations generated by the passage of a heavy vehicle over an obstacle in urban areas. Copyright © 2017 Elsevier B.V. All rights reserved.
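    A heavily simplified stand-in for the first (multibody) step — a quarter-car model with one sprung and one unsprung mass driven over a cosine hump, integrated with semi-implicit Euler — illustrates how a dynamic tyre-road force history is produced. The masses, stiffnesses, and hump geometry below are assumptions for illustration only and are not taken from the paper.

```python
import numpy as np

def tyre_force_over_bump(v=10.0, dt=1e-4, t_end=1.0):
    """Quarter-car (sprung + unsprung mass) driven over a 0.05 m cosine
    hump starting at x = 2 m; returns the dynamic tyre-road contact
    force series, the quantity that excites ground vibration."""
    ms, mu = 4500.0, 500.0          # sprung / unsprung masses (kg), assumed
    ks, cs = 4.0e5, 2.0e4           # suspension stiffness (N/m) / damping (Ns/m)
    kt = 1.75e6                     # tyre stiffness (N/m), assumed
    zs = zu = vs = vu = 0.0         # displacements and velocities
    forces = []
    for step in range(int(t_end / dt)):
        x = v * step * dt           # longitudinal position of the wheel
        if 2.0 <= x <= 2.5:         # 0.5 m long cosine hump
            road = 0.05 * 0.5 * (1 - np.cos(2 * np.pi * (x - 2.0) / 0.5))
        else:
            road = 0.0
        f_susp = ks * (zu - zs) + cs * (vu - vs)
        f_tyre = kt * (road - zu)
        vs += dt * f_susp / ms              # semi-implicit Euler update
        vu += dt * (f_tyre - f_susp) / mu
        zs += dt * vs
        zu += dt * vu
        forces.append(f_tyre)
    return np.array(forces)
```

    The resulting force series would then drive the ground-propagation model in the second step; the decoupling mentioned in the abstract is what justifies computing it without feedback from the soil.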

  20. The Sandwich Generation Diner: Development of a Web-Based Health Intervention for Intergenerational Caregivers

    PubMed Central

    George, Nika; MacDougall, Megan

    2016-01-01

    Background Women are disproportionately likely to assist aging family members; approximately 53 million in the United States are involved with the health care of aging parents, in-laws, or other relatives. The busy schedules of “sandwich generation” women who care for older relatives require accessible and flexible health education, including Web-based approaches. Objective This paper describes the development and implementation of a Web-based health education intervention, The Sandwich Generation Diner, as a tool for intergenerational caregivers of older adults with physical and cognitive impairments. Methods We used Bartholomew’s Intervention Mapping (IM) process to develop our theory-based health education program. Bandura’s (1997) self-efficacy theory provided the overarching theoretical model. Results The Sandwich Generation Diner website features four modules that address specific health care concerns. Our research involves randomly assigning caregiver participants to one of two experimental conditions that are identical in the type of information provided, but vary significantly in the presentation. In addition to structured Web-based assessments, specific website usage data are recorded. Conclusions The Sandwich Generation Diner was developed to address some of the informational and self-efficacy needs of intergenerational female caregivers. The next step is to demonstrate that this intervention is: (1) attractive and effective with families assisting older adults, and (2) feasible to embed within routine home health services for older adults. PMID:27269632

  1. Cell-free immunology: construction and in vitro expression of a PCR-based library encoding a single-chain antibody repertoire.

    PubMed

    Makeyev, E V; Kolb, V A; Spirin, A S

    1999-02-12

    A novel cloning-independent strategy has been developed to generate a combinatorial library of PCR fragments encoding a murine single-chain antibody repertoire and express it directly in a cell-free system. The new approach provides an effective alternative to techniques that involve in vivo procedures for preparing and handling large antibody libraries. The possible use of the described strategy in ribosome display is discussed.

  2. Qualitative research: a brief description.

    PubMed

    Kemparaj, Umesh; Chavan, Sangeeta

    2013-01-01

    Qualitative research refers to a range of methodological approaches that aim to generate an in-depth and interpreted understanding of the social world by learning about people's social and material circumstances, their experiences, perspectives, and histories. It requires researchers to become intensely involved, often remaining in the field for lengthy periods of time. The greatest value of qualitative research is its ability to address questions of relevance to public health knowledge and practice that are difficult to answer satisfactorily using quantitative methods.

  3. Modeling Regional Seismic Waves from Underground Nuclear Explosion

    DTIC Science & Technology

    1989-05-15

    consider primarily the long-period tangential motions in this pilot study because less computational effort is involved compared to modeling the P-SV system...error testing can be a time-consuming endeavor but the basic approach has proven effective in previous studies (Vidale et al., 1985; Helmberger and Vidale...at various depths in a variety of basin models were generated to test the above hypothesis. When the source is situated in the sediments and when the

  4. Novel genes and mutations in patients affected by recurrent pregnancy loss.

    PubMed

    Quintero-Ronderos, Paula; Mercier, Eric; Fukuda, Michiko; González, Ronald; Suárez, Carlos Fernando; Patarroyo, Manuel Alfonso; Vaiman, Daniel; Gris, Jean-Christophe; Laissue, Paul

    2017-01-01

    Recurrent pregnancy loss is a frequently occurring human infertility-related disease affecting ~1% of women. It has been estimated that the cause remains unexplained in >50% of cases, which strongly suggests that genetic factors may contribute towards the phenotype. Concerning its molecular aetiology, numerous studies have had limited success in identifying the disease's genetic causes. This might be because hundreds of genes are involved in each physiological step necessary for guaranteeing reproductive success in mammals. In such a scenario, next-generation sequencing provides a potentially interesting tool for research into recurrent pregnancy loss causative mutations. The present study involved, for the first time, whole-exome sequencing and an innovative bioinformatics analysis in 49 unrelated women affected by recurrent pregnancy loss. We identified 27 coding variants (22 genes) potentially related to the phenotype (41% of patients). The affected genes, which were enriched in potentially deleterious sequence variants, belonged to distinct molecular cascades playing key roles in implantation/pregnancy biology. Using a quantum chemical approach, we established that mutations in the MMP-10 and FGA proteins led to substantial energetic modifications, suggesting an impact on their function and/or stability. The next-generation sequencing and bioinformatics approaches presented here represent an efficient way to find mutations, with potentially moderate/strong functional effects, associated with recurrent pregnancy loss aetiology. We consider that some of these variants (and genes) represent probable future biomarkers for recurrent pregnancy loss.

  5. Generating short-term probabilistic wind power scenarios via nonparametric forecast error density estimators

    DOE PAGES

    Staid, Andrea; Watson, Jean-Paul; Wets, Roger J.-B.; ...

    2017-07-11

    Forecasts of available wind power are critical in key electric power systems operations planning problems, including economic dispatch and unit commitment. Such forecasts are necessarily uncertain, limiting the reliability and cost effectiveness of operations planning models based on a single deterministic or “point” forecast. A common approach to address this limitation involves the use of a number of probabilistic scenarios, each specifying a possible trajectory of wind power production, with associated probability. We present and analyze a novel method for generating probabilistic wind power scenarios, leveraging available historical information in the form of forecasted and corresponding observed wind power time series. We estimate non-parametric forecast error densities, specifically using epi-spline basis functions, allowing us to capture the skewed and non-parametric nature of error densities observed in real-world data. We then describe a method to generate probabilistic scenarios from these basis functions that allows users to control for the degree to which extreme errors are captured. We compare the performance of our approach to the current state-of-the-art considering publicly available data associated with the Bonneville Power Administration, analyzing aggregate production of a number of wind farms over a large geographic region. Finally, we discuss the advantages of our approach in the context of specific power systems operations planning problems: stochastic unit commitment and economic dispatch. Here, our methodology is embodied in the joint Sandia – University of California Davis Prescient software package for assessing and analyzing stochastic operations strategies.
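    The overall pipeline can be sketched with a plain Gaussian kernel density in place of the paper's epi-spline basis functions, and with errors drawn independently per lead time and equal scenario probabilities (the real method captures more structure than this). The bandwidth, capacity, and forecast values in any usage are illustrative assumptions.

```python
import numpy as np

def sample_kde(errors, n, bandwidth, rng):
    """Draw n samples from a Gaussian-kernel density fitted to historical
    forecast errors: resample a historical error, then jitter it with
    kernel-width noise (this is exactly sampling from the KDE)."""
    picks = rng.choice(errors, size=n)
    return picks + rng.normal(0.0, bandwidth, size=n)

def generate_scenarios(point_forecast, errors, n_scenarios, capacity, bandwidth, rng):
    """Each scenario adds independently sampled forecast errors to the
    point forecast, clipped to the feasible power range [0, capacity]."""
    point_forecast = np.asarray(point_forecast, dtype=float)
    scen = np.empty((n_scenarios, len(point_forecast)))
    for i in range(n_scenarios):
        noise = sample_kde(errors, len(point_forecast), bandwidth, rng)
        scen[i] = np.clip(point_forecast + noise, 0.0, capacity)
    return scen
```

    Widening or narrowing the kernel bandwidth plays the role of the paper's control over how strongly extreme errors are represented in the scenario set.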

  7. An Experimental Approach to Controllably Vary Protein Oxidation While Minimizing Electrode Adsorption for Boron-Doped Diamond Electrochemical Surface Mapping Applications

    PubMed Central

    McClintock, Carlee S; Hettich, Robert L.

    2012-01-01

    Oxidative protein surface mapping has become a powerful approach for measuring the solvent accessibility of folded protein structures. A variety of techniques exist for generating the key reagent – hydroxyl radicals – for these measurements; however, these approaches range significantly in their complexity and expense of operation. This research expands upon earlier work to enhance the controllability of boron-doped diamond (BDD) electrochemistry as an easily accessible tool for producing hydroxyl radicals in order to oxidize a range of intact proteins. Efforts to modulate oxidation level while minimizing the adsorption of protein to the electrode involved the use of relatively high flow rates to reduce protein residence time inside the electrochemical flow chamber. Additionally, a different cell activation approach using variable voltage to supply a controlled current allowed us to precisely tune the extent of oxidation in a protein-dependent manner. In order to gain perspective on the level of protein adsorption onto the electrode surface, studies were conducted to monitor protein concentration during electrolysis and gauge changes in the electrode surface between cell activation events. This report demonstrates the successful use of BDD electrochemistry for greater precision in generating a target number of oxidation events upon intact proteins. PMID:23210708

  8. Automatic image database generation from CAD for 3D object recognition

    NASA Astrophysics Data System (ADS)

    Sardana, Harish K.; Daemi, Mohammad F.; Ibrahim, Mohammad K.

    1993-06-01

    The development and evaluation of multiple-view 3-D object recognition systems is based on a large set of model images. Due to the various advantages of using CAD, it is becoming increasingly practical to use existing CAD data in computer vision systems. Current PC-level CAD systems are capable of providing physical image modelling and rendering involving positional variations in cameras, light sources, etc. We have formulated a modular scheme for automatic generation of various aspects (views) of the objects in a model-based 3-D object recognition system. These views are generated at desired orientations on the unit Gaussian sphere. With a suitable network file sharing system (NFS), the images can be stored directly in a database located on a file server. This paper presents the image modelling solutions using CAD in relation to the multiple-view approach. Our modular scheme for data conversion and automatic image database storage for such a system is discussed. We have used this approach in 3-D polyhedron recognition. An overview of the results, the advantages and limitations of using CAD data, and conclusions from using such a scheme are also presented.
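    One common recipe for placing roughly uniform viewpoints on the unit (Gaussian) sphere — not necessarily the tessellation the authors used — is the Fibonacci lattice; each point serves as a candidate camera direction for rendering one aspect of the model:

```python
import math

def fibonacci_sphere(n):
    """Return n approximately uniform points on the unit sphere using the
    Fibonacci lattice: evenly spaced z slices, golden-angle azimuths."""
    pts = []
    golden = math.pi * (3.0 - math.sqrt(5.0))  # golden angle in radians
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n
        r = math.sqrt(1.0 - z * z)
        theta = golden * i
        pts.append((r * math.cos(theta), r * math.sin(theta), z))
    return pts
```

    Rendering the CAD model from each direction (plus any desired camera-roll and lighting variations) yields the image database automatically.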

  9. Generation of embryos directly from embryonic stem cells by tetraploid embryo complementation reveals a role for GATA factors in organogenesis.

    PubMed

    Duncan, S A

    2005-12-01

    Gene targeting in ES (embryonic stem) cells has been used extensively to study the role of proteins during embryonic development. In the traditional procedure, this requires the generation of chimaeric mice by introducing ES cells into blastocysts and allowing them to develop to term. Once chimaeric mice are produced, they are bred into a recipient mouse strain to establish germline transmission of the allele of interest. Although this approach has been used very successfully, the breeding cycles involved are time consuming. In addition, genes that are essential for organogenesis often have roles in the formation of extra-embryonic tissues that are essential for early stages of post-implantation development. For example, mice lacking the GATA transcription factors, GATA4 or GATA6, arrest during gastrulation due to an essential role for these factors in differentiation of extra-embryonic endoderm. This lethality has frustrated the study of these factors during the development of organs such as the liver and heart. Extraembryonic defects can, however, be circumvented by generating clonal mouse embryos directly from ES cells by tetraploid complementation. Here, we describe the usefulness and efficacy of this approach using GATA factors as an example.

  10. Random vs. Combinatorial Methods for Discrete Event Simulation of a Grid Computer Network

    NASA Technical Reports Server (NTRS)

    Kuhn, D. Richard; Kacker, Raghu; Lei, Yu

    2010-01-01

    This study compared random and t-way combinatorial inputs to a network simulator, to determine if these two approaches produce significantly different deadlock detection for varying network configurations. Modeling deadlock detection is important for analyzing configuration changes that could inadvertently degrade network operations, or to determine modifications that could be made by attackers to deliberately induce deadlock. Discrete event simulation of a network may be conducted using random generation of inputs. In this study, we compare random with combinatorial generation of inputs. Combinatorial (or t-way) testing requires every combination of any t parameter values to be covered by at least one test. Combinatorial methods can be highly effective because empirical data suggest that nearly all failures involve the interaction of a small number of parameters (1 to 6). Thus, for example, if all deadlocks involve at most 5-way interactions between n parameters, then exhaustive testing of all n-way interactions adds no additional information that would not be obtained by testing all 5-way interactions. While the maximum degree of interaction between parameters involved in the deadlocks clearly cannot be known in advance, covering all t-way interactions may be more efficient than using random generation of inputs. In this study we tested this hypothesis for t = 2, 3, and 4 for deadlock detection in a network simulation. Achieving the same degree of coverage provided by 4-way tests would have required approximately 3.2 times as many random tests; thus combinatorial methods were more efficient for detecting deadlocks involving a higher degree of interactions. The paper reviews explanations for these results and implications for modeling and simulation.
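    The notion of 2-way (pairwise) coverage behind these comparisons can be made concrete with a small helper. For three binary parameters, a 4-row array can cover all 12 two-way interactions that the 8 exhaustive tests cover — the kind of saving combinatorial generation exploits. This is a generic illustration, not the authors' tooling:

```python
from itertools import combinations, product

def pair_coverage(tests):
    """Set of 2-way interactions ((param_i, value_i), (param_j, value_j))
    covered by a list of test vectors."""
    covered = set()
    for t in tests:
        for (i, vi), (j, vj) in combinations(enumerate(t), 2):
            covered.add(((i, vi), (j, vj)))
    return covered

def total_pairs(domains):
    """All 2-way interactions implied by the per-parameter value domains."""
    need = set()
    for (i, di), (j, dj) in combinations(enumerate(domains), 2):
        for vi, vj in product(di, dj):
            need.add(((i, vi), (j, vj)))
    return need
```

    For domains [[0, 1], [0, 1], [0, 1]], the four tests (0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0) already achieve pair_coverage equal to total_pairs, whereas randomly chosen suites of the same size typically leave some pairs uncovered.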

  11. The Essential Elements of a Risk Governance Framework for Current and Future Nanotechnologies.

    PubMed

    Stone, Vicki; Führ, Martin; Feindt, Peter H; Bouwmeester, Hans; Linkov, Igor; Sabella, Stefania; Murphy, Finbarr; Bizer, Kilian; Tran, Lang; Ågerstrand, Marlene; Fito, Carlos; Andersen, Torben; Anderson, Diana; Bergamaschi, Enrico; Cherrie, John W; Cowan, Sue; Dalemcourt, Jean-Francois; Faure, Michael; Gabbert, Silke; Gajewicz, Agnieszka; Fernandes, Teresa F; Hristozov, Danail; Johnston, Helinor J; Lansdown, Terry C; Linder, Stefan; Marvin, Hans J P; Mullins, Martin; Purnhagen, Kai; Puzyn, Tomasz; Sanchez Jimenez, Araceli; Scott-Fordsmand, Janeck J; Streftaris, George; van Tongeren, Martie; Voelcker, Nicolas H; Voyiatzis, George; Yannopoulos, Spyros N; Poortvliet, P Marijn

    2017-12-14

    Societies worldwide are investing considerable resources into the safe development and use of nanomaterials. Although each of these protective efforts is crucial for governing the risks of nanomaterials, they are insufficient in isolation. What is missing is a more integrative governance approach that goes beyond legislation. Development of this approach must be evidence based and involve key stakeholders to ensure acceptance by end users. The challenge is to develop a framework that coordinates the variety of actors involved in nanotechnology and civil society to facilitate consideration of the complex issues that occur in this rapidly evolving research and development area. Here, we propose three sets of essential elements required to generate an effective risk governance framework for nanomaterials. (1) Advanced tools to facilitate risk-based decision making, including an assessment of the needs of users regarding risk assessment, mitigation, and transfer. (2) An integrated model of predicted human behavior and decision making concerning nanomaterial risks. (3) Legal and other (nano-specific and general) regulatory requirements to ensure compliance and to stimulate proactive approaches to safety. The implementation of such an approach should facilitate and motivate good practice for the various stakeholders to allow the safe and sustainable future development of nanotechnology. © 2017 Society for Risk Analysis.

  12. Robotics-inspired biology.

    PubMed

    Gravish, Nick; Lauder, George V

    2018-03-29

    For centuries, designers and engineers have looked to biology for inspiration. Biologically inspired robots are just one example of the application of knowledge of the natural world to engineering problems. However, recent work by biologists and interdisciplinary teams have flipped this approach, using robots and physical models to set the course for experiments on biological systems and to generate new hypotheses for biological research. We call this approach robotics-inspired biology; it involves performing experiments on robotic systems aimed at the discovery of new biological phenomena or generation of new hypotheses about how organisms function that can then be tested on living organisms. This new and exciting direction has emerged from the extensive use of physical models by biologists and is already making significant advances in the areas of biomechanics, locomotion, neuromechanics and sensorimotor control. Here, we provide an introduction and overview of robotics-inspired biology, describe two case studies and suggest several directions for the future of this exciting new research area. © 2018. Published by The Company of Biologists Ltd.

  13. Eco-innovative design approach: Integrating quality and environmental aspects in prioritizing and solving engineering problems

    NASA Astrophysics Data System (ADS)

    Chakroun, Mahmoud; Gogu, Grigore; Pacaud, Thomas; Thirion, François

    2014-09-01

    This study proposes an eco-innovative design process taking into consideration quality and environmental aspects in prioritizing and solving technical engineering problems. This approach provides a synergy between Life Cycle Assessment (LCA), the non-quality matrix, the Theory of Inventive Problem Solving (TRIZ), morphological analysis and the Analytical Hierarchy Process (AHP). In this sequence of tools, LCA assesses the environmental impacts generated by the system. Then, for a better consideration of environmental aspects, a new tool is developed, the non-quality matrix, which defines the problem to be solved first from an environmental point of view. The TRIZ method allows the generation of new concepts and contradiction resolution. Then, the morphological analysis offers the possibility of extending the search space of solutions in a design problem in a systematic way. Finally, the AHP identifies the promising solution(s) by providing a clear logic for the choice made. Their usefulness has been demonstrated through their application to a case study involving a centrifugal spreader with spinning discs.
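    Of the tools combined here, the AHP step is the easiest to make concrete: priorities are the normalized principal eigenvector of a pairwise-comparison matrix. A minimal sketch of standard AHP prioritization, not code from the study:

```python
import numpy as np

def ahp_weights(M):
    """Priority vector of an AHP pairwise-comparison matrix: principal
    eigenvector, made positive and normalized to sum to 1."""
    vals, vecs = np.linalg.eig(np.asarray(M, dtype=float))
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    v = np.abs(v)
    return v / v.sum()
```

    For a perfectly consistent matrix M[i][j] = w[i] / w[j], the recovered priorities equal w exactly; inconsistency in real expert judgments shows up as a principal eigenvalue larger than the matrix dimension.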

  14. Estimation of dimensions and orientation of multiple riverine dune generations using spectral moments

    NASA Astrophysics Data System (ADS)

    Lisimenka, Aliaksandr; Kubicki, Adam

    2017-02-01

    A new spectral analysis technique is proposed for rhythmic bedform quantification, based on the 2D Fourier transform involving the calculation of a set of low-order spectral moments. The approach provides a tool for efficient quantification of bedform length and height as well as spatial crest-line alignment. Contrary to the conventional method, it not only describes the most energetic component of an undulating seabed surface but also retrieves information on its secondary structure without application of any band-pass filter of which the upper and lower cut-off frequencies are a priori unknown. Validation is based on bathymetric data collected in the main Vistula River mouth area (Przekop Wisły), Poland. This revealed two generations (distinct groups) of dunes which are migrating seawards along distinct paths, probably related to the hydrological regime of the river. The data enable the identification of dune divergence and convergence zones. The approach proved successful in the parameterisation of topographic roughness, an essential aspect in numerical modelling studies.
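    The core of the technique — low-order moments of the 2D power spectrum, restricted to a half-plane so that conjugate-symmetric wavenumbers do not cancel — can be sketched as follows. The grid spacing and synthetic usage are assumptions for illustration, and only the first-moment estimates of length and orientation are shown:

```python
import numpy as np

def dominant_bedform(z, dx=1.0):
    """Estimate the dominant bedform wavelength and crest-normal
    orientation (degrees) from energy-weighted first spectral moments of
    the 2D power spectrum, evaluated over the kx > 0 half-plane."""
    P = np.abs(np.fft.fft2(z - z.mean())) ** 2
    ny, nx = z.shape
    KX, KY = np.meshgrid(np.fft.fftfreq(nx, d=dx), np.fft.fftfreq(ny, d=dx))
    half = KX > 0                      # one half-plane: avoid +k/-k cancelling
    m0 = P[half].sum()                 # zeroth moment: total spectral energy
    kx_bar = (KX[half] * P[half]).sum() / m0
    ky_bar = (KY[half] * P[half]).sum() / m0
    wavelength = 1.0 / np.hypot(kx_bar, ky_bar)
    orientation = np.degrees(np.arctan2(ky_bar, kx_bar))
    return wavelength, orientation
```

    On a synthetic surface of straight dunes with an 8-cell spacing aligned with the y-axis, the estimate returns a wavelength of 8 grid cells and an orientation of 0°; separating two dune generations would additionally require moments computed over distinct spectral regions, as in the paper.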

  15. [Health care innovation from a territorial perspective: a call for a new approach].

    PubMed

    Costa, Laís Silveira; Gadelha, Carlos Augusto Grabois; Maldonado, José

    2012-12-01

    Innovation plays an increasingly important role in health care, partly because it is responsible for a significant share of national investment in research and development, and partly because of its industrial and service provision base, which provides a conduit to future technology. The relationship between health care and development is also strengthened as a result of the leading role of health care in generating innovation. Nevertheless, Brazil's health care production base is persistently weak, hindering both universal provision of health care services and international competitiveness. This article, based on the theoretical framework of Political Economy and innovation systems, has sought to identify variables in subnational contexts that influence the dynamic of innovation generation in health care. To this end, the theoretical approach used rests on the assumption that innovation is a contextualized social process and that the production base in healthcare will remain weak if new variables involved in the dynamic of innovation are not taken into account.

  16. Multi-thresholds for fault isolation in the presence of uncertainties.

    PubMed

    Touati, Youcef; Mellal, Mohamed Arezki; Benazzouz, Djamel

    2016-05-01

    Monitoring of faults is an important task in mechatronics. It involves the detection and isolation of faults, which are performed by using residuals. These residuals are numerical values that define certain intervals called thresholds; a fault is detected when the residuals exceed the thresholds. In addition, each considered fault must activate a unique set of residuals in order to be isolated. However, in the presence of uncertainties, false decisions can occur due to the low sensitivity of certain residuals to faults. In this paper, an efficient approach for making fault-isolation decisions in the presence of uncertainties is proposed. Based on the bond graph tool, the approach is developed to generate systematically the relations between residuals and faults. The generated relations allow the estimation of the minimum detectable and isolable fault values, which are then used to calculate the isolation thresholds for each residual. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
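    The isolation logic itself — threshold each residual, then match the resulting boolean activation pattern against a fault signature table — can be sketched as follows; the fault names and signature values are hypothetical, and deriving the signatures and thresholds (here given) is the bond-graph part of the paper's contribution:

```python
def isolate_fault(residuals, thresholds, signatures):
    """Threshold each residual and match the activation pattern against
    the fault signature table; returns the fault name when the pattern
    matches exactly one signature, otherwise None."""
    pattern = tuple(abs(r) > t for r, t in zip(residuals, thresholds))
    matches = [f for f, sig in signatures.items() if tuple(sig) == pattern]
    return matches[0] if len(matches) == 1 else None
```

    Uncertainty enters through the thresholds: set them too low and noise activates residuals spuriously; too high and a weakly sensitive residual never fires, which is why thresholds tied to minimum isolable fault values matter.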

  17. The hypothalamic slice approach to neuroendocrinology.

    PubMed

    Hatton, G I

    1983-07-01

    The magnocellular peptidergic cells of the supraoptic and paraventricular nuclei comprise much of what is known as the hypothalamo-neurohypophysial system and is involved in several functions, including body fluid balance, parturition and lactation. While we have learned much from experiments in vivo, they have not produced a clear understanding of some of the crucial features associated with the functioning of this system. In particular, questions relating to the osmosensitivity of magnocellular neurones and the mechanism(s) by which their characteristic firing patterns are generated have not been answered using the older approaches. Electrophysiological studies with brain slices present direct evidence for osmosensitivity, and perhaps even osmoreceptivity, of magnocellular neurones. Other evidence indicates that the phasic bursting patterns of activity associated with vasopressin-releasing neurones (a) occur in the absence of patterned chemical synaptic input, (b) may be modulated by electrotonic conduction across gap junctions connecting magnocellular neurones and (c) are likely to be generated by endogenous membrane currents. These results make untenable the formerly held idea that phasic bursting activity is dependent upon recurrent synaptic inhibition.

  18. A transdisciplinary approach for supporting the integration of ecosystem services into land and water management

    NASA Astrophysics Data System (ADS)

    Fatt Siew, Tuck; Döll, Petra

    2015-04-01

    Transdisciplinary approaches are useful for supporting integrated land and water management. However, the implementation of the approach in practice to facilitate the co-production of useable socio-hydrological (and -ecological) knowledge among scientists and stakeholders is challenging. It requires appropriate methods to bring individuals with diverse interests and needs together and to integrate their knowledge for generating shared perspectives/understanding, identifying common goals, and developing actionable management strategies. The approach and the methods need, particularly, to be adapted to the local political and socio-cultural conditions. To demonstrate how knowledge co-production and integration can be done in practice, we present a transdisciplinary approach which has been implemented and adapted for supporting land and water management that takes ecosystem services into account in an arid region in northwestern China. Our approach comprises three steps: (1) stakeholder analysis and interdisciplinary knowledge integration, (2) elicitation of perspectives of scientists and stakeholders, scenario development, and identification of management strategies, and (3) evaluation of knowledge integration and social learning. Our adapted approach has enabled interdisciplinary and cross-sectoral communication among scientists and stakeholders. Furthermore, the application of a combination of participatory methods, including actor modeling, Bayesian Network modeling, and participatory scenario development, has contributed to the integration of system, target, and transformation knowledge of involved stakeholders. Whether the identified management strategies will be realized is unknown, because other important and representative decision makers have not been involved in the transdisciplinary research process. The contribution of our transdisciplinary approach to social learning still needs to be assessed.

  19. Posterior versus frontal theta activity indexes approach motivation during affective autobiographical memories.

    PubMed

    Walden, K; Pornpattananangkul, N; Curlee, A; McAdams, D P; Nusslock, R

    2015-03-01

    Research has recently identified a promising neurophysiological marker of approach motivation involving posterior versus frontal (Pz - Fz) electroencephalographic (EEG) theta activity [PFTA; Wacker, Chavanon, & Stemmler, Journal of Personality and Social Psychology 91:171-187, 2006]. Preliminary evidence indicated that PFTA is modulated by dopaminergic activity, thought to underlie appetitive tendencies, and that it indexes self-reported behavioral activation system (BAS) sensitivity. To date, research has largely relied on resting indices of PFTA and has yet to examine the relationship between PFTA and specific approach-related affective states generated by emotionally salient laboratory tasks. Accordingly, the present study evaluated PFTA both at rest and during an ecologically valid autobiographical memory task in which participants recalled personal life experiences involving a goal-striving, an anxious-apprehension, a low-point (i.e., difficult), and a neutral memory while EEG data were recorded. In line with predictions, elevated PFTA was observed during both goal-striving and anxious-apprehension autobiographical memories. PFTA was particularly elevated during anxious-apprehension memories coded as being high on approach-related tendencies. Elevated PFTA during anxious apprehension is consistent with a growing literature indicating that anxious apprehension is associated with elevated approach- and reward-related brain function. Lastly, elevated resting PFTA was positively correlated with self-reported trait anger, a negatively valenced emotion characterized by approach-related tendencies. These results have implications for (a) enhancing our understanding of the neurophysiology of approach-related emotions, (b) establishing PFTA as an index of appetitive motivational states, and (c) clarifying our understanding of the neurophysiology and approach-related tendencies associated with both anxious apprehension and anger.
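    The PFTA index can be illustrated numerically. The sketch below is a minimal demonstration, not the authors' pipeline: it estimates theta-band (4-8 Hz) power at a posterior (Pz) and a frontal (Fz) channel from raw samples via an FFT periodogram and takes the posterior-minus-frontal difference; the synthetic channel data, sampling rate, and band limits are illustrative assumptions.

    ```python
    import numpy as np

    def theta_power(signal, fs, band=(4.0, 8.0)):
        """Mean periodogram power of `signal` within the theta band (Hz)."""
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].mean()

    def pfta(pz, fz, fs):
        """Posterior-versus-frontal theta asymmetry: theta power at Pz minus Fz."""
        return theta_power(pz, fs) - theta_power(fz, fs)

    # Synthetic demo: Pz carries a stronger 6 Hz (theta) component than Fz.
    fs = 256.0
    t = np.arange(0, 4.0, 1.0 / fs)
    rng = np.random.default_rng(0)
    pz = 2.0 * np.sin(2 * np.pi * 6.0 * t) + rng.normal(0, 0.5, t.size)
    fz = 0.5 * np.sin(2 * np.pi * 6.0 * t) + rng.normal(0, 0.5, t.size)
    print(pfta(pz, fz, fs) > 0)  # posterior theta dominates, so PFTA is positive
    ```

    In practice, an averaged spectral estimate (e.g. Welch's method) over artifact-rejected epochs would replace the single periodogram used here.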

  20. Stratway: A Modular Approach to Strategic Conflict Resolution

    NASA Technical Reports Server (NTRS)

    Hagen, George E.; Butler, Ricky W.; Maddalon, Jeffrey M.

    2011-01-01

    In this paper we introduce Stratway, a modular approach to finding long-term strategic resolutions to conflicts between aircraft. The modular approach provides both advantages and disadvantages. Our primary concern is to investigate the implications for the verification of safety-critical properties of a strategic resolution algorithm. By partitioning the problem into verifiable modules, much stronger verification claims can be established. Since strategic resolution involves searching for solutions over an enormous state space, Stratway, like most similar algorithms, searches these spaces by applying heuristics, which present especially difficult verification challenges. An advantage of a modular approach is that it makes a clear distinction between the resolution function and the trajectory generation function. This allows the resolution computation to be independent of any particular vehicle. The Stratway algorithm was developed in both Java and C++ and is available under an open-source license. Additionally, there is a visualization application that is helpful for analyzing and quickly creating conflict scenarios.
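    The separation between the resolution function and the trajectory generation function described above can be sketched with a small interface. The protocol, toy vehicle model, separation criterion, and heuristic heading sweep below are illustrative assumptions, not Stratway's actual API: the point is only that the resolver never sees vehicle-specific dynamics.

    ```python
    import math
    from typing import List, Protocol, Tuple

    State = Tuple[float, float, float]  # (x, y, altitude): a simplified vehicle state

    class TrajectoryGenerator(Protocol):
        """Vehicle-specific module: turns a heading command into a trajectory."""
        def generate(self, start: State, heading_deg: float) -> List[State]: ...

    class StraightLineGenerator:
        """Minimal generator for a generic vehicle: constant heading, unit steps."""
        def generate(self, start: State, heading_deg: float) -> List[State]:
            x, y, alt = start
            rad = math.radians(heading_deg)
            return [(x + i * math.cos(rad), y + i * math.sin(rad), alt)
                    for i in range(5)]

    def resolve_conflict(ownship: State, intruder: State, gen: TrajectoryGenerator):
        """Resolution module: a heuristic sweep over candidate headings, returning
        the first trajectory that keeps horizontal separation from the intruder."""
        def is_clear(traj: List[State]) -> bool:
            return all((p[0] - intruder[0]) ** 2 + (p[1] - intruder[1]) ** 2 > 4.0
                       for p in traj)
        for heading in range(0, 360, 15):
            traj = gen.generate(ownship, heading)
            if is_clear(traj):
                return heading, traj
        return None, []

    heading, traj = resolve_conflict((0.0, 0.0, 100.0), (3.0, 0.0, 100.0),
                                     StraightLineGenerator())
    print(heading)  # first heading in the sweep that clears the intruder
    ```

    Because `resolve_conflict` depends only on the `TrajectoryGenerator` protocol, a different vehicle model can be swapped in without touching the resolution logic, which is the verification benefit the abstract emphasizes.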

  1. "Why not stoichiometry" versus "stoichiometry--why not?" Part I: General context.

    PubMed

    Michałowska-Kaczmarczyk, Anna Maria; Asuero, Agustin G; Michałowski, Tadeusz

    2015-01-01

    The elementary concepts involved with stoichiometry are considered from different viewpoints. Some examples of approximate calculations made according to the stoichiometric scheme are indicated, and correct resolution of the problems involved is presented. The principles of balancing chemical equations, based on their apparent similarities with algebraic equations, are criticized. The review concerns some peculiarities inherent in chemical reaction notation and its use (and abuse) in stoichiometric calculations that provide inconsistent results for various reasons. This "conventional" approach to stoichiometry is put in context with the generalized approach to electrolytic systems (GATES) established by Michałowski. The article contains a number of proposals that could potentially be taken into account and included in the next edition of the Orange Book. The notation of ions used in this article deliberately does not follow current IUPAC recommendations. This article is intended to be provocative, in the hope that some critical debate around the important topics treated will be generated and creatively expanded in the scientific community.

  2. Deciphering Epithelial–Mesenchymal Transition Regulatory Networks in Cancer through Computational Approaches

    PubMed Central

    Burger, Gerhard A.; Danen, Erik H. J.; Beltman, Joost B.

    2017-01-01

    Epithelial–mesenchymal transition (EMT), the process by which epithelial cells can convert into motile mesenchymal cells, plays an important role in development and wound healing but is also involved in cancer progression. It is increasingly recognized that EMT is a dynamic process involving multiple intermediate or “hybrid” phenotypes rather than an “all-or-none” process. However, the role of EMT in various cancer hallmarks, including metastasis, is debated. Given the complexity of EMT regulation, computational modeling has proven to be an invaluable tool for cancer research, e.g., to resolve apparent conflicts in experimental data and to guide experiments by generating testable hypotheses. In this review, we provide an overview of computational modeling efforts that have been applied to regulation of EMT in the context of cancer progression and its associated tumor characteristics. Moreover, we identify possibilities to bridge different modeling approaches and point out outstanding questions in which computational modeling can contribute to advance our understanding of pathological EMT. PMID:28824874

  3. A standard based approach for biomedical knowledge representation.

    PubMed

    Farkash, Ariel; Neuvirth, Hani; Goldschmidt, Yaara; Conti, Costanza; Rizzi, Federica; Bianchi, Stefano; Salvi, Erika; Cusi, Daniele; Shabo, Amnon

    2011-01-01

    The new generation of health information standards, where the syntax and semantics of the content is explicitly formalized, allows for interoperability in healthcare scenarios and analysis in clinical research settings. Studies involving clinical and genomic data include accumulating knowledge as relationships between genotypic and phenotypic information as well as associations within the genomic and clinical worlds. Some involve analysis results targeted at a specific disease; others are of a predictive nature specific to a patient and may be used by decision support applications. Representing knowledge is as important as representing data since data is more useful when coupled with relevant knowledge. Any further analysis and cross-research collaboration would benefit from persisting knowledge and data in a unified way. This paper describes a methodology used in Hypergenes, an EC FP7 project targeting Essential Hypertension, which captures data and knowledge using standards such as HL7 CDA and Clinical Genomics, aligned with the CEN EHR 13606 specification. We demonstrate the benefits of such an approach for clinical research as well as in healthcare oriented scenarios.

  4. Contingency Planning for Planetary Rovers

    NASA Technical Reports Server (NTRS)

    Dearden, Richard; Meuleau, Nicolas; Ramakrishnan, Sailesh; Smith, David; Washington, Rich; Clancy, Daniel (Technical Monitor)

    2002-01-01

    There has been considerable work in AI on planning under uncertainty, but this work generally assumes an extremely simple model of action that does not consider continuous time and resources. These assumptions are not reasonable for a Mars rover, which must cope with uncertainty about the duration of tasks, the power required, and the data storage necessary, along with its position and orientation. In this paper, we outline an approach to generating contingency plans when the sources of uncertainty involve continuous quantities such as time and resources. The approach involves first constructing a "seed" plan, and then incrementally adding contingent branches to this plan in order to improve utility. The challenge is to figure out the best places to insert contingency branches. This requires an estimate of how much utility could be gained by building a contingent branch at any given place in the seed plan. Computing this utility exactly is intractable, but we outline an approximation method that back-propagates utility distributions through a graph structure similar to that of a plan graph.

  5. Incremental Contingency Planning

    NASA Technical Reports Server (NTRS)

    Dearden, Richard; Meuleau, Nicolas; Ramakrishnan, Sailesh; Smith, David E.; Washington, Rich

    2003-01-01

    There has been considerable work in AI on planning under uncertainty. However, this work generally assumes an extremely simple model of action that does not consider continuous time and resources. These assumptions are not reasonable for a Mars rover, which must cope with uncertainty about the duration of tasks, the energy required, the data storage necessary, and its current position and orientation. In this paper, we outline an approach to generating contingency plans when the sources of uncertainty involve continuous quantities such as time and resources. The approach involves first constructing a "seed" plan, and then incrementally adding contingent branches to this plan in order to improve utility. The challenge is to figure out the best places to insert contingency branches. This requires an estimate of how much utility could be gained by building a contingent branch at any given place in the seed plan. Computing this utility exactly is intractable, but we outline an approximation method that back propagates utility distributions through a graph structure similar to that of a plan graph.
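    The quantity driving branch insertion in these two papers is the expected utility gained by adding a contingent branch under continuous uncertainty. A hedged sketch of that estimate follows: the rover task, the utility values, and the uniform duration distribution are invented for illustration (the papers' actual method back-propagates full utility distributions through a plan-graph-like structure rather than sampling).

    ```python
    import random

    def seed_utility(duration):
        """Seed plan: full science return if the drive fits the power budget."""
        return 10.0 if duration <= 8.0 else 0.0

    def branch_utility(duration):
        """With a contingency branch: a long drive triggers a cheaper fallback
        observation instead of outright plan failure."""
        return 10.0 if duration <= 8.0 else 4.0

    def expected_gain(n=100_000, seed=1):
        """Monte Carlo estimate of the utility gained by inserting the branch
        under an uncertain (here: uniform) drive duration."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n):
            d = rng.uniform(4.0, 12.0)        # uncertain continuous duration
            total += branch_utility(d) - seed_utility(d)
        return total / n

    print(round(expected_gain(), 2))          # close to the analytic value 2.0
    ```

    Here the overrun probability is 0.5 and the branch recovers 4 units when it fires, so the analytic gain is 2.0; comparing such estimates across candidate insertion points is what identifies the best place for a branch.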

  6. Facile synthesis of the Ti3+ self-doped TiO2-graphene nanosheet composites with enhanced photocatalysis.

    PubMed

    Qiu, Bocheng; Zhou, Yi; Ma, Yunfei; Yang, Xiaolong; Sheng, Weiqin; Xing, Mingyang; Zhang, Jinlong

    2015-02-26

    This study developed a facile approach for preparing a Ti(3+) self-doped TiO2-graphene photocatalyst by a one-step vacuum activation technique at relatively low temperature; the resulting catalyst can be activated by visible light owing to the synergistic effects of Ti(3+) doping, the generation of new intersurface bonds, and graphene oxide reduction. Compared with traditional methods, the vacuum activation requires only low temperature and low cost, and it achieves the reduction of GO, the self-doping of Ti(3+) in TiO2, and the loading of TiO2 nanoparticles on the GR surface at the same time. The resulting TiO2-graphene composites show a high photodegradation rate of MO, high hydrogen-evolution activity, and excellent IPCE under visible-light irradiation. The facile vacuum activation method can provide an effective and practical approach to improving the performance of TiO2-graphene and other metal oxide-graphene composites for practical photocatalytic applications.

  7. Facile synthesis of the Ti3+ self-doped TiO2-graphene nanosheet composites with enhanced photocatalysis

    NASA Astrophysics Data System (ADS)

    Qiu, Bocheng; Zhou, Yi; Ma, Yunfei; Yang, Xiaolong; Sheng, Weiqin; Xing, Mingyang; Zhang, Jinlong

    2015-02-01

    This study developed a facile approach for preparing a Ti3+ self-doped TiO2-graphene photocatalyst by a one-step vacuum activation technique at relatively low temperature; the resulting catalyst can be activated by visible light owing to the synergistic effects of Ti3+ doping, the generation of new intersurface bonds, and graphene oxide reduction. Compared with traditional methods, the vacuum activation requires only low temperature and low cost, and it achieves the reduction of GO, the self-doping of Ti3+ in TiO2, and the loading of TiO2 nanoparticles on the GR surface at the same time. The resulting TiO2-graphene composites show a high photodegradation rate of MO, high hydrogen-evolution activity, and excellent IPCE under visible-light irradiation. The facile vacuum activation method can provide an effective and practical approach to improving the performance of TiO2-graphene and other metal oxide-graphene composites for practical photocatalytic applications.

  8. Citizen Science to Support Community-based Flood Early Warning and Resilience Building

    NASA Astrophysics Data System (ADS)

    Paul, J. D.; Buytaert, W.; Allen, S.; Ballesteros-Cánovas, J. A.; Bhusal, J.; Cieslik, K.; Clark, J.; Dewulf, A.; Dhital, M. R.; Hannah, D. M.; Liu, W.; Nayaval, J. L.; Schiller, A.; Smith, P. J.; Stoffel, M.; Supper, R.

    2017-12-01

    In Disaster Risk Management, an emerging shift has been noted from broad-scale, top-down assessments towards more participatory, community-based, bottom-up approaches. Combined with technologies for robust and low-cost sensor networks, a citizen science approach has recently emerged as a promising direction in the provision of extensive, real-time information for flood early warning systems. Here we present the framework and initial results of a major new international project, Landslide EVO, aimed at increasing local resilience against hydrologically induced disasters in western Nepal by exploiting participatory approaches to knowledge generation and risk governance. We identify three major technological developments that strongly support our approach to flood early warning and resilience building in Nepal. First, distributed sensor networks, participatory monitoring, and citizen science hold great promise in complementing official monitoring networks and remote sensing by generating site-specific information with local buy-in, especially in data-scarce regions. Secondly, the emergence of open source, cloud-based risk analysis platforms supports the construction of a modular, distributed, and potentially decentralised data processing workflow. Finally, linking data analysis platforms to social computer networks and ICT (e.g. mobile phones, tablets) allows tailored interfaces and people-centred decision- and policy-support systems to be built. Our proposition is that maximum impact is created if end-users are involved not only in data collection, but also over the entire project life-cycle, including the analysis and provision of results. In this context, citizen science complements more traditional knowledge generation practices, and also enhances multi-directional information provision, risk management, early-warning systems and local resilience building.

  9. High throughput generation and characterization of replication-competent clade C transmitted/founder simian-human immunodeficiency viruses

    PubMed Central

    Dutta, Debashis; Johnson, Samuel; Dalal, Alisha; Deymier, Martin J.; Hunter, Eric

    2018-01-01

    Traditional restriction endonuclease-based cloning has been routinely used to generate replication-competent simian-human immunodeficiency viruses (SHIV) and simian-tropic HIV (stHIV). This approach requires the existence of suitable restriction sites or the introduction of nucleotide changes to create them. Here, using an In-Fusion cloning technique that involves homologous recombination, we generated SHIVs and stHIVs based on epidemiologically linked clade C transmitted/founder HIV molecular clones from Zambia. Replacing vif from these HIV molecular clones with vif of SIVmac239 resulted in chimeric genomes used to generate infectious stHIV viruses. Likewise, HIV env genes were exchanged, with N375 mutations introduced to enhance binding to the macaque CD4 binding site, and cloned into a SHIVAD8-EO backbone. The generated SHIVs and stHIV were infectious in TZMbl and ZB5 cells, as well as in macaque PBMCs. This method can therefore replace traditional methods, and the rapid generation and testing of stHIV and SHIV molecular clones based on primary clinical isolates will be valuable for producing novel challenge viruses for HIV vaccine/cure studies. PMID:29758076

  10. A Photogrammetric Pipeline for the 3D Reconstruction of CaSSIS Images on Board ExoMars TGO

    NASA Astrophysics Data System (ADS)

    Simioni, E.; Re, C.; Mudric, T.; Pommerol, A.; Thomas, N.; Cremonese, G.

    2017-07-01

    CaSSIS (Colour and Stereo Surface Imaging System) is the stereo imaging system onboard the European Space Agency and ROSCOSMOS ExoMars Trace Gas Orbiter (TGO), which was launched on 14 March 2016 and entered an elliptical Mars orbit on 19 October 2016. During the first bounded orbits, CaSSIS returned its first multiband images, taken on 22 and 26 November 2016. The telescope acquired 11 images, each composed of 30 framelets, of the Martian surface near the Hebes Chasma and Noctis Labyrinthus regions, reaching a distance of 250 km from the surface at closest approach. Despite the eccentricity of this first orbit, CaSSIS provided one stereo pair with a mean ground resolution of 6 m from a mean distance of 520 km. The team at the Astronomical Observatory of Padova (OAPD-INAF) is involved in different stereo-oriented missions and is developing software for the generation of Digital Terrain Models from CaSSIS images. The software will then be adapted for other projects involving stereo camera systems. To compute accurate 3D models, several sequential methods and tools have been developed. The preliminary pipeline provides the generation of rectified images from the CaSSIS framelets, a matching core, and post-processing methods. In particular, the software includes automatic tie-point detection by the Speeded Up Robust Features (SURF) operator, an initial search for correspondences through a Normalized Cross-Correlation (NCC) algorithm, and the Adaptive Least Squares Matching (LSM) algorithm in a hierarchical approach. This work shows a preliminary DTM generated from the first CaSSIS stereo images.
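    The correspondence-search stage named above can be illustrated with a bare-bones Normalized Cross-Correlation matcher. This is a minimal NumPy sketch of NCC alone, not the CaSSIS pipeline: the SURF tie-point detection, least-squares refinement, and hierarchical search are omitted, and the image is synthetic.

    ```python
    import numpy as np

    def ncc(patch, template):
        """Normalized cross-correlation between two equally sized windows."""
        a = patch - patch.mean()
        b = template - template.mean()
        denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
        return float((a * b).sum() / denom) if denom > 0 else 0.0

    def match(image, template):
        """Exhaustive NCC search: return the top-left corner of the best match."""
        th, tw = template.shape
        best, best_pos = -2.0, None
        for i in range(image.shape[0] - th + 1):
            for j in range(image.shape[1] - tw + 1):
                score = ncc(image[i:i + th, j:j + tw], template)
                if score > best:
                    best, best_pos = score, (i, j)
        return best_pos, best

    rng = np.random.default_rng(42)
    image = rng.random((40, 40))
    template = image[12:20, 25:33].copy()   # known ground-truth location
    pos, score = match(image, template)
    print(pos)  # → (12, 25)
    ```

    In a real stereo pipeline this exhaustive search would be restricted to a band around each SURF tie point (or along the epipolar line), and the integer NCC match would seed the sub-pixel least-squares refinement.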

  11. A continuous latitudinal energy balance model to explore non-uniform climate engineering strategies

    NASA Astrophysics Data System (ADS)

    Bonetti, F.; McInnes, C. R.

    2016-12-01

    Current concentrations of atmospheric CO2 exceed historical levels measured in modern times, largely attributed to anthropogenic forcing since the industrial revolution. The required decline in emission rates has never been achieved, leading to recent interest in climate engineering as a future risk-mitigation strategy. Climate engineering aims to offset human-driven climate change. It involves techniques developed both to reduce the concentration of CO2 in the atmosphere (Carbon Dioxide Removal (CDR) methods) and to counteract the radiative forcing that it generates (Solar Radiation Management (SRM) methods). In order to investigate the effects of SRM technologies for climate engineering, an analytical model describing the main dynamics of the Earth's climate has been developed. The model is a time-dependent Energy Balance Model (EBM) with latitudinal resolution and allows for the evaluation of non-uniform climate engineering strategies. A significant disadvantage of climate engineering techniques involving the management of solar radiation is regional disparities in cooling. This model offers an analytical approach to designing multi-objective strategies that counteract climate change on a regional basis: for example, to cool the Arctic while restricting undesired impacts at mid-latitudes, or to control the equator-to-pole temperature gradient. Using the Green's function approach, the resulting partial differential equation allows for the computation of the surface temperature as a function of time and latitude when a 1% per year increase in CO2 concentration is considered. After validation of the model through comparisons with high-fidelity numerical models, it will be used to explore strategies for the injection of aerosol precursors into the stratosphere. In particular, the model involves a detailed description of the optical properties of the particles, the wash-out dynamics, and an estimation of the radiative cooling they can generate.
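    A latitudinal EBM of the general kind described can be sketched in a few lines. The sketch below follows the classic North-style formulation (diffusion in x = sin(latitude), linearized outgoing radiation), stepped to equilibrium with explicit finite differences; the parameter values are standard textbook-style choices, not those of the paper.

    ```python
    import numpy as np

    # One-dimensional energy balance model on x = sin(latitude):
    #   C dT/dt = Q s(x)(1 - albedo) - (A + B T) + D d/dx[(1 - x^2) dT/dx]
    n = 60
    x = np.linspace(-1 + 1.0 / n, 1 - 1.0 / n, n)   # cell centres, poles excluded
    dx = x[1] - x[0]
    Q, A, B, D, C = 340.0, 203.3, 2.09, 0.55, 10.0  # illustrative parameter values
    s = 1.0 - 0.48 * 0.5 * (3 * x ** 2 - 1)         # annual-mean insolation shape
    albedo = 0.3
    T = np.full(n, 10.0)                            # initial temperature (deg C)

    xe = np.concatenate(([-1.0], 0.5 * (x[:-1] + x[1:]), [1.0]))  # cell edges
    dt = 0.005
    for _ in range(40000):
        Te = np.concatenate(([T[0]], T, [T[-1]]))   # ghost cells: no-flux poles
        flux = (1 - xe ** 2) * np.diff(Te) / dx     # diffusive transport at edges
        transport = D * np.diff(flux) / dx
        T = T + dt * (Q * s * (1 - albedo) - (A + B * T) + transport) / C

    print(T[n // 2] > T[0])   # equator ends up warmer than the pole
    ```

    A non-uniform SRM strategy would enter this sketch as a latitude-dependent reduction of the absorbed term `Q * s * (1 - albedo)`, which is what makes regional disparities in cooling directly visible in such a model.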

  12. Detection of multiple damages employing best achievable eigenvectors under Bayesian inference

    NASA Astrophysics Data System (ADS)

    Prajapat, Kanta; Ray-Chaudhuri, Samit

    2018-05-01

    A novel approach is presented in this work to simultaneously localize multiple damaged elements in a structure and estimate the damage severity for each damaged element. For detection of damaged elements, a best achievable eigenvector based formulation has been derived. To deal with noisy data, Bayesian inference is employed in the formulation, wherein the likelihood is formed from the errors between the best achievable eigenvectors and the measured modes. In this approach, the most probable damage locations are evaluated under Bayesian inference by generating combinations of various possible damaged elements. Once damage locations are identified, damage severities are estimated using Bayesian inference with Markov chain Monte Carlo simulation. The efficiency of the proposed approach has been demonstrated through a numerical study involving a 12-story shear building. It has been found that damage scenarios involving as little as 10% loss of stiffness in multiple elements are accurately determined (localized and severities quantified) even when modal data contaminated with 2% noise are used. Further, this study introduces the term "parameter impact" (evaluated based on the sensitivity of modal parameters to structural parameters) to decide the suitability of selecting a particular mode if some idea about the damaged elements is available. It has been demonstrated that the accuracy and efficiency of the Bayesian quantification algorithm increase if damage localization is carried out a priori. An experimental study involving a laboratory-scale shear building and different stiffness-modification scenarios shows that the proposed approach is efficient enough to localize the stories with stiffness modification.
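    The MCMC severity-estimation step can be illustrated with a toy Metropolis sampler over a single stiffness-loss parameter of a one-degree-of-freedom model. Everything here (the structural model, the frequency-based likelihood, the prior, the noise level) is invented for the demo and far simpler than the 12-story, eigenvector-based formulation of the paper.

    ```python
    import math, random

    def model_freq(damage, k0=1.0e6, m=100.0):
        """Natural frequency (Hz) of a 1-DOF oscillator with stiffness loss `damage`."""
        return math.sqrt(k0 * (1.0 - damage) / m) / (2.0 * math.pi)

    rng = random.Random(7)
    true_damage, sigma = 0.10, 0.05
    measured = model_freq(true_damage) + rng.gauss(0.0, sigma)  # noisy "measurement"

    def log_post(damage):
        """Log-posterior: uniform prior on [0, 0.9) times a Gaussian likelihood."""
        if not 0.0 <= damage < 0.9:
            return -math.inf
        r = measured - model_freq(damage)
        return -0.5 * (r / sigma) ** 2

    theta, samples = 0.4, []
    for i in range(20000):                    # Metropolis random-walk sampler
        prop = theta + rng.gauss(0.0, 0.05)
        if math.log(rng.random()) < log_post(prop) - log_post(theta):
            theta = prop
        if i >= 5000:                         # discard burn-in
            samples.append(theta)

    print(round(sum(samples) / len(samples), 2))  # posterior mean damage severity
    ```

    The posterior mean recovers a stiffness loss close to the simulated 10%; in the multi-element case the same sampler would walk over a vector of severities, one per element identified in the localization stage.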

  13. Wire Detection Algorithms for Navigation

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Camps, Octavia I.

    2002-01-01

    In this research we addressed the problem of obstacle detection for low-altitude rotorcraft flight. In particular, the problem of detecting thin wires in the presence of image clutter and noise was studied. Wires present a serious hazard to rotorcraft. Since they are very thin, detecting them early enough for the pilot to take evasive action is difficult, as their images can be less than one or two pixels wide. Two approaches were explored for this purpose. The first approach involved a technique for sub-pixel edge detection and subsequent post-processing in order to reduce false alarms. After reviewing the line-detection literature, an algorithm for sub-pixel edge detection proposed by Steger was identified as having good potential to solve the considered task. The algorithm was tested using a set of images synthetically generated by combining real outdoor images with computer-generated wire images. The performance of the algorithm was evaluated both at the pixel and at the wire level. It was observed that the algorithm performs well, provided that the wires are not too thin (or distant) and that some post-processing is performed to remove false alarms due to clutter. The second approach involved the use of an example-based learning scheme, namely Support Vector Machines. The purpose was to explore the feasibility of an example-based learning approach for the task of detecting wires from their images. Support Vector Machines (SVMs) have emerged as a promising pattern classification tool and have been used in various applications. It was found that this approach is not suitable for very thin wires and, of course, not suitable at all for sub-pixel-thick wires. High dimensionality of the data as such does not present a major problem for SVMs; however, it is desirable to have a large number of training examples, especially for high-dimensional data.
The main difficulty in using SVMs (or any other example-based learning method) is the need for a very good set of positive and negative examples since the performance depends on the quality of the training set.
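    The sub-pixel localization idea underlying Steger-style line detection can be sketched with a parabola fit through the discrete peak of a filter response, which is how a position finer than one pixel is recovered along the line normal. The 1-D Gaussian profile below is a synthetic stand-in for a real wire image, not data from the study.

    ```python
    import numpy as np

    def subpixel_peak(response):
        """Refine the integer peak of a 1-D filter response to sub-pixel accuracy
        by fitting a parabola through the peak sample and its two neighbours."""
        k = int(np.argmax(response))
        y0, y1, y2 = response[k - 1], response[k], response[k + 1]
        denom = y0 - 2.0 * y1 + y2
        offset = 0.0 if denom == 0 else 0.5 * (y0 - y2) / denom
        return k + offset

    # Synthetic line profile: a Gaussian ridge centred between pixels at 10.3.
    x = np.arange(21, dtype=float)
    profile = np.exp(-0.5 * ((x - 10.3) / 1.5) ** 2)
    print(round(subpixel_peak(profile), 1))  # → 10.3
    ```

    Steger's full detector works in 2-D (second-derivative eigenvectors give the line normal, and the quadratic fit is taken along it), but the one-dimensional fit above captures why wires thinner than a pixel can still be localized precisely.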

  14. Electronic hybridization detection in microarray format and DNA genotyping

    NASA Astrophysics Data System (ADS)

    Blin, Antoine; Cissé, Ismaïl; Bockelmann, Ulrich

    2014-02-01

    We describe an approach to substituting a fluorescence microarray with a surface made of an arrangement of electrolyte-gated field effect transistors. This was achieved using a dedicated blocking of non-specific interactions and comparing threshold voltage shifts of transistors exhibiting probe molecules of different base sequence. We apply the approach to detection of the 35delG mutation, which is related to non-syndromic deafness and is one of the most frequent mutations in humans. The process involves barcode sequences that are generated by Tas-PCR, a newly developed replication reaction using polymerase blocking. The barcodes are recognized by hybridization to surface attached probes and are directly detected by the semiconductor device.

  15. Electronic hybridization detection in microarray format and DNA genotyping

    PubMed Central

    Blin, Antoine; Cissé, Ismaïl; Bockelmann, Ulrich

    2014-01-01

    We describe an approach to substituting a fluorescence microarray with a surface made of an arrangement of electrolyte-gated field effect transistors. This was achieved using a dedicated blocking of non-specific interactions and comparing threshold voltage shifts of transistors exhibiting probe molecules of different base sequence. We apply the approach to detection of the 35delG mutation, which is related to non-syndromic deafness and is one of the most frequent mutations in humans. The process involves barcode sequences that are generated by Tas-PCR, a newly developed replication reaction using polymerase blocking. The barcodes are recognized by hybridization to surface attached probes and are directly detected by the semiconductor device. PMID:24569823

  16. Evaluation of Ochratoxin Recognition by Peptides Using Explicit Solvent Molecular Dynamics

    PubMed Central

    Thyparambil, Aby A.; Bazin, Ingrid; Guiseppi-Elie, Anthony

    2017-01-01

    Biosensing platforms based on peptide recognition provide a cost-effective and stable alternative to antibody-based capture and discrimination of ochratoxin-A (OTA) vs. ochratoxin-B (OTB) in monitoring bioassays. Attempts to engineer peptides with improved recognition efficacy require thorough structural and thermodynamic characterization of the binding-competent conformations. Classical molecular dynamics (MD) approaches alone do not provide a thorough assessment of a peptide’s recognition efficacy. In this study, in-solution binding properties of four different peptides, a hexamer (SNLHPK), an octamer (CSIVEDGK), NFO4 (VYMNRKYYKCCK), and a 13-mer (GPAGIDGPAGIRC), which were previously generated for OTA-specific recognition, were evaluated using an advanced MD simulation approach involving accelerated configurational search and predictive modeling. Peptide configurations relevant to ochratoxin binding were initially generated using biased exchange metadynamics and the dynamic properties associated with the in-solution peptide–ochratoxin binding were derived from Markov State Models. Among the various peptides, NFO4 shows superior in-solution OTA sensing and also shows superior selectivity for OTA vs. OTB due to the lower penalty associated with solvating its bound complex. Advanced MD approaches provide structural and energetic insights critical to the hapten-specific recognition to aid the engineering of peptides with better sensing efficacies. PMID:28505090

  17. Efficient Optimization of Low-Thrust Spacecraft Trajectories

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Fink, Wolfgang; Russell, Ryan; Terrile, Richard; Petropoulos, Anastassios; vonAllmen, Paul

    2007-01-01

    A paper describes a computationally efficient method of optimizing trajectories of spacecraft driven by propulsion systems that generate low thrusts and, hence, must be operated for long times. A common goal in trajectory-optimization problems is to find minimum-time, minimum-fuel, or Pareto-optimal trajectories (here, Pareto-optimality signifies that no other solutions are superior with respect to both flight time and fuel consumption). The present method utilizes genetic and simulated-annealing algorithms to search for globally Pareto-optimal solutions. These algorithms are implemented in parallel form to reduce computation time, and are coupled with either of two traditional trajectory-design approaches called "direct" and "indirect." In the direct approach, thrust control is discretized in either arc time or arc length, and the resulting discrete thrust vectors are optimized. The indirect approach involves the primer-vector theory (introduced in 1963), in which the thrust control problem is transformed into a co-state control problem and the initial values of the co-state vector are optimized. In application to two example orbit-transfer problems, this method was found to generate solutions comparable to those of other state-of-the-art trajectory-optimization methods while requiring much less computation time.
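    As an illustration of the simulated-annealing component, the sketch below minimizes a toy multimodal cost in a single control parameter, accepting uphill moves with a probability that shrinks as the temperature cools. The cost function, schedule, and parameters are invented for the demo and are unrelated to the actual trajectory-optimization objective.

    ```python
    import math, random

    def simulated_annealing(cost, x0, step=0.5, t0=5.0, cooling=0.9997,
                            iters=20000, seed=3):
        """Generic simulated annealing: accept uphill moves with probability
        exp(-delta/T), where the temperature T decays geometrically."""
        rng = random.Random(seed)
        x, cx, t = x0, cost(x0), t0
        best_x, best_c = x, cx
        for _ in range(iters):
            cand = x + rng.uniform(-step, step)
            cc = cost(cand)
            if cc < cx or rng.random() < math.exp((cx - cc) / t):
                x, cx = cand, cc                # accept the move
            if cx < best_c:
                best_x, best_c = x, cx          # track the best state seen
            t *= cooling
        return best_x, best_c

    # Toy multimodal cost in one control parameter; global minimum near x = 2.2.
    f = lambda x: (x - 2.0) ** 2 + 1.5 * math.sin(5.0 * x)
    x, c = simulated_annealing(f, x0=-4.0)
    print(round(x, 1), round(c, 2))
    ```

    The early, hot phase lets the search escape the local minima created by the sinusoidal term, which is the property that makes annealing (like the genetic algorithm it is paired with in the paper) suitable for global, Pareto-oriented search.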

  18. Next-Generation Sequencing Approaches in Genome-Wide Discovery of Single Nucleotide Polymorphism Markers Associated with Pungency and Disease Resistance in Pepper.

    PubMed

    Manivannan, Abinaya; Kim, Jin-Hee; Yang, Eun-Young; Ahn, Yul-Kyun; Lee, Eun-Su; Choi, Sena; Kim, Do-Sun

    2018-01-01

    Pepper is an economically important horticultural plant that has been widely used for its pungency and spicy taste in cuisines worldwide; consequently, pepper has been domesticated since antiquity. To meet the growing demand for pepper with high quality, organoleptic properties, nutraceutical contents, and disease tolerance, genomics-assisted breeding techniques can be incorporated to develop novel pepper varieties with desired traits. The application of next-generation sequencing (NGS) approaches has transformed plant breeding technology, especially in the area of molecular marker-assisted breeding. The availability of genomic information aids a deeper understanding of the molecular mechanisms behind vital physiological processes. In addition, NGS methods facilitate the genome-wide discovery of DNA-based markers linked to key genes involved in important biological phenomena. Among molecular markers, single nucleotide polymorphisms (SNPs) offer various benefits in comparison with other existing DNA-based markers. The present review concentrates on the impact of NGS approaches on the discovery of useful SNP markers associated with pungency and disease resistance in pepper. The information provided in the current endeavor can be utilized for the improvement of pepper breeding in the future.

  19. A Supramolecular Approach toward Bioinspired PAMAM-Dendronized Fusion Toxins.

    PubMed

    Kuan, Seah Ling; Förtsch, Christina; Ng, David Yuen Wah; Fischer, Stephan; Tokura, Yu; Liu, Weina; Wu, Yuzhou; Koynov, Kaloian; Barth, Holger; Weil, Tanja

    2016-06-01

    Nature has provided a highly optimized toolbox in bacterial endotoxins with precise functions dictated by their clear structural division. Inspired by this streamlined design, a supramolecular approach capitalizing on the strong biomolecular (streptavidin (SA))-biotin interactions is reported herein to prepare two multipartite fusion constructs, which involves the generation 2.0 (D2) or generation 3.0 (D3) polyamidoamine-dendronized transporter proteins (dendronized streptavidin (D3SA) and dendronized human serum albumin (D2HSA)) non-covalently fused to the C3bot1 enzyme from Clostridium botulinum, a potent and specific Rho-inhibitor. The fusion constructs, D3SA-C3 and D2HSA-C3, represent the first examples of dendronized protein transporters that are fused to the C3 enzyme, and it is successfully demonstrated that the C3 Rho-inhibitor is delivered into the cytosol of mammalian cells as determined from the characteristic C3-mediated changes in cell morphology and confocal microscopy. The design circumvents the low uptake of the C3 enzyme by eukaryotic cells and holds great promise for reprogramming the properties of toxin enzymes using a supramolecular approach to broaden their therapeutic applications. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. A solid-state controller for a wind-driven slip-ring induction generator

    NASA Astrophysics Data System (ADS)

    Velayudhan, C.; Bundell, J. H.; Leary, B. G.

    1984-08-01

    The three-phase induction generator appears to be becoming the preferred choice for wind-powered systems operated in parallel with existing power systems. A problem arises in connection with the useful operating speed range of the squirrel-cage machine, which is relatively narrow, for instance in the range from 1 to 1.15. Efficient extraction of energy from a wind turbine, on the other hand, requires a speed range perhaps as large as 1 to 3. One approach for 'matching' the generator to the turbine for the extraction of maximum power at any usable wind speed involves the use of a slip-ring induction machine. The power demand of the slip-ring machine can be matched to the available output from the wind turbine by modifying the speed-torque characteristics of the generator. A description is presented of a simple electronic rotor resistance controller which can optimize the power taken from a wind turbine over the full speed range.
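    The effect of rotor resistance control described above can be illustrated with Kloss's simplified torque-slip relation: adding external rotor resistance shifts the slip of maximum torque in proportion to the total rotor resistance while leaving the peak torque unchanged, which is what widens the usable speed range. The sketch below is a minimal illustration under that textbook model; all numbers are invented and none of the controller's actual parameters appear in the abstract.

    ```python
    # Hedged sketch (not from the paper): Kloss's simplified torque-slip
    # relation for an induction machine, showing how added external rotor
    # resistance Rext shifts the slip of maximum torque without changing
    # the peak torque itself. All numerical values are illustrative.

    def torque(s, s_max, T_max=1.0):
        """Kloss formula: normalized torque at slip s."""
        return 2.0 * T_max / (s / s_max + s_max / s)

    R2 = 0.05          # intrinsic rotor resistance, ohms (assumed)
    s_max0 = 0.15      # slip of maximum torque with slip rings shorted

    for Rext in (0.0, 0.05, 0.15):
        # s_max scales with total rotor circuit resistance
        s_max = s_max0 * (R2 + Rext) / R2
        peak = max(torque(s, s_max) for s in [i / 1000 for i in range(1, 1000)])
        print(f"Rext={Rext:.2f}  s_max={s_max:.2f}  peak torque={peak:.3f}")
    ```

    The peak stays at the same value while moving to higher slip, which is how the controller matches generator torque to turbine output over a wide speed range.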

  1. Dual gait generative models for human motion estimation from a single camera.

    PubMed

    Zhang, Xin; Fan, Guoliang

    2010-08-01

    This paper presents a general gait representation framework for video-based human motion estimation. Specifically, we want to estimate the kinematics of an unknown gait from image sequences taken by a single camera. This approach involves two generative models, called the kinematic gait generative model (KGGM) and the visual gait generative model (VGGM), which represent the kinematics and appearances of a gait by a few latent variables, respectively. The concept of gait manifold is proposed to capture the gait variability among different individuals by which KGGM and VGGM can be integrated together, so that a new gait with unknown kinematics can be inferred from gait appearances via KGGM and VGGM. Moreover, a new particle-filtering algorithm is proposed for dynamic gait estimation, which is embedded with a segmental jump-diffusion Markov Chain Monte Carlo scheme to accommodate the gait variability in a long observed sequence. The proposed algorithm is trained from the Carnegie Mellon University (CMU) Mocap data and tested on the Brown University HumanEva data with promising results.
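    The dynamic estimation step above is a sequential Monte Carlo (particle filtering) problem: infer latent gait variables from noisy appearance observations over time. The sketch below is a generic bootstrap particle filter on a scalar latent state, not the paper's segmental jump-diffusion MCMC scheme; the random-walk dynamics, Gaussian observation model, and all parameters are invented for illustration.

    ```python
    import math
    import random

    # Hedged sketch: a minimal bootstrap particle filter for a scalar
    # latent state, illustrating the general filtering idea behind
    # video-based motion estimation. This is NOT the paper's segmental
    # jump-diffusion MCMC scheme; the model is invented for illustration.

    random.seed(0)

    def particle_filter(observations, n=500, proc_sd=0.3, obs_sd=0.5):
        particles = [random.gauss(0.0, 1.0) for _ in range(n)]
        estimates = []
        for y in observations:
            # propagate particles through the (assumed) random-walk dynamics
            particles = [p + random.gauss(0.0, proc_sd) for p in particles]
            # weight by the Gaussian observation likelihood
            weights = [math.exp(-0.5 * ((y - p) / obs_sd) ** 2) for p in particles]
            total = sum(weights)
            weights = [w / total for w in weights]
            # posterior-mean estimate, then multinomial resampling
            estimates.append(sum(w * p for w, p in zip(weights, particles)))
            particles = random.choices(particles, weights=weights, k=n)
        return estimates

    # Track a slowly increasing latent signal observed in noise.
    truth = [0.05 * t for t in range(40)]
    obs = [x + random.gauss(0.0, 0.5) for x in truth]
    est = particle_filter(obs)
    print(f"final truth={truth[-1]:.2f}  estimate={est[-1]:.2f}")
    ```

    The real system embeds jump-diffusion moves in this loop so the filter can switch between gait modes along the gait manifold; the resample-propagate-weight skeleton is the same.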

  2. Unstructured Grids for Sonic Boom Analysis and Design

    NASA Technical Reports Server (NTRS)

    Campbell, Richard L.; Nayani, Sudheer N.

    2015-01-01

    An evaluation of two methods for improving the process for generating unstructured CFD grids for sonic boom analysis and design has been conducted. The process involves two steps: the generation of an inner core grid using a conventional unstructured grid generator such as VGRID, followed by the extrusion of a sheared and stretched collar grid through the outer boundary of the core grid. The first method evaluated, known as COB, automatically creates a cylindrical outer boundary definition for use in VGRID that makes the extrusion process more robust. The second method, BG, generates the collar grid by extrusion in a very efficient manner. Parametric studies have been carried out and new options evaluated for each of these codes with the goal of establishing guidelines for best practices for maintaining boom signature accuracy with as small a grid as possible. In addition, a preliminary investigation examining the use of the CDISC design method for reducing sonic boom utilizing these grids was conducted, with initial results confirming the feasibility of a new remote design approach.

  3. Studying marine stratus with large eddy simulation

    NASA Technical Reports Server (NTRS)

    Moeng, Chin-Hoh

    1990-01-01

    Data sets from field experiments over the stratocumulus regime may include complications from larger-scale variations, decoupled cloud layers, the diurnal cycle, or entrainment instability. On top of the already complicated turbulence-radiation-condensation processes within the cloud-topped boundary layer (CTBL), these complexities may sometimes make interpretation of the data sets difficult. To study these processes, a better understanding is needed of the basic processes involved in the prototype CTBL. For example, is cloud-top radiative cooling the primary source of turbulent kinetic energy (TKE) within the CTBL? Historically, laboratory measurements have played an important role in addressing turbulence problems, but the CTBL is a turbulent field that is probably impossible to generate in laboratories. Large eddy simulation (LES) is an alternative way of 'measuring' the turbulent structure under controlled environments, which allows systematic examination of the basic physical processes involved. However, there are problems with the LES approach for the CTBL: the LES data need to be consistent with the observed data. The LES approach is discussed, and results are given which provide some insights into the simulated turbulent flow field. Problems with this approach for the CTBL, and the information from the FIRE experiment needed to justify the LES results, are discussed.

  4. Qualitative evaluation: A critical and interpretative complementary approach to improve health programs and services

    PubMed Central

    Tayabas, Luz María Tejada; León, Teresita Castillo; Espino, Joel Monarrez

    2014-01-01

    This short essay aims at commenting on the origin, development, rationale, and main characteristics of qualitative evaluation (QE), emphasizing the value of this methodological tool to evaluate health programs and services. During the past decades, different approaches have come to light proposing complementary alternatives to appraise the performance of public health programs, mainly focusing on the implementation process involved rather than on measuring the impact of such actions. QE is an alternative tool that can be used to illustrate and understand the process faced when executing health programs. It can also lead to useful suggestions to modify its implementation from the stakeholders’ perspectives, as it uses a qualitative approach that considers participants as reflective subjects, generators of meanings. This implies that beneficiaries become involved in an active manner in the evaluated phenomena with the aim of improving the health programs or services that they receive. With this work we want to encourage evaluators in the field of public health to consider the use of QE as a complementary tool for program evaluation to be able to identify areas of opportunity to improve programs’ implementation processes from the perspective of intended beneficiaries. PMID:25152220

  5. Alzheimer Europe's position on involving people with dementia in research through PPI (patient and public involvement).

    PubMed

    Gove, Dianne; Diaz-Ponce, Ana; Georges, Jean; Moniz-Cook, Esme; Mountain, Gail; Chattat, Rabih; Øksnebjerg, Laila

    2018-06-01

    This paper reflects Alzheimer Europe's position on PPI (patient and public involvement) in the context of dementia research and highlights some of the challenges and potential risks and benefits associated with such meaningful involvement. The paper was drafted by Alzheimer Europe in collaboration with members of INTERDEM and the European Working Group of People with Dementia. It has been formally adopted by the Board of Alzheimer Europe and endorsed by the Board of INTERDEM and by the JPND working group 'Dementia Outcome Measures - Charting New Territory'. Alzheimer Europe is keen to promote the involvement of people with dementia in research, not only as participants but also in the context of PPI, by generating ideas for research, advising researchers, being involved in consultations and being directly involved in research activities. This position paper is in keeping with this objective. Topics covered include, amongst others, planning involvement, establishing roles and responsibilities, training and support, managing information and input from PPI, recognising the contribution of people with dementia involved in research in this way, promoting and protecting their rights and well-being, and promoting an inclusive approach and the necessary infrastructure for PPI in dementia research.

  6. Understanding force-generating microtubule systems through in vitro reconstitution

    PubMed Central

    Kok, Maurits; Dogterom, Marileen

    2016-01-01

    ABSTRACT Microtubules switch between growing and shrinking states, a feature known as dynamic instability. The biochemical parameters underlying dynamic instability are modulated by a wide variety of microtubule-associated proteins that enable the strict control of microtubule dynamics in cells. The forces generated by controlled growth and shrinkage of microtubules drive a large range of processes, including organelle positioning, mitotic spindle assembly, and chromosome segregation. In the past decade, our understanding of microtubule dynamics and microtubule force generation has progressed significantly. Here, we review the microtubule-intrinsic process of dynamic instability, the effect of external factors on this process, and how the resulting forces act on various biological systems. Recently, reconstitution-based approaches have strongly benefited from extensive biochemical and biophysical characterization of individual components that are involved in regulating or transmitting microtubule-driven forces. We will focus on the current state of reconstituting increasingly complex biological systems and provide new directions for future developments. PMID:27715396

  7. Real-time fuzzy inference based robot path planning

    NASA Technical Reports Server (NTRS)

    Pacini, Peter J.; Teichrow, Jon S.

    1990-01-01

    This project addresses the problem of adaptive trajectory generation for a robot arm. Conventional trajectory generation involves computing a path in real time to minimize a performance measure such as expended energy. This method can be computationally intensive, and it may yield poor results if the trajectory is weakly constrained. Typically some implicit constraints are known, but cannot be encoded analytically. The alternative approach used here is to formulate domain-specific knowledge, including implicit and ill-defined constraints, in terms of fuzzy rules. These rules utilize linguistic terms to relate input variables to output variables. Since the fuzzy rulebase is determined off-line, only high-level, computationally light processing is required in real time. Potential applications for adaptive trajectory generation include missile guidance and various sophisticated robot control tasks, such as automotive assembly, high speed electrical parts insertion, stepper alignment, and motion control for high speed parcel transfer systems.
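    The rule-based scheme described above can be made concrete with a tiny fuzzy inference step: triangular membership functions turn a crisp input into degrees of membership in linguistic terms, and the rule consequents are combined by a weighted average (zero-order Sugeno style, chosen here for brevity). The terms, rules, and numbers below are invented for illustration and are not taken from the project.

    ```python
    # Hedged sketch: a minimal fuzzy inference step of the kind the
    # abstract describes (linguistic rules mapping inputs to outputs).
    # Membership functions and rules are invented for illustration.

    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b, falling to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Linguistic terms for a normalized input "distance to goal" in [0, 1]
    near = lambda d: tri(d, -1.0, 0.0, 0.6)
    far = lambda d: tri(d, 0.4, 1.0, 2.0)

    # Rules: IF distance is near THEN speed = slow (0.2);
    #        IF distance is far  THEN speed = fast (0.9).
    def infer_speed(d):
        w_near, w_far = near(d), far(d)
        return (w_near * 0.2 + w_far * 0.9) / (w_near + w_far)

    for d in (0.25, 0.5, 0.75):
        print(f"distance={d:.2f} -> speed={infer_speed(d):.2f}")
    ```

    Because the rulebase is fixed off-line, each real-time evaluation is only a handful of arithmetic operations, which is the computational advantage the abstract claims over on-line trajectory optimization.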

  8. Review of sonic-boom simulation devices and techniques.

    NASA Technical Reports Server (NTRS)

    Edge, P. M., Jr.; Hubbard, H. H.

    1972-01-01

    Research on aircraft-generated sonic booms has led to the development of special techniques to generate controlled sonic-boom-type disturbances without the complications and expense of supersonic flight operations. This paper contains brief descriptions of several of these techniques along with the significant hardware items involved and indicates the advantages and disadvantages of each in research applications. Included are wind tunnels, ballistic ranges, spark discharges, piston phones, shock tubes, high-speed valve systems, and shaped explosive charges. Specialized applications include sonic-boom generation and propagation studies and the responses of structures, terrain, people, and animals. Situations for which simulators are applicable are shown to include both small-scale and large-scale laboratory tests and full-scale field tests. Although no one approach to simulation is ideal, the various techniques available generally complement each other to provide desired capability for a broad range of sonic-boom studies.

  9. Tissue architecture, cell traction, deformable scaffolds, and the forces that shape the embryo during morphogenesis.

    NASA Astrophysics Data System (ADS)

    Davidson, Lance

    2005-03-01

    Morphogenesis is the process of constructing form and shape. Morphogenesis during early development of the embryo involves orchestrated movements of cells and tissues. These morphogenetic movements establish the body plan and organs of the early embryo. The rates and trajectories of these movements depend on three physical features of the early embryo: 1) the forces generated by cells, 2) the mechanical properties of the tissues, and 3) the architecture of the tissues. These three mechanical features of the embryo are some of the earliest phenotypic features generated by the genome. We are taking an interdisciplinary approach combining biophysical, cell biological, and classical embryological techniques to understand the mechanics of morphogenesis. Using nanoNewton-sensitive force transducers we can apply forces and measure time-dependent elastic moduli of tissue fragments 100 micrometers across. Using traction-force microscopy we can measure the forces generated by cells on their environment. We use drugs and chimeric proteins to investigate the localization and function of molecular complexes responsible for force generation and the modulus. We use microsurgery to take apart and construct novel tissues to investigate the role of geometry and architecture in the mechanics of morphogenesis. Together with simulation techniques, these quantitative approaches will provide us with a practical nuts-and-bolts understanding of how the genome encodes the shapes and forms of life.

  10. Highly Efficient Vector-Inversion Pulse Generators

    NASA Technical Reports Server (NTRS)

    Rose, Franklin

    2004-01-01

    Improved transmission-line pulse generators of the vector-inversion type are being developed as lightweight sources of pulsed high voltage for diverse applications, including spacecraft thrusters, portable x-ray imaging systems, impulse radar systems, and corona-discharge systems for sterilizing gases. In this development, more than the customary attention is paid to principles of operation and details of construction so as to maximize the efficiency of the pulse-generation process while minimizing the sizes of components. An important element of this approach is segmenting a pulse generator in such a manner that the electric field in each segment is always below the threshold for electrical breakdown. One design of particular interest, a complete description of which was not available at the time of writing this article, involves two parallel-plate transmission lines that are wound on a mandrel, share a common conductor, and are switched in such a manner that the pulse generator is divided into a "fast" and a "slow" section. A major innovation in this design is the addition of ferrite to the "slow" section to reduce the size of the mandrel needed for a given efficiency.

  11. An Examination of the Role of the United States Army Reserve in Support of the Defense Support of Civil Authorities (DSCA)

    DTIC Science & Technology

    2014-06-13

    portion of this study. I would also like to thank my instructors and colleagues of Staff Group 23 Alpha. This group of instructors took great care in...organizes to support local communities and States in catastrophic incidents. This holistic approach emphasizes the need for the involvement of the whole...intelligence, medical and dental , civil affairs and military information support operations, military police, CBRN, and Soldier Support and Force generation

  12. Synthesis of Methylenebicyclo[3.2.1]octanol by a Sm(II)-Induced 1,2-Rearrangement Reaction with Ring Expansion of Methylenebicyclo[4.2.0]octanone.

    PubMed

    Takatori, Kazuhiko; Ota, Shoya; Tendo, Kenta; Matsunaga, Kazuma; Nagasawa, Kokoro; Watanabe, Shinya; Kishida, Atsushi; Kogen, Hiroshi; Nagaoka, Hiroto

    2017-07-21

    Direct conversion of methylenebicyclo[4.2.0]octanone to methylenebicyclo[3.2.1]octanol by a Sm(II)-induced 1,2-rearrangement with ring expansion of the methylenecyclobutane is described. Three conditions were optimized to allow the adaptation of this approach to various substrates. A rearrangement mechanism is proposed involving the generation of a ketyl radical and cyclopentanation by ketyl-olefin cyclization, followed by radical fragmentation and subsequent protonation.

  13. Improving the Success of Strategic Management Using Big Data.

    PubMed

    Desai, Sapan S; Wilkerson, James; Roberts, Todd

    2016-01-01

    Strategic management involves determining organizational goals, implementing a strategic plan, and properly allocating resources. Poor access to pertinent and timely data misidentifies clinical goals, prevents effective resource allocation, and generates waste from inaccurate forecasting. Loss of operational efficiency diminishes the value stream, adversely impacts the quality of patient care, and hampers effective strategic management. We have pioneered an approach using big data to create competitive advantage by identifying trends in clinical practice, accurately anticipating future needs, and strategically allocating resources for maximum impact.

  14. Basal ganglia circuit loops, dopamine and motivation: A review and enquiry

    PubMed Central

    Ikemoto, Satoshi; Yang, Chen; Tan, Aaron

    2015-01-01

    Dopamine neurons located in the midbrain play a role in motivation that regulates approach behavior (approach motivation). In addition, activation and inactivation of dopamine neurons regulate mood and induce reward and aversion, respectively. Accumulating evidence suggests that this motivational role of dopamine neurons is not limited to those located in the ventral tegmental area but extends to those in the substantia nigra. The present paper reviews previous rodent work concerning dopamine’s role in approach motivation and the connectivity of dopamine neurons, and proposes two working models. One concerns the relationship between extracellular dopamine concentration and approach motivation: high, moderate and low concentrations of extracellular dopamine induce euphoric, seeking and aversive states, respectively. The other concerns circuit loops involving the cerebral cortex, basal ganglia, thalamus, epithalamus, and midbrain through which dopaminergic activity alters approach motivation. These models should help to generate hypothesis-driven research and provide insights for understanding altered states associated with drugs of abuse and affective disorders. PMID:25907747

  15. Statistical Approach To Estimate Vaccinia-Specific Neutralizing Antibody Titers Using a High-Throughput Assay

    PubMed Central

    Kennedy, Richard; Pankratz, V. Shane; Swanson, Eric; Watson, David; Golding, Hana; Poland, Gregory A.

    2009-01-01

    Because of the bioterrorism threat posed by agents such as variola virus, considerable time, resources, and effort have been devoted to biodefense preparation. One avenue of this research has been the development of rapid, sensitive, high-throughput assays to validate immune responses to poxviruses. Here we describe the adaptation of a β-galactosidase reporter-based vaccinia virus neutralization assay to large-scale use in a study that included over 1,000 subjects. We also describe the statistical methods involved in analyzing the large quantity of data generated. The assay and its associated methods should prove useful tools in monitoring immune responses to next-generation smallpox vaccines, studying poxvirus immunity, and evaluating therapeutic agents such as vaccinia virus immune globulin. PMID:19535540
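    The abstract does not specify the titer-estimation method, but a common first step in analyzing neutralization dilution series is estimating the 50% neutralization titer (ID50) by log-linear interpolation between the two dilutions that bracket 50% neutralization. The sketch below shows that generic step under invented dilution and response values; the paper's actual statistical model is richer than this.

    ```python
    import math

    # Hedged sketch: estimating a 50% neutralization titer (ID50) from a
    # serum dilution series by log-linear interpolation between the two
    # dilutions bracketing 50% neutralization. The dilution/response
    # values are invented; this is not the paper's statistical model.

    def id50(dilutions, pct_neutralized):
        """Return the interpolated dilution giving 50% neutralization."""
        pairs = list(zip(dilutions, pct_neutralized))
        for (d1, p1), (d2, p2) in zip(pairs, pairs[1:]):
            if p1 >= 50 >= p2:  # responses fall as serum is diluted out
                frac = (p1 - 50) / (p1 - p2)
                return math.exp(math.log(d1) + frac * (math.log(d2) - math.log(d1)))
        raise ValueError("50% crossing not bracketed by the dilution series")

    dils = [40, 80, 160, 320, 640]    # reciprocal serum dilutions (invented)
    pcts = [95, 85, 60, 35, 10]       # percent neutralization (invented)
    print(f"ID50 = 1:{id50(dils, pcts):.0f}")
    ```

    Interpolating on log-dilution rather than raw dilution reflects the roughly log-linear dose-response typical of such assays.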

  16. Grid generation by elliptic partial differential equations for a tri-element Augmentor-Wing airfoil

    NASA Technical Reports Server (NTRS)

    Sorenson, R. L.

    1982-01-01

    Two efforts to numerically simulate the flow about the Augmentor-Wing airfoil in the cruise configuration using the GRAPE elliptic partial differential equation grid generator algorithm are discussed. The Augmentor-Wing consists of a main airfoil with a slotted trailing edge for blowing and two smaller airfoils shrouding the blowing jet. The airfoil and the algorithm are described, and the application of GRAPE to an unsteady viscous flow simulation and a transonic full-potential approach is considered. The procedure involves dividing a complicated flow region into an arbitrary number of zones and ensuring continuity of grid lines, their slopes, and their point distributions across the zonal boundaries. The method for distributing the body-surface grid points is discussed.
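    The elliptic-smoothing idea underlying GRAPE can be illustrated in its simplest unforced form: iterating the Laplace equations so that each interior grid node relaxes to the average of its four neighbors, with boundary nodes held fixed. GRAPE itself solves the full Poisson system with source terms that control spacing and grid-line angles near boundaries; the stripped-down sketch below (invented 5x5 example) only shows the basic smoothing behavior.

    ```python
    # Hedged sketch: smoothing an algebraic grid by Jacobi iteration of
    # the Laplace equations (each interior node moves to the average of
    # its four neighbors; boundary nodes stay fixed). GRAPE solves the
    # forced Poisson system; this unforced version is illustrative only.

    def laplace_smooth(grid, iters=200):
        ni, nj = len(grid), len(grid[0])
        for _ in range(iters):
            new = [row[:] for row in grid]
            for i in range(1, ni - 1):
                for j in range(1, nj - 1):
                    x = (grid[i-1][j][0] + grid[i+1][j][0] +
                         grid[i][j-1][0] + grid[i][j+1][0]) / 4.0
                    y = (grid[i-1][j][1] + grid[i+1][j][1] +
                         grid[i][j-1][1] + grid[i][j+1][1]) / 4.0
                    new[i][j] = (x, y)
            grid = new
        return grid

    # Start from a deliberately skewed 5x5 grid on the unit square:
    # interior nodes are shifted +0.2 in x, boundary nodes are uniform.
    n = 5
    grid = [[(i / (n - 1) + (0.2 if 0 < i < n - 1 and 0 < j < n - 1 else 0.0),
              j / (n - 1)) for j in range(n)] for i in range(n)]
    smooth = laplace_smooth(grid)
    print(f"center node after smoothing: {smooth[2][2]}")
    ```

    With uniform fixed boundaries the interior relaxes back toward a uniform grid; in a real solver the Poisson source terms would instead pull grid lines toward prescribed spacings.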

  17. Evolution of In-Situ Generated Reinforcement Precipitates in Metal Matrix Composites

    NASA Technical Reports Server (NTRS)

    Sen, S.; Kar, S. K.; Catalina, A. V.; Stefanescu, D. M.; Dhindaw, B. K.

    2004-01-01

    Due to certain inherent advantages, in-situ production of Metal Matrix Composites (MMCs) has received considerable attention in the recent past. In-situ techniques typically involve a chemical reaction that results in precipitation of a ceramic reinforcement phase. The size and spatial distribution of these precipitates ultimately determine the mechanical properties of these MMCs. In this paper we will investigate the validity of using classical growth laws and analytical expressions to describe the interaction between a precipitate and a solid-liquid interface (SLI) to predict the size and spatial evolution of the in-situ generated precipitates. Measurements made on the size and distribution of TiC precipitates in a NiAl matrix will be presented to test the validity of such an approach.

  18. Testing of Environmentally Preferable Aluminum Pretreatments and Coating Systems for Use on Space Shuttle Solid Rocket Boosters (SRB)

    NASA Technical Reports Server (NTRS)

    Clayton, C.; Raley, R.; Zook, L.

    2001-01-01

    The solid rocket booster (SRB) has historically used a chromate conversion coating prior to protective finish application. After conversion coating, an organic paint system consisting of a chromated epoxy primer and polyurethane topcoat is applied. An overall systems approach was selected to reduce waste generation from the coatings application and removal processes. While the most obvious waste reduction opportunity involved elimination of the chromate conversion coating, several other coating system configurations were explored in an attempt to reduce the total waste. This paper will briefly discuss the use of a systems view to reduce waste generation from the coating process and present the results of the qualification testing of nonchromated aluminum pretreatments and alternate coating systems configurations.

  19. Self-consistent formation of electron κ distribution: 1. Theory

    NASA Astrophysics Data System (ADS)

    Yoon, Peter H.; Rhee, Tongnyeol; Ryu, Chang-Mo

    2006-09-01

    Since the early days of plasma physics research, suprathermal electrons have been observed to be generated during beam-plasma laboratory experiments. Energetic electrons, often modeled by κ distributions, are also ubiquitously observed in space. Various particle acceleration mechanisms have been proposed to explain this feature, but all previous theories rely either on qualitative analytical methods or on non-self-consistent approaches. This paper discusses the self-consistent acceleration of electrons to suprathermal energies by weak turbulence processes involving Langmuir/ion-sound turbulence and the beam-plasma interaction. It is shown that the spontaneous scattering process, which is absent in purely collisionless theory, is singularly responsible for the generation of κ distributions. The conclusion is that purely collisionless Vlasov theory cannot produce the suprathermal population.
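    For reference, the isotropic κ distribution commonly fitted to such suprathermal electron populations can be written (in one standard normalization, not taken from this paper) as

    $$ f_\kappa(v) \;=\; \frac{n}{(\pi\,\kappa\,\theta^2)^{3/2}}\,\frac{\Gamma(\kappa+1)}{\Gamma\!\left(\kappa-\tfrac{1}{2}\right)}\left(1+\frac{v^2}{\kappa\,\theta^2}\right)^{-(\kappa+1)}, $$

    where $n$ is the number density, $\theta$ an effective thermal speed, and $\Gamma$ the gamma function. As $\kappa \to \infty$ this reduces to a Maxwellian, while for $v \gg \theta$ it develops the power-law tail $f \propto v^{-2(\kappa+1)}$ that marks the suprathermal population.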

  20. Bimetallic catalysis for C–C and C–X coupling reactions

    PubMed Central

    Pye, Dominic R.

    2017-01-01

    Bimetallic catalysis represents an alternative paradigm for coupling chemistry that complements the more traditional single-site catalysis approach. In this perspective, recent advances in bimetallic systems for catalytic C–C and C–X coupling reactions are reviewed. Behavior which complements that of established single-site catalysts is highlighted. Two major reaction classes are covered. First, generation of catalytic amounts of organometallic species of e.g. Cu, Au, or Ni capable of transmetallation to a Pd co-catalyst (or other traditional cross-coupling catalyst) has allowed important new C–C coupling technologies to emerge. Second, catalytic transformations involving binuclear bond-breaking and/or bond-forming steps, in some cases involving metal–metal bonds, represent a frontier area for C–C and C–X coupling processes.

  1. Public policy: extending psychology's contributions to national priorities.

    PubMed

    DeLeon, Patrick H; Kazdin, Alan E

    2010-08-01

    Much of today's psychological research and practice is relevant to our national health agenda and can serve the public interest. President Obama's landmark health care reform success provides an unprecedented opportunity to revolutionize society's definition of "quality care" and highlight rehabilitation's potential. Advocacy, vision, and a public policy presence with persistence are critical. Those involved often focus exclusively upon specific issues (e.g., reimbursement, research funding, or graduate student support). By developing a "bigger picture" approach addressing society's real needs and embracing the changes technology will ultimately bring, psychology can have a more lasting impact. There are unlimited opportunities to advance the profession through personal involvement in the public policy arena. It is essential that psychology's next generation receives relevant mentoring.

  2. Human engineering analysis for the high speed civil transport flight deck

    NASA Technical Reports Server (NTRS)

    Regal, David M.; Alter, Keith W.

    1993-01-01

    The Boeing Company is investigating the feasibility of building a second generation supersonic transport. If current studies support its viability, this airplane, known as the High Speed Civil Transport (HSCT), could be launched early in the next century. The HSCT will cruise at Mach 2.4, be over 300 feet long, have an initial range of between 5000 and 6000 NM, and carry approximately 300 passengers. We are presently involved in developing an advanced flight deck for the HSCT. As part of this effort we are undertaking a human engineering analysis that involves a top-down, mission driven approach that will allow a systematic determination of flight deck functional and information requirements. The present paper describes this work.

  3. An experimental approach to the fundamental principles of hemodynamics.

    PubMed

    Pontiga, Francisco; Gaytán, Susana P

    2005-09-01

    An experimental model has been developed to give students hands-on experience with the fundamental laws of hemodynamics. The proposed experimental setup is of simple construction but permits precise measurement of the physical variables involved. The model consists of a series of experiments in which different basic phenomena are quantitatively investigated, such as the pressure drop in a long straight vessel and in an obstructed vessel, the transition from laminar to turbulent flow, the association of vessels in vascular networks, or the generation of a critical stenosis. Through these experiments, students acquire a direct appreciation of the importance of the parameters involved in the relationship between pressure and flow rate, thus facilitating the comprehension of more complex problems in hemodynamics.
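    The pressure drop in a long straight vessel, the first phenomenon listed above, is governed by Poiseuille's law. The sketch below evaluates it under roughly blood-like values (all numbers illustrative, not from the paper) and shows the fourth-power radius dependence that makes a stenosis so costly.

    ```python
    import math

    # Hedged sketch: Poiseuille's law for the pressure drop in a long
    # straight rigid tube, the quantity the teaching model measures
    # directly. Values below are illustrative, roughly blood-like (SI).

    def poiseuille_dp(Q, L, r, mu):
        """Pressure drop (Pa) for volumetric flow Q (m^3/s) through a tube
        of length L (m), radius r (m), and fluid viscosity mu (Pa*s)."""
        return 8.0 * mu * L * Q / (math.pi * r ** 4)

    mu = 3.5e-3           # viscosity of blood, Pa*s (assumed)
    Q = 8.3e-5 / 60       # ~83 mL/min converted to m^3/s
    dp = poiseuille_dp(Q, L=0.1, r=2e-3, mu=mu)
    print(f"pressure drop = {dp:.1f} Pa")

    # Halving the radius raises the drop 16-fold: the r^4 dependence.
    print(poiseuille_dp(Q, 0.1, 1e-3, mu) / dp)
    ```

    The same resistance formula, combined in series and parallel like electrical resistors, covers the vascular-network experiments as well.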

  4. Oxygen and Metals Processing on the Moon: Will Materials Science Change Our Future in Space?

    NASA Technical Reports Server (NTRS)

    Sibille, Laurent; Sadoway, Donald R.

    2008-01-01

    As part of an In-Situ Resource Utilization infrastructure on the lunar surface, the production of oxygen and metals by various technologies is under development within NASA projects. Such an effort reflects the ambition to change paradigms in space exploration to enable human presence for the long-term. Sustaining such presence involves the acceptance of a new concept in space activities; crews must be able to generate some of their consumables from local resources. The balance between accepting early development risks and reducing long-term mission risks is at the core of the technology development approach. We will present an overview of the technologies involved and present their possible impact on the future of human expansion in the solar system.

  5. Endocannabinoid system and drug addiction: new insights from mutant mice approaches.

    PubMed

    Maldonado, Rafael; Robledo, Patricia; Berrendero, Fernando

    2013-08-01

    The involvement of the endocannabinoid system in drug addiction was initially studied by the use of compounds with different affinities for each cannabinoid receptor or for the proteins involved in endocannabinoids inactivation. The generation of genetically modified mice with selective mutations in these endocannabinoid system components has now provided important advances in establishing their specific contribution to drug addiction. These genetic tools have identified the particular interest of CB1 cannabinoid receptor and endogenous anandamide as potential targets for drug addiction treatment. Novel genetic tools will allow determining if the modulation of CB2 cannabinoid receptor activity and 2-arachidonoylglycerol tone can also have an important therapeutic relevance for drug addiction. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Perception of chemesthetic stimuli in groups who differ by food involvement and culinary experience.

    PubMed

    Byrnes, Nadia; Loss, Christopher R; Hayes, John E

    2015-12-01

    In the English language, there is generally a limited lexicon when referring to the sensations elicited by chemesthetic stimuli like capsaicin, allyl isothiocyanate, and eugenol, the orally irritating compounds found in chiles, wasabi, and cloves, respectively. Elsewhere, experts and novices have been shown to use language differently, with experts using more precise language. Here, we compare perceptual maps and word usage across three cohorts: experts with formal culinary education, naïve individuals with high Food Involvement Scale (FIS) scores, and naïve individuals with low FIS scores. We hypothesized that increased experience with foods, whether through informal experiential learning or formal culinary education, would have a significant influence on the perceptual maps generated from a sorting task conducted with chemesthetic stimuli, as well as on language use in a descriptive follow-up task to this sorting task. The low- and highFIS non-expert cohorts generated significantly similar maps, though in other respects the highFIS cohort was an intermediate between the lowFIS and expert cohorts. The highFIS and expert cohorts generated more attributes but used language more idiosyncratically than the lowFIS group. Overall, the results from the expert group with formal culinary education differed from the two naïve cohorts both in the perceptual map generated using MDS as well as the mean number of attributes generated. Present data suggest that both formal education and informal experiential learning result in lexical development, but the level and type of learning can have a significant influence on language use and the approach to a sorting task.

  7. Perception of chemesthetic stimuli in groups who differ by food involvement and culinary experience

    PubMed Central

    Byrnes, Nadia; Loss, Christopher R.; Hayes, John E.

    2015-01-01

    In the English language, there is generally a limited lexicon when referring to the sensations elicited by chemesthetic stimuli like capsaicin, allyl isothiocyanate, and eugenol, the orally irritating compounds found in chiles, wasabi, and cloves, respectively. Elsewhere, experts and novices have been shown to use language differently, with experts using more precise language. Here, we compare perceptual maps and word usage across three cohorts: experts with formal culinary education, naïve individuals with high Food Involvement Scale (FIS) scores, and naïve individuals with low FIS scores. We hypothesized that increased experience with foods, whether through informal experiential learning or formal culinary education, would have a significant influence on the perceptual maps generated from a sorting task conducted with chemesthetic stimuli, as well as on language use in a descriptive follow-up task to this sorting task. The low- and high-FIS non-expert cohorts generated significantly similar maps, though in other respects the high-FIS cohort was intermediate between the low-FIS and expert cohorts. The high-FIS and expert cohorts generated more attributes but used language more idiosyncratically than the low-FIS group. Overall, the results from the expert group with formal culinary education differed from the two naïve cohorts both in the perceptual map generated using MDS and in the mean number of attributes generated. Present data suggest that both formal education and informal experiential learning result in lexical development, but the level and type of learning can have a significant influence on language use and the approach to a sorting task. PMID:26516297

  8. Exploring the decision-making process in the delivery of physiotherapy in a stroke unit.

    PubMed

    McGlinchey, Mark P; Davenport, Sally

    2015-01-01

    The aim of this study was to explore the decision-making process in the delivery of physiotherapy in a stroke unit. A focused ethnographic approach involving semi-structured interviews and observations of clinical practice was used. A purposive sample of seven neurophysiotherapists and four patients participated in semi-structured interviews. From this group, three neurophysiotherapists and four patients were involved in observation of practice. Data from interviews and observations were analysed to generate themes. Three themes were identified: planning the ideal physiotherapy delivery, the reality of physiotherapy delivery and involvement in the decision-making process. Physiotherapists used a variety of clinical reasoning strategies and considered many factors that influence their decision-making in the planning and delivery of physiotherapy post-stroke. These factors included the therapist's clinical experience, the patient's presentation and response to therapy, prioritisation, organisational constraints and compliance with organisational practice. All physiotherapists highlighted the importance of involving patients in planning and delivering their physiotherapy. However, varying levels of patient involvement were observed in this process. The study has generated insight into the reality of decision-making in the planning and delivery of physiotherapy post-stroke. Further research involving other stroke units is required to gain a greater understanding of this aspect of physiotherapy. Implications for Rehabilitation: Physiotherapists need to consider multiple patient, therapist and organisational factors when planning and delivering physiotherapy in a stroke unit. Physiotherapists should continually reflect upon how they provide physiotherapy, with respect to the duration, frequency and time of day sessions are delivered, in order to guide current and future physiotherapy delivery. As patients may demonstrate varying levels of participation in deciding and understanding how physiotherapy is delivered, physiotherapists need to adjust how they engage patients in the decision-making process and manage patient expectations accordingly.

  9. Parameter reduction in nonlinear state-space identification of hysteresis

    NASA Astrophysics Data System (ADS)

    Fakhrizadeh Esfahani, Alireza; Dreesen, Philippe; Tiels, Koen; Noël, Jean-Philippe; Schoukens, Johan

    2018-05-01

    Recent work on black-box polynomial nonlinear state-space modeling for hysteresis identification has provided promising results, but struggles with a large number of parameters due to the use of multivariate polynomials. This drawback is tackled in the current paper by applying a decoupling approach that results in a more parsimonious representation involving univariate polynomials. This work is carried out numerically on input-output data generated by a Bouc-Wen hysteretic model and follows up on earlier work of the authors. The current article discusses the polynomial decoupling approach and explores the selection of the number of univariate polynomials along with the polynomial degree. We have found that the presented decoupling approach is able to reduce the number of parameters of the full nonlinear model by up to about 50%, while maintaining a comparable output error level.
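    The parameter saving can be illustrated by counting coefficients. A multivariate polynomial nonlinearity collects all monomials up to a given degree, whereas a decoupled form W·g(Vᵀx) needs only a linear transform V, one univariate polynomial per branch, and a mixing matrix W. The counting below is a simplified sketch with invented sizes; exact parameterizations differ between formulations.

```python
from math import comb

def multivariate_terms(n_vars, degree):
    """Monomials of total degree 2..degree in n_vars variables: the
    nonlinear part of one polynomial state-space equation."""
    return sum(comb(n_vars + k - 1, k) for k in range(2, degree + 1))

def decoupled_terms(n_vars, n_eqs, degree, branches):
    """Decoupled form W g(V^T x): per branch, one row of V (n_vars weights)
    and univariate coefficients for degrees 2..degree, plus the mixing
    matrix W (n_eqs x branches)."""
    return branches * (n_vars + (degree - 1)) + n_eqs * branches

n_vars, n_eqs, degree, branches = 4, 3, 5, 2  # illustrative sizes only
full = n_eqs * multivariate_terms(n_vars, degree)
dec = decoupled_terms(n_vars, n_eqs, degree, branches)
print(full, dec)  # 363 vs 22 nonlinear parameters
```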

  10. Beyond total treatment effects in randomised controlled trials: Baseline measurement of intermediate outcomes needed to reduce confounding in mediation investigations.

    PubMed

    Landau, Sabine; Emsley, Richard; Dunn, Graham

    2018-06-01

    Random allocation avoids confounding bias when estimating the average treatment effect. For continuous outcomes measured at post-treatment as well as prior to randomisation (baseline), analyses based on (A) post-treatment outcome alone, (B) change scores over the treatment phase or (C) conditioning on baseline values (analysis of covariance) provide unbiased estimators of the average treatment effect. The decision to include baseline values of the clinical outcome in the analysis is based on precision arguments, with analysis of covariance known to be most precise. Investigators increasingly carry out explanatory analyses to decompose total treatment effects into components that are mediated by an intermediate continuous outcome and a non-mediated part. Traditional mediation analysis might be performed based on (A) post-treatment values of the intermediate and clinical outcomes alone, (B) respective change scores or (C) conditioning on baseline measures of both intermediate and clinical outcomes. Using causal diagrams and Monte Carlo simulation, we investigated the performance of the three competing mediation approaches. We considered a data generating model that included three possible confounding processes involving baseline variables: The first two processes modelled baseline measures of the clinical variable or the intermediate variable as common causes of post-treatment measures of these two variables. The third process allowed the two baseline variables themselves to be correlated due to past common causes. We compared the analysis models implied by the competing mediation approaches with this data generating model to hypothesise likely biases in estimators, and tested these in a simulation study. We applied the methods to a randomised trial of pragmatic rehabilitation in patients with chronic fatigue syndrome, which examined the role of limiting activities as a mediator. 
Estimates of causal mediation effects derived by approach (A) will be biased if one of the three processes involving baseline measures of intermediate or clinical outcomes is operating. Necessary assumptions for the change score approach (B) to provide unbiased estimates under either process include the independence of baseline measures and change scores of the intermediate variable. Finally, estimates provided by the analysis of covariance approach (C) were found to be unbiased under all the three processes considered here. When applied to the example, there was evidence of mediation under all methods but the estimate of the indirect effect depended on the approach used with the proportion mediated varying from 57% to 86%. Trialists planning mediation analyses should measure baseline values of putative mediators as well as of continuous clinical outcomes. An analysis of covariance approach is recommended to avoid potential biases due to confounding processes involving baseline measures of intermediate or clinical outcomes, and not simply for increased precision.
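    The baseline claim for the total treatment effect, that approaches (A), (B) and (C) are all unbiased under randomisation, can be checked with a small Monte Carlo sketch. The data-generating values below are invented, and no mediator or confounding process is simulated here.

```python
import random
from statistics import mean

random.seed(1)
n, tau = 20_000, 2.0  # hypothetical trial size and true average treatment effect

baseline = [random.gauss(10, 2) for _ in range(n)]   # pre-randomisation outcome
treat = [random.random() < 0.5 for _ in range(n)]    # random allocation
post = [0.7 * b + tau * t + random.gauss(0, 1)       # post-treatment outcome
        for b, t in zip(baseline, treat)]

def diff_in_means(y):
    """Treated-minus-control difference in means."""
    return (mean(v for v, t in zip(y, treat) if t)
            - mean(v for v, t in zip(y, treat) if not t))

tau_a = diff_in_means(post)                                     # (A) post outcome
tau_b = diff_in_means([p - b for p, b in zip(post, baseline)])  # (B) change scores

def residuals(y, x):
    """Residuals of y after simple regression on x (Frisch-Waugh step
    implementing ANCOVA without a matrix library)."""
    mx, my = mean(x), mean(y)
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return [yi - my - slope * (xi - mx) for xi, yi in zip(x, y)]

rp = residuals(post, baseline)
rt = residuals([float(t) for t in treat], baseline)
tau_c = sum(a * b for a, b in zip(rp, rt)) / sum(b * b for b in rt)  # (C) ANCOVA

print(round(tau_a, 2), round(tau_b, 2), round(tau_c, 2))  # all close to 2.0
```

    ANCOVA (C) has the smallest sampling variance, which is the precision argument the abstract mentions; its advantage for mediation analysis is a separate, bias-related point.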

  11. Exploiting the potential of unlabeled endoscopic video data with self-supervised learning.

    PubMed

    Ross, Tobias; Zimmerer, David; Vemuri, Anant; Isensee, Fabian; Wiesenfarth, Manuel; Bodenstedt, Sebastian; Both, Fabian; Kessler, Philip; Wagner, Martin; Müller, Beat; Kenngott, Hannes; Speidel, Stefanie; Kopp-Schneider, Annette; Maier-Hein, Klaus; Maier-Hein, Lena

    2018-06-01

    Surgical data science is a new research field that aims to observe all aspects of the patient treatment process in order to provide the right assistance at the right time. Due to the breakthrough successes of deep learning-based solutions for automatic image annotation, the availability of reference annotations for algorithm training is becoming a major bottleneck in the field. The purpose of this paper was to investigate the concept of self-supervised learning to address this issue. Our approach is guided by the hypothesis that unlabeled video data can be used to learn a representation of the target domain that boosts the performance of state-of-the-art machine learning algorithms when used for pre-training. The core of the method is an auxiliary task, based on raw endoscopic video data of the target domain, that is used to initialize the convolutional neural network (CNN) for the target task. In this paper, we propose the re-colorization of medical images with a conditional generative adversarial network (cGAN)-based architecture as the auxiliary task. A variant of the method involves a second pre-training step based on labeled data for the target task from a related domain. We validate both variants using medical instrument segmentation as the target task. The proposed approach can be used to radically reduce the manual annotation effort involved in training CNNs. Compared to the baseline approach of generating annotated data from scratch, our method reduces the number of labeled images required by up to 75% without sacrificing performance. Our method also outperforms alternative methods for CNN pre-training, such as pre-training on publicly available non-medical (COCO) or medical data (MICCAI EndoVis2017 challenge) using the target task (in this instance: segmentation). As it makes efficient use of available (non-)public and (un-)labeled data, the approach has the potential to become a valuable tool for CNN (pre-)training.
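    The re-colorization auxiliary task needs no manual labels because the raw video supplies both sides of each training pair. A minimal data-preparation sketch of that idea (the grayscale conversion and toy pixels are illustrative; the paper's cGAN itself is not reproduced here):

```python
def to_grayscale(frame):
    """ITU-R BT.601 luma: the network input for the auxiliary task."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for r, g, b in row]
            for row in frame]

def recolorization_pairs(frames):
    """Self-supervised pairs: (grayscale input, original color target).
    No manual annotation is needed; the raw video supplies both sides."""
    return [(to_grayscale(f), f) for f in frames]

# Toy 1x2 "frame" of (R, G, B) pixels standing in for an endoscopic image.
frame = [[(200, 30, 40), (10, 220, 35)]]
(x, y), = recolorization_pairs([frame])
```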

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romanov, V.N.; Ackman, T.E.; Soong, Yee

    The looming global energy and environmental crises underscore a pressing need for the revision of current energy policies. The dominant, albeit somewhat optimistic, public perception is that hundreds of years' worth of coal available for power generation will offset the decline of oil and gas reserves. Although use of coal accounts for half of U.S. electricity generation and for a quarter of world energy consumption, it has until recently been perceived as unwelcome by environmentalists and legislators. For coal power generation to be properly considered, CO2 and other greenhouse gas (GHG) generation and deposition must be addressed to assuage global climate change concerns. Capturing and sequestering CO2 emissions is one of the principal modes of carbon management. Herein we suggest a novel process that captures GHG in abundant materials, which can be facilitated by controlled sequential heating and cooling of these solids. By taking advantage of the properties of waste materials generated during coal production and the exhaust heat generated by the power plants, such an approach permits the integration of the entire CO2 cycle, from generation to deposition. Coupling coal extraction/preparation with power generation facilities would improve the economics of "zero-emission" power plants due to the proximity of all the involved facilities.

  13. Disruption in the diabetic device care market

    PubMed Central

    Mohammed, Raihan

    2018-01-01

    As diabetes mellitus (DM) has approached pandemic proportions, the pressure for effective glycemic management is mounting. The starting point for managing and living well with DM involves early diagnosis and monitoring blood glucose levels. Therefore, self-monitoring of blood glucose (SMBG) can help patients maintain their blood glucose levels within the appropriate range. The general principle behind the current SMBG method involves a finger prick test to obtain a blood drop, which is applied onto a reagent strip and read by an automated device. Novel techniques are currently under evaluation to create the next generation of painless and accurate glucose monitoring for DM. We began by outlining how the emerging technology of the noninvasive glucose monitoring devices (NIGMDs) provides both economic and clinical benefits for health systems and patients. We further explored the engineering and techniques behind these upcoming devices. Finally, we evaluated how the NIGMDs disrupt the diabetic device care market and drive health care consumerism. We postulated that the NIGMDs play a pivotal role in the implementation of next generation of diabetes prevention strategies. PMID:29440935

  14. Flavor changing neutral currents involving heavy quarks with four generations

    NASA Astrophysics Data System (ADS)

    Arhrib, Abdesslam; Hou, Wei-Shu

    2006-07-01

    We study various flavor changing neutral currents (FCNC) involving heavy quarks in the Standard Model (SM) with a sequential fourth generation. After imposing B→Xsγ, B→Xsl+l- and Z→bb̄ constraints, we find the branching ratio B(Z→sb̄+s̄b) can be enhanced by an order of magnitude to 10^-7, while t→cZ, cH decays can reach 10^-6, orders of magnitude higher than in the three-generation SM. However, these rates are still not observable in the near future. With the era of the Large Hadron Collider approaching, we focus on FCNC decays involving fourth generation b' and t' quarks. We calculate the rates for loop-induced FCNC decays b'→bZ, bH, bg, bγ, as well as t'→tZ, tH, tg, tγ. If |Vcb'| is of order |Vcb| ≃ 0.04, the tree-level b'→cW decay would dominate, posing a challenge since b-tagging is less effective. For |Vcb'| << |Vcb|, b'→tW would tend to dominate, while b'→t'W* could also open for heavier b', leading to the possibility of quadruple-W signals via b'b̄'→bb̄W+W-W+W-. The FCNC b'→bZ, bH decays could still dominate if mb' is just above 200 GeV. For the case of t', in general t'→bW would be dominant, hence it behaves like a heavy top. For both b' and t', except for the intriguing light-b' case, FCNC decays are typically in the 10^-4 to 10^-2 range, and are quite detectable at the LHC. For a possible future International Linear Collider, we find the associated production of FCNC e+e-→bs̄, tc̄ is below sensitivity, while e+e-→b'b̄ and t't̄ can be better probed. Tevatron Run-II can still probe the lighter b' or t' scenario. LHC would either discover the fourth generation and measure the FCNC rates, or rule out the fourth generation conclusively. If discovered, the ILC can study the b' or t' decay modes in detail.

  15. The unholy trinity: taxonomy, species delimitation and DNA barcoding

    PubMed Central

    DeSalle, Rob; Egan, Mary G; Siddall, Mark

    2005-01-01

    Recent excitement over the development of an initiative to generate DNA sequences for all named species on the planet has in our opinion generated two major areas of contention as to how this ‘DNA barcoding’ initiative should proceed. It is critical that these two issues are clarified and resolved, before the use of DNA as a tool for taxonomy and species delimitation can be universalized. The first issue concerns how DNA data are to be used in the context of this initiative; this is the DNA barcode reader problem (or barcoder problem). Currently, many of the published studies under this initiative have used tree building methods and more precisely distance approaches to the construction of the trees that are used to place certain DNA sequences into a taxonomic context. The second problem involves the reaction of the taxonomic community to the directives of the ‘DNA barcoding’ initiative. This issue is extremely important in that the classical taxonomic approach and the DNA approach will need to be reconciled in order for the ‘DNA barcoding’ initiative to proceed with any kind of community acceptance. In fact, we feel that DNA barcoding is a misnomer. Our preference is for the title of the London meetings—Barcoding Life. In this paper we discuss these two concerns generated around the DNA barcoding initiative and attempt to present a phylogenetic systematic framework for an improved barcoder as well as a taxonomic framework for interweaving classical taxonomy with the goals of ‘DNA barcoding’. PMID:16214748

  16. Functional Genomic Screening Approaches in Mechanistic Toxicology and Potential Future Applications of CRISPR-Cas9

    PubMed Central

    Shen, Hua; McHale, Cliona M.; Smith, Martyn T.; Zhang, Luoping

    2015-01-01

    Characterizing variability in the extent and nature of responses to environmental exposures is a critical aspect of human health risk assessment. Chemical toxicants act by many different mechanisms, however, and the genes involved in adverse outcome pathways (AOPs) and AOP networks are not yet characterized. Functional genomic approaches can reveal both toxicity pathways and susceptibility genes, through knockdown or knockout of all non-essential genes in a cell of interest, and identification of genes associated with a toxicity phenotype following toxicant exposure. Screening approaches in yeast and human near-haploid leukemic KBM7 cells have identified roles for genes and pathways involved in response to many toxicants, but are limited by partial homology among yeast and human genes and limited relevance to normal diploid cells. RNA interference (RNAi) suppresses mRNA expression levels but is limited by off-target effects (OTEs) and incomplete knockdown. The recently developed gene-editing approach based on clustered regularly interspaced short palindromic repeats and the associated nuclease Cas9 (CRISPR-Cas9) can precisely knock out most regions of the genome at the DNA level with fewer OTEs than RNAi, in multiple human cell types, thus overcoming the limitations of the other approaches. It has been used to identify genes involved in the response to chemical and microbial toxicants in several human cell types and could readily be extended to the systematic screening of large numbers of environmental chemicals. CRISPR-Cas9 can also repress and activate gene expression, including that of non-coding RNA, with near-saturation, thus offering the potential to more fully characterize AOPs and AOP networks. Finally, CRISPR-Cas9 can generate complex animal models in which to conduct preclinical toxicity testing at the level of individual genotypes or haplotypes. 
Therefore, CRISPR-Cas9 is a powerful and flexible functional genomic screening approach that can be harnessed to provide unprecedented mechanistic insight in the field of modern toxicology. PMID:26041264

  17. Development of a Novel Guided Wave Generation System Using a Giant Magnetostrictive Actuator for Nondestructive Evaluation

    PubMed Central

    Luo, Mingzhang; Li, Weijie; Wang, Junming; Chen, Xuemin; Song, Gangbing

    2018-01-01

    As a common approach to nondestructive testing and evaluation, guided wave-based methods have attracted much attention because of their wide detection range and high detection efficiency. It is highly desirable to develop a portable guided wave testing system with high actuating energy and variable frequency. In this paper, a novel giant magnetostrictive actuator with high actuation power is designed and implemented, based on the giant magnetostrictive (GMS) effect. The novel GMS actuator design involves a conical energy-focusing head that can focus the amplified mechanical energy generated by the GMS actuator. This design enables the generation of stress waves with high energy, and the focusing of the generated stress waves on the test object. The guided wave generation system enables two kinds of output modes: the coded pulse signal and the sweep signal. The functionality and the advantages of the developed system are validated through laboratory testing in the quality assessment of rock bolt-reinforced structures. In addition, the developed GMS actuator and the supporting system are successfully implemented and applied in field tests. The device can also be used in other nondestructive testing and evaluation applications that require high-power stress wave generation. PMID:29510540

  18. Development of a Novel Guided Wave Generation System Using a Giant Magnetostrictive Actuator for Nondestructive Evaluation.

    PubMed

    Luo, Mingzhang; Li, Weijie; Wang, Junming; Wang, Ning; Chen, Xuemin; Song, Gangbing

    2018-03-04

    As a common approach to nondestructive testing and evaluation, guided wave-based methods have attracted much attention because of their wide detection range and high detection efficiency. It is highly desirable to develop a portable guided wave testing system with high actuating energy and variable frequency. In this paper, a novel giant magnetostrictive actuator with high actuation power is designed and implemented, based on the giant magnetostrictive (GMS) effect. The novel GMS actuator design involves a conical energy-focusing head that can focus the amplified mechanical energy generated by the GMS actuator. This design enables the generation of stress waves with high energy, and the focusing of the generated stress waves on the test object. The guided wave generation system enables two kinds of output modes: the coded pulse signal and the sweep signal. The functionality and the advantages of the developed system are validated through laboratory testing in the quality assessment of rock bolt-reinforced structures. In addition, the developed GMS actuator and the supporting system are successfully implemented and applied in field tests. The device can also be used in other nondestructive testing and evaluation applications that require high-power stress wave generation.
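    Of the two output modes, the sweep signal is straightforward to sketch as a linear chirp whose instantaneous frequency ramps from a start to an end value. The parameter values below are illustrative, not the device's specifications.

```python
import math

def linear_sweep(f0, f1, duration, rate):
    """Samples of a linear chirp whose instantaneous frequency ramps
    from f0 to f1 Hz over `duration` seconds at `rate` samples/s."""
    k = (f1 - f0) / duration  # chirp rate in Hz per second
    return [math.sin(2 * math.pi * (f0 * t + 0.5 * k * t * t))
            for t in (i / rate for i in range(int(duration * rate)))]

# Illustrative values only: 1 kHz -> 5 kHz over 10 ms at 100 kS/s.
samples = linear_sweep(1_000, 5_000, 0.01, 100_000)
```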

  19. State-Chart Autocoder

    NASA Technical Reports Server (NTRS)

    Clark, Kenneth; Watney, Garth; Murray, Alexander; Benowitz, Edward

    2007-01-01

    A computer program translates Unified Modeling Language (UML) representations of state charts into source code in the C, C++, and Python computing languages. ("State charts" here signifies graphical descriptions of the states and state transitions of a spacecraft or other complex system.) The UML representations constituting the input to this program are generated by using a UML-compliant graphical design program to draw the state charts. The generated source code is consistent with the "quantum programming" approach, so named because it involves discrete states and state transitions that have features in common with states and state transitions in quantum mechanics. Quantum programming enables efficient implementation of state charts, suitable for real-time embedded flight software. In addition to source code, the autocoder program generates a graphical-user-interface (GUI) program that, in turn, generates a display of state transitions in response to events triggered by the user. The GUI program is wrapped around, and can be used to exercise the state-chart behavior of, the generated source code. Once the expected state-chart behavior is confirmed, the generated source code can be augmented with a software interface to the rest of the software with which it is required to interact.
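    A hand-written miniature of the kind of dispatch logic such an autocoder emits: each state is a handler that consumes one event and returns the next state. The state and event names here are invented, and this sketch omits the hierarchical and entry/exit features real state charts have.

```python
# Each state is a handler: it consumes one event and returns the next state.
def idle(event):
    return active if event == "start" else idle

def active(event):
    if event == "stop":
        return idle
    if event == "pause":
        return paused
    return active

def paused(event):
    return active if event == "resume" else paused

def run(events, state=idle):
    """Dispatch a sequence of events through the state chart."""
    for event in events:
        state = state(event)
    return state

final = run(["start", "pause", "resume", "stop"])  # ends back in idle
```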

  20. Parallel tiled Nussinov RNA folding loop nest generated using both dependence graph transitive closure and loop skewing.

    PubMed

    Palkowski, Marek; Bielecki, Wlodzimierz

    2017-06-02

    RNA secondary structure prediction is a compute-intensive task that lies at the core of several search algorithms in bioinformatics. Fortunately, RNA folding approaches, such as the Nussinov base pair maximization, involve mathematical operations over affine control loops whose iteration space can be represented by the polyhedral model. Polyhedral compilation techniques have proven to be a powerful tool for optimization of dense array codes. However, classical affine loop nest transformations used with these techniques do not effectively optimize dynamic-programming codes for RNA structure prediction. The purpose of this paper is to present a novel approach for generating a parallel tiled Nussinov RNA loop nest that exposes significantly higher performance than known related codes. This is achieved by improving code locality and parallelizing calculations. In order to improve code locality, we apply our previously published technique of automatic loop nest tiling to all three loops of the Nussinov loop nest. This approach first forms original rectangular 3D tiles and then corrects them to establish their validity by means of applying the transitive closure of a dependence graph. To produce parallel code, we apply the loop skewing technique to the tiled Nussinov loop nest. The technique is implemented as a part of the publicly available polyhedral source-to-source TRACO compiler. The generated code was run on modern Intel multi-core processors and coprocessors. We present the speed-up factor of the generated Nussinov RNA parallel code and demonstrate that it is considerably faster than related codes in which only the two outer loops of the Nussinov loop nest are tiled.
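    For reference, the underlying Nussinov base-pair maximization recurrence (before any tiling or parallelization) can be written as a plain dynamic program over a triangular table. This is a textbook sketch, not the paper's optimized TRACO-generated code.

```python
def nussinov(seq, min_loop=0):
    """Base-pair maximisation: N[i][j] = max pairs formable in seq[i..j]."""
    wc = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    N = [[0] * n for _ in range(n)]
    for span in range(1, n):               # fill by increasing subsequence length
        for i in range(n - span):
            j = i + span
            best = N[i + 1][j]             # case 1: base i left unpaired
            for k in range(i + 1, j + 1):  # case 2: base i pairs with base k
                if (seq[i], seq[k]) in wc and k - i > min_loop:
                    left = N[i + 1][k - 1] if k > i + 1 else 0
                    right = N[k + 1][j] if k < j else 0
                    best = max(best, 1 + left + right)
            N[i][j] = best
    return N[0][n - 1] if n else 0

print(nussinov("GGGAAACCC"))  # 3 (three nested G-C pairs)
```

    The three nested loops over `span`, `i` and `k` are exactly the affine loop nest whose iteration space the polyhedral tiling and skewing transformations operate on.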

  1. Indirect methods for reference interval determination - review and recommendations.

    PubMed

    Jones, Graham R D; Haeckel, Rainer; Loh, Tze Ping; Sikaris, Ken; Streichert, Thomas; Katayev, Alex; Barth, Julian H; Ozarda, Yesim

    2018-04-19

    Reference intervals are a vital part of the information supplied by clinical laboratories to support interpretation of numerical pathology results such as those produced in clinical chemistry and hematology laboratories. The traditional method for establishing reference intervals, known as the direct approach, is based on collecting samples from members of a preselected reference population, making the measurements and then determining the intervals. An alternative approach is to analyse results generated as part of routine pathology testing, using appropriate statistical techniques to determine reference intervals. This is known as the indirect approach. This paper from a working group of the International Federation of Clinical Chemistry (IFCC) Committee on Reference Intervals and Decision Limits (C-RIDL) aims to summarize current thinking on indirect approaches to reference intervals. The indirect approach has some major potential advantages compared with direct methods. The processes are faster, cheaper and do not involve patient inconvenience, discomfort or the risks associated with generating new patient health information. Indirect methods also use the same preanalytical and analytical techniques used for patient management and can provide very large numbers for assessment. Limitations of the indirect methods include possible effects of diseased subpopulations on the derived interval. The IFCC C-RIDL aims to encourage the use of indirect methods to establish and verify reference intervals, to promote publication of such intervals with clear explanation of the process used, and also to support the development of improved statistical techniques for these studies.
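    A deliberately naive sketch of the indirect idea: take the central 95% of routine results as the interval. It also shows the stated limitation, since a diseased subpopulation drags the lower limit down. All values are simulated; real indirect methods first try to isolate the non-diseased distribution.

```python
import random
from statistics import quantiles

random.seed(7)
# Simulated routine results: a healthy majority plus a small diseased
# subpopulation - the confounding limitation noted above.
results = ([random.gauss(140, 2) for _ in range(980)]
           + [random.gauss(125, 3) for _ in range(20)])

def indirect_interval(values):
    """Naive indirect estimate: the central 95% of all routine results.
    Serious indirect methods first try to isolate the non-diseased
    distribution (e.g. by mixture modelling); this sketch skips that."""
    cuts = quantiles(values, n=40)  # cut points every 2.5%
    return cuts[0], cuts[-1]        # 2.5th and 97.5th percentiles

low, high = indirect_interval(results)
print(round(low, 1), round(high, 1))  # lower limit pulled down by diseased tail
```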

  2. Next generation of network medicine: interdisciplinary signaling approaches.

    PubMed

    Korcsmaros, Tamas; Schneider, Maria Victoria; Superti-Furga, Giulio

    2017-02-20

    In the last decade, network approaches have transformed our understanding of biological systems. Network analyses and visualizations have allowed us to identify essential molecules and modules in biological systems, and improved our understanding of how changes in cellular processes can lead to complex diseases, such as cancer, infectious and neurodegenerative diseases. "Network medicine" involves unbiased large-scale network-based analyses of diverse data describing interactions between genes, diseases, phenotypes, drug targets, drug transport, drug side-effects, disease trajectories and more. In terms of drug discovery, network medicine exploits our understanding of the network connectivity and signaling system dynamics to help identify optimal, often novel, drug targets. Contrary to initial expectations, however, network approaches have not yet delivered a revolution in molecular medicine. In this review, we propose that a key reason for the limited impact, so far, of network medicine is a lack of quantitative multi-disciplinary studies involving scientists from different backgrounds. To support this argument, we present existing approaches from structural biology, 'omics' technologies (e.g., genomics, proteomics, lipidomics) and computational modeling that point towards how multi-disciplinary efforts allow for important new insights. We also highlight some breakthrough studies as examples of the potential of these approaches, and suggest ways to make greater use of the power of interdisciplinarity. This review reflects discussions held at an interdisciplinary signaling workshop which facilitated knowledge exchange from experts from several different fields, including in silico modelers, computational biologists, biochemists, geneticists, molecular and cell biologists as well as cancer biologists and pharmacologists.

  3. A Review of the Water and Energy Sectors and the Use of a Nexus Approach in Abu Dhabi.

    PubMed

    Paul, Parneet; Al Tenaiji, Ameena Kulaib; Braimah, Nuhu

    2016-03-25

    Rapid population increase coupled with urbanization and industrialization has resulted in shortages of water in the Middle East. This situation is further exacerbated by global climate change due to greenhouse gas emissions. Recent research advocates that solutions to the global water security and scarcity crisis must involve water-energy nexus approaches. This means adopting policies and strategies that harmonize these inter-related sectors to minimize environmental impact while maximizing human benefit. In the case of Abu Dhabi, when designing and locating oil/gas refineries and associated power generation facilities, previous relevant decisions were based on simple economic and geographical grounds, such as nearness to oil rigs, pipelines, existing industries and port facilities. The subsequent design and location of water abstraction and treatment works operated by the waste heat from these refining and/or power generation processes was catered for as an afterthought, meaning that there is now a mismatch between the water and energy supplies and demands. This review study was carried out to show how Abu Dhabi is now trying to integrate its water-energy sectors using a nexus approach so that future water/power infrastructure is designed optimally and operated in harmony, especially in regard to future demand. Based upon this review work, some recommendations are made for designers and policy makers alike to bolster the nexus approach that Abu Dhabi is pursuing.

  4. Navigating complexity through knowledge coproduction: Mainstreaming ecosystem services into disaster risk reduction.

    PubMed

    Reyers, Belinda; Nel, Jeanne L; O'Farrell, Patrick J; Sitas, Nadia; Nel, Deon C

    2015-06-16

    Achieving the policy and practice shifts needed to secure ecosystem services is hampered by the inherent complexities of ecosystem services and their management. Methods for the participatory production and exchange of knowledge offer an avenue to navigate this complexity together with the beneficiaries and managers of ecosystem services. We develop and apply a knowledge coproduction approach based on social-ecological systems research and assess its utility in generating shared knowledge and action for ecosystem services. The approach was piloted in South Africa across four case studies aimed at reducing the risk of disasters associated with floods, wildfires, storm waves, and droughts. Different configurations of stakeholders (knowledge brokers, assessment teams, implementers, and bridging agents) were involved in collaboratively designing each study, generating and exchanging knowledge, and planning for implementation. The approach proved useful in the development of shared knowledge on the sizable contribution of ecosystem services to disaster risk reduction. This knowledge was used by stakeholders to design and implement several actions to enhance ecosystem services, including new investments in ecosystem restoration, institutional changes in the private and public sector, and innovative partnerships of science, practice, and policy. By bringing together multiple disciplines, sectors, and stakeholders to jointly produce the knowledge needed to understand and manage a complex system, knowledge coproduction approaches offer an effective avenue for the improved integration of ecosystem services into decision making.

  5. Navigating complexity through knowledge coproduction: Mainstreaming ecosystem services into disaster risk reduction

    PubMed Central

    Reyers, Belinda; Nel, Jeanne L.; O’Farrell, Patrick J.; Sitas, Nadia; Nel, Deon C.

    2015-01-01

    Achieving the policy and practice shifts needed to secure ecosystem services is hampered by the inherent complexities of ecosystem services and their management. Methods for the participatory production and exchange of knowledge offer an avenue to navigate this complexity together with the beneficiaries and managers of ecosystem services. We develop and apply a knowledge coproduction approach based on social–ecological systems research and assess its utility in generating shared knowledge and action for ecosystem services. The approach was piloted in South Africa across four case studies aimed at reducing the risk of disasters associated with floods, wildfires, storm waves, and droughts. Different configurations of stakeholders (knowledge brokers, assessment teams, implementers, and bridging agents) were involved in collaboratively designing each study, generating and exchanging knowledge, and planning for implementation. The approach proved useful in the development of shared knowledge on the sizable contribution of ecosystem services to disaster risk reduction. This knowledge was used by stakeholders to design and implement several actions to enhance ecosystem services, including new investments in ecosystem restoration, institutional changes in the private and public sector, and innovative partnerships of science, practice, and policy. By bringing together multiple disciplines, sectors, and stakeholders to jointly produce the knowledge needed to understand and manage a complex system, knowledge coproduction approaches offer an effective avenue for the improved integration of ecosystem services into decision making. PMID:26082541

  6. A Review of the Water and Energy Sectors and the Use of a Nexus Approach in Abu Dhabi

    PubMed Central

    Paul, Parneet; Al Tenaiji, Ameena Kulaib; Braimah, Nuhu

    2016-01-01

    Rapid population increase coupled with urbanization and industrialization has resulted in shortages of water in the Middle East. This situation is further exacerbated by global climate change due to greenhouse gas emissions. Recent research advocates that solutions to the global water security and scarcity crisis must involve water–energy nexus approaches. This means adopting policies and strategies that harmonize these inter-related sectors to minimize environmental impact while maximizing human benefit. In the case of Abu Dhabi, when designing and locating oil/gas refineries and associated power generation facilities, previous relevant decisions were based on simple economic and geographical grounds, such as nearness to oil rigs, pipelines, existing industries and port facilities, etc. The subsequent design and location of water abstraction and treatment works operated by the waste heat from these refining and/or power generation processes was catered for as an afterthought, meaning that there is now a mismatch between the water and energy supplies and demands. This review study was carried out to show how Abu Dhabi is now trying to integrate its water–energy sectors using a nexus approach so that future water/power infrastructure is designed optimally and operated in harmony, especially in regard to future demand. Based upon this review work, some recommendations are made for designers and policy makers alike to bolster the nexus approach that Abu Dhabi is pursuing. PMID:27023583

  7. Differences in Parental Involvement Typologies among Baby Boomers, Generation X, and Generation Y Parents: A Study of Select Bay Area Region of Houston Elementary Schools

    ERIC Educational Resources Information Center

    Veloz, Elizabeth Andrea

    2010-01-01

    The purpose of this study was to determine whether differences existed among generations (Baby Boomers, Generation X, and Generation Y) regarding the levels of parental involvement within each of these generations. Also examined were additional factors such as the parents' socioeconomic status, educational level, marital status, and ethnicity. The…

  8. Introduction to Bayesian statistical approaches to compositional analyses of transgenic crops 1. Model validation and setting the stage.

    PubMed

    Harrison, Jay M; Breeze, Matthew L; Harrigan, George G

    2011-08-01

    Statistical comparisons of compositional data generated on genetically modified (GM) crops and their near-isogenic conventional (non-GM) counterparts typically rely on classical significance testing. This manuscript presents an introduction to Bayesian methods for compositional analysis along with recommendations for model validation. The approach is illustrated using protein and fat data from two herbicide tolerant GM soybeans (MON87708 and MON87708×MON89788) and a conventional comparator grown in the US in 2008 and 2009. Guidelines recommended by the US Food and Drug Administration (FDA) in conducting Bayesian analyses of clinical studies on medical devices were followed. This study is the first Bayesian approach to GM and non-GM compositional comparisons. The evaluation presented here supports a conclusion that a Bayesian approach to analyzing compositional data can provide meaningful and interpretable results. We further describe the importance of method validation and approaches to model checking if Bayesian approaches to compositional data analysis are to be considered viable by scientists involved in GM research and regulation. Copyright © 2011 Elsevier Inc. All rights reserved.
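    The Bayesian comparison of composition means described above can be illustrated with a toy sketch. The protein values below are hypothetical, and this conjugate-normal model (vague prior on each mean, sample variance plugged in) is only a minimal stand-in for the fuller analysis the study describes, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(0)

def posterior_mean_diff(x, y, prior_var=100.0, n_draws=10000):
    """Monte Carlo draws of mu_x - mu_y under independent normal models with
    a vague normal(0, prior_var) prior on each mean and the sample variance
    treated as known. P(mu_x > mu_y) is then a simple average over draws."""
    def post(d):
        n, xbar, s2 = len(d), np.mean(d), np.var(d, ddof=1)
        # Conjugate normal update: precisions add, means combine precision-weighted.
        prec = 1.0 / prior_var + n / s2
        mean = (n * xbar / s2) / prec
        return rng.normal(mean, np.sqrt(1.0 / prec), n_draws)
    return post(x) - post(y)

# Hypothetical protein measurements (% dry weight) for a GM line and its comparator.
gm   = np.array([40.1, 39.8, 40.5, 40.2, 39.9])
conv = np.array([38.9, 39.2, 39.0, 38.7, 39.1])
draws = posterior_mean_diff(gm, conv)
prob = (draws > 0).mean()   # posterior probability the GM mean is higher
```

With these made-up data the posterior mass lies almost entirely above zero; a Bayesian report would state this probability (or a credible interval for the difference) rather than a p-value.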

  9. Boolean network inference from time series data incorporating prior biological knowledge.

    PubMed

    Haider, Saad; Pal, Ranadip

    2012-01-01

    Numerous approaches exist for modeling of genetic regulatory networks (GRNs) but the low sampling rates often employed in biological studies prevent the inference of detailed models from experimental data. In this paper, we analyze the issues involved in estimating a model of a GRN from single cell line time series data with limited time points. We present an inference approach for a Boolean Network (BN) model of a GRN from limited transcriptomic or proteomic time series data based on prior biological knowledge of connectivity, constraints on attractor structure and robust design. We applied our inference approach to 6 time point transcriptomic data on Human Mammary Epithelial Cell line (HMEC) after application of Epidermal Growth Factor (EGF) and generated a BN with a plausible biological structure satisfying the data. We further defined and applied a similarity measure to compare synthetic BNs and BNs generated through the proposed approach constructed from transitions of various paths of the synthetic BNs. We have also compared the performance of our algorithm with two existing BN inference algorithms. Through theoretical analysis and simulations, we showed the rarity of arriving at a BN from limited time series data with plausible biological structure using random connectivity and absence of structure in data. The framework, when applied to experimental data and data generated from synthetic BNs, was able to estimate BNs with high similarity scores. Comparison with existing BN inference algorithms showed the better performance of our proposed algorithm for limited time series data. The proposed framework can also be applied to optimize the connectivity of a GRN from experimental data when the prior biological knowledge on regulators is limited or not unique.
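    The Boolean Network dynamics and attractor structure that the abstract's constraints refer to can be sketched with a hypothetical 3-gene network (this is an illustration of the BN formalism, not the authors' inference algorithm or their HMEC model):

```python
from itertools import product

# Hypothetical regulatory rules over state s = (A, B, C):
# C represses A, A activates B, B activates C.
rules = [
    lambda s: not s[2],  # A' = NOT C
    lambda s: s[0],      # B' = A
    lambda s: s[1],      # C' = B
]

def step(state):
    """Apply all update rules synchronously to produce the next state."""
    return tuple(bool(r(state)) for r in rules)

def attractors():
    """Enumerate attractors by following each of the 2^n states until a state repeats."""
    found = set()
    for start in product([False, True], repeat=3):
        seen, s = [], start
        while s not in seen:
            seen.append(s)
            s = step(s)
        cycle = tuple(seen[seen.index(s):])  # the repeating cycle reached from `start`
        # Canonicalize the cycle's rotation so each attractor is counted once.
        rotations = [cycle[i:] + cycle[:i] for i in range(len(cycle))]
        found.add(min(rotations))
    return found
```

For this toy network every state lies on a cycle (the update map is invertible), giving one 6-state and one 2-state attractor; prior knowledge of which attractors are biologically plausible is what constrains inference when time points are scarce.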

  10. Performing label-fusion-based segmentation using multiple automatically generated templates.

    PubMed

    Chakravarty, M Mallar; Steadman, Patrick; van Eede, Matthijs C; Calcott, Rebecca D; Gu, Victoria; Shaw, Philip; Raznahan, Armin; Collins, D Louis; Lerch, Jason P

    2013-10-01

    Classically, model-based segmentation procedures match magnetic resonance imaging (MRI) volumes to an expertly labeled atlas using nonlinear registration. The accuracy of these techniques is limited due to atlas biases, misregistration, and resampling error. Multi-atlas-based approaches are used as a remedy and involve matching each subject to a number of manually labeled templates. This approach yields numerous independent segmentations that are fused using a voxel-by-voxel label-voting procedure. In this article, we demonstrate how the multi-atlas approach can be extended to work with input atlases that are unique and extremely time-consuming to construct by generating a library of multiple automatically generated templates of different brains (MAGeT Brain). We demonstrate the efficacy of our method for the mouse and human using two different nonlinear registration algorithms (ANIMAL and ANTs). The input atlases consist of a high-resolution mouse brain atlas and an atlas of the human basal ganglia and thalamus derived from serial histological data. MAGeT Brain segmentation improves the identification of the mouse anterior commissure (mean Dice kappa value κ = 0.801), but may be encountering a ceiling effect for hippocampal segmentations. Applying MAGeT Brain to human subcortical structures improves segmentation accuracy for all structures compared to regular model-based techniques (κ = 0.845, 0.752, and 0.861 for the striatum, globus pallidus, and thalamus, respectively). Experiments performed with three manually derived input templates suggest that MAGeT Brain can approach or exceed the accuracy of multi-atlas label-fusion segmentation (κ = 0.894, 0.815, and 0.895 for the striatum, globus pallidus, and thalamus, respectively). Copyright © 2012 Wiley Periodicals, Inc.
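    The voxel-by-voxel label-voting fusion and the Dice kappa overlap metric quoted above can be sketched as follows (an illustrative implementation on toy 2×2 "volumes", not the MAGeT Brain code):

```python
import numpy as np

def majority_vote(label_maps):
    """Fuse candidate segmentations by voxel-wise plurality voting.
    `label_maps` is a list of equally shaped integer label arrays;
    ties resolve to the lowest label index."""
    stacked = np.stack(label_maps)                    # (n_templates, *volume_shape)
    n_labels = int(stacked.max()) + 1
    # For each voxel, count votes per label, then pick the most-voted label.
    counts = np.stack([(stacked == l).sum(axis=0) for l in range(n_labels)])
    return counts.argmax(axis=0)

def dice(a, b, label=1):
    """Dice overlap (kappa) between two segmentations for a given label."""
    a, b = (a == label), (b == label)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Three hypothetical candidate segmentations of the same tiny volume.
seg1 = np.array([[0, 1], [1, 1]])
seg2 = np.array([[0, 1], [0, 1]])
seg3 = np.array([[1, 1], [1, 1]])
fused = majority_vote([seg1, seg2, seg3])
```

Each voxel takes the label most of the templates agree on, which is why fusion suppresses the independent registration errors of any single template.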

  11. Rocket Ejector Studies for Application to RBCC Engines: An Integrated Experimental/CFD Approach

    NASA Technical Reports Server (NTRS)

    Pal, S.; Merkle, C. L.; Anderson, W. E.; Santoro, R. J.

    1997-01-01

    Recent interest in low cost, reliable access to space has generated increased interest in advanced technology approaches to space transportation systems. A key to the success of such programs lies in the development of advanced propulsion systems capable of achieving the performance and operations goals required for the next generation of space vehicles. One extremely promising approach involves the combination of rocket and air-breathing engines into a rocket-based combined-cycle engine (RBCC). A key element of that engine is the rocket ejector which is utilized in the zero to Mach two operating regime. Studies of RBCC engine concepts are not new and studies dating back thirty years are well documented in the literature. However, studies focused on the rocket ejector mode of the RBCC cycle are lacking. The present investigation utilizes an integrated experimental and computational fluid dynamics (CFD) approach to examine critical rocket ejector performance issues. In particular, the development of a predictive methodology capable of performance prediction is a key objective in order to analyze thermal choking and its control, primary/secondary pressure matching considerations, and effects of nozzle expansion ratio. To achieve this objective, the present study emphasizes obtaining new data using advanced optical diagnostics such as Raman spectroscopy and CFD techniques to investigate mixing in the rocket ejector mode. A new research facility for the study of the rocket ejector mode is described along with the diagnostic approaches to be used. The CFD modeling approach is also described along with preliminary CFD predictions obtained to date.

  12. A multi-temporal analysis approach for land cover mapping in support of nuclear incident response

    NASA Astrophysics Data System (ADS)

    Sah, Shagan; van Aardt, Jan A. N.; McKeown, Donald M.; Messinger, David W.

    2012-06-01

    Remote sensing can be used to rapidly generate land use maps for assisting emergency response personnel with resource deployment decisions and impact assessments. In this study we focus on constructing accurate land cover maps to map the impacted area in the case of a nuclear material release. The proposed methodology involves integration of results from two different approaches to increase classification accuracy. The data used included RapidEye scenes over Nine Mile Point Nuclear Power Station (Oswego, NY). The first step was building a coarse-scale land cover map from freely available, high temporal resolution, MODIS data using a time-series approach. In the case of a nuclear accident, high spatial resolution commercial satellites such as RapidEye or IKONOS can acquire images of the affected area. Land use maps from the two image sources were integrated using a probability-based approach. Classification results were obtained for four land classes - forest, urban, water and vegetation - using Euclidean and Mahalanobis distances as metrics. Despite the coarse resolution of MODIS pixels, acceptable accuracies were obtained using time series features. The overall accuracies using the fusion based approach were in the neighborhood of 80%, when compared with GIS data sets from New York State. The classifications were augmented using this fused approach, with few supplementary advantages such as correction for cloud cover and independence from time of year. We concluded that this method would generate highly accurate land maps, using coarse spatial resolution time series satellite imagery and a single date, high spatial resolution, multi-spectral image.
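    The minimum-distance classification step described above (Euclidean or Mahalanobis metrics over per-class statistics) can be sketched as follows; the two-band class statistics and pixel values are hypothetical, not the study's data:

```python
import numpy as np

def mahalanobis_classify(pixels, class_stats):
    """Assign each pixel feature vector to the class whose mean is nearest in
    Mahalanobis distance. `class_stats` maps class name -> (mean, covariance)."""
    names = list(class_stats)
    dists = []
    for name in names:
        mu, cov = class_stats[name]
        inv = np.linalg.inv(cov)
        d = pixels - mu
        # Quadratic form (x - mu)^T Sigma^{-1} (x - mu), one value per pixel row.
        dists.append(np.einsum('ij,jk,ik->i', d, inv, d))
    return [names[i] for i in np.argmin(np.stack(dists), axis=0)]

# Hypothetical 2-band class statistics (e.g., red and NIR reflectance).
stats = {
    "water":  (np.array([0.05, 0.02]), np.array([[0.01, 0.0], [0.0, 0.01]])),
    "forest": (np.array([0.10, 0.50]), np.array([[0.02, 0.0], [0.0, 0.05]])),
}
pixels = np.array([[0.06, 0.03], [0.12, 0.45]])
labels = mahalanobis_classify(pixels, stats)
```

Unlike plain Euclidean distance, the Mahalanobis form weights each band by the inverse class covariance, so classes with different spectral variability are compared on an equal footing.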

  13. A Rapid Embryonic Stem Cell-Based Mouse Model for B-cell Lymphomas Driven by Epstein-Barr Virus Protein LMP1.

    PubMed

    Ba, Zhaoqing; Meng, Fei-Long; Gostissa, Monica; Huang, Pei-Yi; Ke, Qiang; Wang, Zhe; Dao, Mai N; Fujiwara, Yuko; Rajewsky, Klaus; Zhang, Baochun; Alt, Frederick W

    2015-06-01

    The Epstein-Barr virus (EBV) latent membrane protein 1 (LMP1) contributes to oncogenic human B-cell transformation. Mouse B cells conditionally expressing LMP1 are not predisposed to B-cell malignancies, as LMP1-expressing B cells are eliminated by T cells. However, mice with conditional B-cell LMP1 expression and genetic elimination of α/β and γ/δ T cells ("CLT" mice) die early in association with B-cell lymphoproliferation and lymphomagenesis. Generation of CLT mice involves in-breeding multiple independently segregating alleles. Thus, although introduction of additional activating or knockout mutations into the CLT model is desirable for further B-cell expansion and immunosurveillance studies, doing such experiments by germline breeding is time-consuming, expensive, and sometimes unfeasible. To generate a more tractable model, we generated clonal CLT embryonic stem (ES) cells from CLT embryos and injected them into RAG2-deficient blastocysts to generate chimeric mice, which, like germline CLT mice, harbor splenic CLT B cells and lack T cells. CLT chimeric mice generated by this RAG2-deficient blastocyst complementation ("RDBC") approach die rapidly in association with B-cell lymphoproliferation and lymphoma. Because CLT lymphomas routinely express the activation-induced cytidine deaminase (AID) antibody diversifier, we tested potential AID roles by eliminating the AID gene in CLT ES cells and testing them via RDBC. We found that CLT and AID-deficient CLT ES chimeras had indistinguishable phenotypes, showing that AID is not essential for LMP1-induced lymphomagenesis. Beyond expanding accessibility and utility of CLT mice as a cancer immunotherapy model, our studies provide a new approach for facilitating generation of genetically complex mouse cancer models. ©2015 American Association for Cancer Research.

  14. A Rapid Embryonic Stem Cell-Based Mouse Model for B-cell Lymphomas Driven by Epstein-Barr Virus Protein LMP1

    PubMed Central

    Ba, Zhaoqing; Meng, Fei-Long; Gostissa, Monica; Huang, Pei-Yi; Ke, Qiang; Wang, Zhe; Dao, Mai N.; Fujiwara, Yuko; Rajewsky, Klaus; Baochun, Zhang; Alt, Frederick W.

    2015-01-01

    The Epstein-Barr virus (EBV) latent membrane protein 1 (LMP1) contributes to oncogenic human B-cell transformation. Mouse B cells conditionally expressing LMP1 are not predisposed to B-cell malignancies, as LMP1-expressing B cells are eliminated by T cells. However, mice with conditional B-cell LMP1 expression and genetic elimination of α/β and γ/δ T cells (“CLT” mice) die early in association with B-cell lymphoproliferation and lymphomagenesis. Generation of CLT mice involves in-breeding multiple independently segregating alleles. Thus, while introduction of additional activating or knock-out mutations into the CLT model is desirable for further B-cell expansion and immunosurveillance studies, doing such experiments by germline breeding is time-consuming, expensive and sometimes unfeasible. To generate a more tractable model, we generated clonal CLT ES cells from CLT embryos and injected them into RAG2-deficient blastocysts to generate chimeric mice, which like germline CLT mice harbor splenic CLT B cells and lack T cells. CLT chimeric mice generated by this RAG2-deficient blastocyst complementation (“RDBC”) approach die rapidly in association with B-cell lymphoproliferation and lymphoma. As CLT lymphomas routinely express the Activation-Induced Cytidine Deaminase (AID) antibody diversifier, we tested potential AID roles by eliminating the AID gene in CLT ES cells and testing them via RDBC. We found that CLT and AID-deficient CLT ES chimeras had indistinguishable phenotypes, showing that AID is not essential for LMP1-induced lymphomagenesis. Beyond expanding accessibility and utility of CLT mice as a cancer immunotherapy model, our studies provide a new approach for facilitating generation of genetically complex mouse cancer models. PMID:25934172

  15. Evaluating the use of key performance indicators to evidence the patient experience.

    PubMed

    McCance, Tanya; Hastings, Jack; Dowler, Hilda

    2015-11-01

    To test eight person-centred key performance indicators and the feasibility of an appropriate measurement framework as an approach to evidencing the patient experience. The value of measuring the quality of patient care is undisputed in the international literature, however, the type of measures that can be used to generate data that is meaningful for practice continues to be debated. This paper offers a different perspective to the 'measurement' of the nursing and midwifery contribution to the patient experience. Fourth generation evaluation was the methodological approach used to evaluate the implementation of the key performance indicators and measurement framework across three participating organisations involving nine practice settings. Data were collected by repeated use of claims, concerns and issues with staff working across nine participating sites (n = 18) and the senior executives from the three partner organisations (n = 12). Data were collected during the facilitated sessions with stakeholders and analysed in conjunction with the data generated from the measurement framework. The data reveal the inherent value placed on the evidence generated from the implementation of the key performance indicators as reflected in the following themes: measuring what matters; evidencing the patient experience; engaging staff; a focus for improving practice; and articulating and demonstrating the positive contribution of nursing and midwifery. The implementation of the key performance indicators and the measurement framework has been effective in generating evidence that demonstrates the patient experience. The nature of the data generated not only privileges the patient voice but also offers feedback to nurses and midwives that can inform the development of person-centred cultures. The use of these indicators will produce evidence of patient experience that can be used by nurses and midwives to celebrate and further inform person-centred practice.
© 2015 John Wiley & Sons Ltd.

  16. Methods for the scientific study of discrimination and health: an ecosocial approach.

    PubMed

    Krieger, Nancy

    2012-05-01

    The scientific study of how discrimination harms health requires theoretically grounded methods. At issue is how discrimination, as one form of societal injustice, becomes embodied inequality and is manifested as health inequities. As clarified by ecosocial theory, methods must address the lived realities of discrimination as an exploitative and oppressive societal phenomenon operating at multiple levels and involving myriad pathways across both the life course and historical generations. An integrated embodied research approach hence must consider (1) the structural level-past and present de jure and de facto discrimination; (2) the individual level-issues of domains, nativity, and use of both explicit and implicit discrimination measures; and (3) how current research methods likely underestimate the impact of racism on health.

  17. Light, heat, action: neural control of fruit fly behaviour.

    PubMed

    Owald, David; Lin, Suewei; Waddell, Scott

    2015-09-19

    The fruit fly Drosophila melanogaster has emerged as a popular model to investigate fundamental principles of neural circuit operation. The sophisticated genetics and small brain permit a cellular resolution understanding of innate and learned behavioural processes. Relatively recent genetic and technical advances provide the means to specifically and reproducibly manipulate the function of many fly neurons with temporal resolution. The same cellular precision can also be exploited to express genetically encoded reporters of neural activity and cell-signalling pathways. Combining these approaches in living behaving animals has great potential to generate a holistic view of behavioural control that transcends the usual molecular, cellular and systems boundaries. In this review, we discuss these approaches with particular emphasis on the pioneering studies and those involving learning and memory.

  18. Multicriteria plan optimization in the hands of physicians: a pilot study in prostate cancer and brain tumors.

    PubMed

    Müller, Birgit S; Shih, Helen A; Efstathiou, Jason A; Bortfeld, Thomas; Craft, David

    2017-11-06

    The purpose of this study was to demonstrate the feasibility of physician driven planning in intensity modulated radiotherapy (IMRT) with a multicriteria optimization (MCO) treatment planning system and template based plan optimization. Exploiting the full planning potential of MCO navigation, this alternative planning approach intends to improve planning efficiency and individual plan quality. Planning was retrospectively performed on 12 brain tumor and 10 post-prostatectomy prostate patients previously treated with MCO-IMRT. For each patient, physicians were provided with a template-based generated Pareto surface of optimal plans to navigate, using the beam angles from the original clinical plans. We compared physician generated plans to clinically delivered plans (created by dosimetrists) in terms of dosimetric differences, physician preferences and planning times. Plan qualities were similar, however physician generated and clinical plans differed in the prioritization of clinical goals. Physician derived prostate plans showed significantly better sparing of the high dose rectum and bladder regions (p(D1) < 0.05; D1: dose received by 1% of the corresponding structure). Physicians' brain tumor plans indicated higher doses for targets and brainstem (p(D1) < 0.05). Within blinded plan comparisons physicians preferred the clinical plans more often (brain: 6:3 out of 12, prostate: 2:6 out of 10) (not statistically significant). While times of physician involvement were comparable for prostate planning, the new workflow reduced the average involved time for brain cases by 30%. Planner times were reduced for all cases. Subjective benefits, such as a better understanding of planning situations, were observed by clinicians through the insight into plan optimization and experiencing dosimetric trade-offs. We introduce physician driven planning with MCO for brain and prostate tumors as a feasible planning workflow. 
The proposed approach standardizes the planning process by utilizing site specific templates and integrates physicians more tightly into treatment planning. Physicians' navigated plan qualities were comparable to the clinical plans. Given the reduction of planning time of the planner and the equal or lower planning time of physicians, this approach has the potential to improve departmental efficiencies.

  19. Using Bond Graphs for Articulated, Flexible Multi-bodies, Sensors, Actuators, and Controllers with Application to the International Space Station

    NASA Technical Reports Server (NTRS)

    Montgomery, Raymond C.; Granda, Jose J.

    2003-01-01

    Conceptually, modeling of flexible, multi-body systems involves a formulation as a set of time-dependent partial differential equations. However, for practical, engineering purposes, this modeling is usually done using the method of Finite Elements, which approximates the set of partial differential equations, thus generalizing the approach to all continuous media. This research investigates the links between the Bond Graph method and the classical methods used to develop system models and advocates the Bond Graph Methodology and current bond graph tools as alternate approaches that will lead to a quick and precise understanding of a flexible multi-body system under automatic control. For long-endurance, complex spacecraft, because of articulation and mission evolution, the model of the physical system may change frequently. So a method of automatic generation and regeneration of system models that does not lead to implicit equations, as does the Lagrange equation approach, is desirable. The bond graph method has been shown to be amenable to automatic generation of equations with appropriate consideration of causality. Indeed human-interactive software now exists that automatically generates both symbolic and numeric system models and evaluates causality as the user develops the model, e.g. the CAMP-G software package. In this paper the CAMP-G package is used to generate a bond graph model of the International Space Station (ISS) at an early stage in its assembly, Zvezda. The ISS is an ideal example because it is a collection of bodies that are articulated, many of which are highly flexible. Also many reaction jets are used to control translation and attitude, and many electric motors are used to articulate appendages, which consist of photovoltaic arrays and composite assemblies. The Zvezda bond graph model is compared to an existing model, which was generated by the NASA Johnson Space Center during the Verification and Analysis Cycle of Zvezda.

  20. Germline and reproductive tract effects intensify in male mice with successive generations of estrogenic exposure

    PubMed Central

    Horan, Tegan S.; Marre, Alyssa; Hassold, Terry; Lawson, Crystal; Hunt, Patricia A.

    2017-01-01

    The hypothesis that developmental estrogenic exposure induces a constellation of male reproductive tract abnormalities is supported by experimental and human evidence. Experimental data also suggest that some induced effects persist in descendants of exposed males. These multi- and transgenerational effects are assumed to result from epigenetic changes to the germline, but few studies have directly analyzed germ cells. Typically, studies of transgenerational effects have involved exposing one generation and monitoring effects in subsequent unexposed generations. This approach, however, has limited human relevance, since both the number and volume of estrogenic contaminants have increased steadily over time, intensifying rather than reducing or eliminating exposure. Using an outbred CD-1 mouse model, and a sensitive and quantitative marker of germline development, meiotic recombination, we tested the effect of successive generations of exposure on the testis. We targeted the germline during a narrow, perinatal window using oral exposure to the synthetic estrogen, ethinyl estradiol. A complex three generation exposure protocol allowed us to compare the effects of individual, paternal, and grandpaternal (ancestral) exposure. Our data indicate that multiple generations of exposure not only exacerbate germ cell exposure effects, but also increase the incidence and severity of reproductive tract abnormalities. Taken together, our data suggest that male sensitivity to environmental estrogens is increased by successive generations of exposure. PMID:28727826

  1. A multi-temporal fusion-based approach for land cover mapping in support of nuclear incident response

    NASA Astrophysics Data System (ADS)

    Sah, Shagan

    An increasingly important application of remote sensing is to provide decision support during emergency response and disaster management efforts. Land cover maps constitute one such useful application product during disaster events; if generated rapidly after any disaster, such map products can contribute to the efficacy of the response effort. In light of recent nuclear incidents, e.g., after the earthquake/tsunami in Japan (2011), our research focuses on constructing rapid and accurate land cover maps of the impacted area in case of an accidental nuclear release. The methodology involves integration of results from two different approaches, namely coarse spatial resolution multi-temporal and fine spatial resolution imagery, to increase classification accuracy. Although advanced methods have been developed for classification using high spatial or temporal resolution imagery, only a limited amount of work has been done on fusion of these two remote sensing approaches. The presented methodology thus involves integration of classification results from two different remote sensing modalities in order to improve classification accuracy. The data used included RapidEye and MODIS scenes over the Nine Mile Point Nuclear Power Station in Oswego (New York, USA). The first step in the process was the construction of land cover maps from freely available, high temporal resolution, low spatial resolution MODIS imagery using a time-series approach. We used the variability in the temporal signatures among different land cover classes for classification. The time series-specific features were defined by various physical properties of a pixel, such as variation in vegetation cover and water content over time. The pixels were classified into four land cover classes - forest, urban, water, and vegetation - using Euclidean and Mahalanobis distance metrics. 
On the other hand, a high spatial resolution commercial satellite, such as RapidEye, can be tasked to capture images over the affected area in the case of a nuclear event. This imagery served as a second source of data to augment results from the time series approach. The classifications from the two approaches were integrated using an a posteriori probability-based fusion approach. This was done by establishing a relationship between the classes, obtained after classification of the two data sources. Despite the coarse spatial resolution of MODIS pixels, acceptable accuracies were obtained using time series features. The overall accuracies using the fusion-based approach were in the neighborhood of 80%, when compared with GIS data sets from New York State. This fusion thus contributed to classification accuracy refinement, with a few additional advantages, such as correction for cloud cover and providing for an approach that is robust against point-in-time seasonal anomalies, due to the inclusion of multi-temporal data. We concluded that this approach is capable of generating land cover maps of acceptable accuracy and rapid turnaround, which in turn can yield reliable estimates of crop acreage of a region. The final algorithm is part of an automated software tool, which can be used by emergency response personnel to generate a nuclear ingestion pathway information product within a few hours of data collection.
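The distance-based pixel labeling described above can be sketched as follows. This is a minimal illustration that assumes a diagonal covariance per class (the full Mahalanobis metric uses the complete covariance matrix); the class statistics and feature values are invented for the example:

```python
def mahalanobis_classify(pixel_ts, class_stats):
    """Assign a pixel's temporal signature to the nearest class.

    class_stats maps class name -> (mean vector, per-feature variances);
    with unit variances this reduces to the Euclidean metric."""
    best, best_d = None, float("inf")
    for name, (mu, var) in class_stats.items():
        # squared Mahalanobis distance under a diagonal covariance
        d2 = sum((x - m) ** 2 / v for x, m, v in zip(pixel_ts, mu, var))
        if d2 < best_d:
            best, best_d = name, d2
    return best

# hypothetical temporal signatures (e.g. three seasonal vegetation-index samples)
stats = {
    "forest": ([0.70, 0.80, 0.60], [0.01, 0.01, 0.01]),
    "water":  ([0.10, 0.10, 0.10], [0.01, 0.01, 0.01]),
}
print(mahalanobis_classify([0.68, 0.79, 0.61], stats))  # -> forest
```

In the full workflow each MODIS pixel's time-series feature vector would be scored this way against all four class signatures.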

  2. Business Modeling to Implement an eHealth Portal for Infection Control: A Reflection on Co-Creation With Stakeholders.

    PubMed

    van Limburg, Maarten; Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette

    2015-08-13

It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders to make technology reflect their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step while developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. This paper demonstrates the potential of several stakeholder-oriented analysis methods and their practical application, using Infectionmanager as an example case. In this paper, we aim to demonstrate how business modeling, with the focus on stakeholder involvement, is used to co-create an eHealth implementation. We divided business modeling into 4 main research steps. As part of stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analyses, we performed "basic stakeholder analysis," stakeholder salience, and ranking/analytic hierarchy process (Step 2). For value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic and multidisciplinary, co-creative approach for implementing eHealth.
Business modeling thus becomes an active part of the entire eHealth development process and establishes an early focus on implementation, in which stakeholders help to co-create the basis necessary for satisfactory success and uptake of the eHealth technology.

  3. Signature of genetic associations in oral cancer.

    PubMed

    Sharma, Vishwas; Nandan, Amrita; Sharma, Amitesh Kumar; Singh, Harpreet; Bharadwaj, Mausumi; Sinha, Dhirendra Narain; Mehrotra, Ravi

    2017-10-01

Oral cancer etiology is complex and controlled by multi-factorial events, including genetic events. Candidate gene studies, genome-wide association studies, and next-generation sequencing have identified various chromosomal loci associated with oral cancer. No available review gives a comprehensive picture of the genetic loci identified as associated with oral cancer by candidate gene study-based, genome-wide association study-based, and next-generation sequencing-based approaches. A systematic literature search was performed in the PubMed database to identify the loci associated with oral cancer exclusively by candidate gene study-based, genome-wide association study-based, and next-generation sequencing-based study approaches. The information on loci associated with oral cancer is made available online through the resource "ORNATE." Next, screening was performed for the loci validated by both candidate gene studies and the next-generation sequencing approach, or by two independent studies within the candidate gene study or next-generation sequencing approaches. A total of 264 loci were identified to be associated with oral cancer by candidate gene studies, genome-wide association studies, and next-generation sequencing approaches. In total, 28 loci, that is, 14q32.33 (AKT1), 5q22.2 (APC), 11q22.3 (ATM), 2q33.1 (CASP8), 11q13.3 (CCND1), 16q22.1 (CDH1), 9p21.3 (CDKN2A), 1q31.1 (COX-2), 7p11.2 (EGFR), 22q13.2 (EP300), 4q35.2 (FAT1), 4q31.3 (FBXW7), 4p16.3 (FGFR3), 1p13.3 (GSTM1-GSTT1), 11q13.2 (GSTP1), 11p15.5 (H-RAS), 3p25.3 (hOGG1), 1q32.1 (IL-10), 4q13.3 (IL-8), 12p12.1 (KRAS), 12q15 (MDM2), 12q13.12 (MLL2), 9q34.3 (NOTCH1), 17p13.1 (p53), 3q26.32 (PIK3CA), 10q23.31 (PTEN), 13q14.2 (RB1), and 5q14.2 (XRCC4), were validated to be associated with oral cancer. "ORNATE" gives a snapshot of the genetic loci associated with oral cancer.
These 28 loci were validated as linked to oral cancer; further fine-mapping, followed by gene-by-gene and gene-environment interaction studies, is needed to confirm their involvement in modifying oral cancer.

  4. Sustainable urban systems: Co-design and framing for transformation.

    PubMed

    Webb, Robert; Bai, Xuemei; Smith, Mark Stafford; Costanza, Robert; Griggs, David; Moglia, Magnus; Neuman, Michael; Newman, Peter; Newton, Peter; Norman, Barbara; Ryan, Chris; Schandl, Heinz; Steffen, Will; Tapper, Nigel; Thomson, Giles

    2018-02-01

    Rapid urbanisation generates risks and opportunities for sustainable development. Urban policy and decision makers are challenged by the complexity of cities as social-ecological-technical systems. Consequently there is an increasing need for collaborative knowledge development that supports a whole-of-system view, and transformational change at multiple scales. Such holistic urban approaches are rare in practice. A co-design process involving researchers, practitioners and other stakeholders, has progressed such an approach in the Australian context, aiming to also contribute to international knowledge development and sharing. This process has generated three outputs: (1) a shared framework to support more systematic knowledge development and use, (2) identification of barriers that create a gap between stated urban goals and actual practice, and (3) identification of strategic focal areas to address this gap. Developing integrated strategies at broader urban scales is seen as the most pressing need. The knowledge framework adopts a systems perspective that incorporates the many urban trade-offs and synergies revealed by a systems view. Broader implications are drawn for policy and decision makers, for researchers and for a shared forward agenda.

  5. Combined Simulated Annealing and Genetic Algorithm Approach to Bus Network Design

    NASA Astrophysics Data System (ADS)

    Liu, Li; Olszewski, Piotr; Goh, Pong-Chai

A new method, a combined simulated annealing (SA) and genetic algorithm (GA) approach, is proposed to solve the problem of bus route design and frequency setting for a given road network with fixed bus stop locations and fixed travel demand. The method involves two steps: a set of candidate routes is generated first, and then the best subset of these routes is selected by the combined SA and GA procedure. SA is the main process searching for a better solution that minimizes the total system cost, comprising user and operator costs. GA is used as a sub-process to generate new solutions. Bus demand assignment on two alternative paths is performed at the solution evaluation stage. The method was implemented on four theoretical grid networks of different sizes and a benchmark network. Several GA operators (crossover and mutation) were utilized and tested for their effectiveness. The results show that the proposed method can efficiently converge to the optimal solution on a small network, but computation time increases significantly with network size. The method can also be applied to other transport operation management problems.
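The two-step scheme above can be sketched in miniature: SA drives acceptance of candidate route subsets, while a GA-style mutation serves as the neighbor-generating sub-process. The cost function here is a toy stand-in for the user-plus-operator system cost, and all parameter values are illustrative assumptions:

```python
import math
import random

def combined_sa_ga(n_routes, cost, iters=2000, t0=1.0, alpha=0.999, seed=1):
    """SA main loop over subsets of candidate routes; a GA bit-flip
    mutation generates neighbor solutions (the sub-process)."""
    rng = random.Random(seed)
    cur = [rng.random() < 0.5 for _ in range(n_routes)]
    cur_c = cost(cur)
    best, best_c, t = cur[:], cur_c, t0
    for _ in range(iters):
        nxt = cur[:]
        nxt[rng.randrange(n_routes)] ^= True  # mutation: toggle one route
        nxt_c = cost(nxt)
        # accept improvements always, worse moves with Boltzmann probability
        if nxt_c < cur_c or rng.random() < math.exp((cur_c - nxt_c) / t):
            cur, cur_c = nxt, nxt_c
        if cur_c < best_c:
            best, best_c = cur[:], cur_c
        t *= alpha  # geometric cooling
    return best, best_c

# toy cost: the "optimal" network keeps exactly routes 0 and 2
toy_cost = lambda s: sum(1 for i, on in enumerate(s) if on != (i in (0, 2)))
sol, c = combined_sa_ga(6, toy_cost)
print(sol, c)
```

In the paper's setting the cost evaluation would itself run the demand-assignment step rather than this closed-form toy.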

  6. A spatial decision support system (SDSS) for sustainable tourism planning in Cameron Highlands, Malaysia

    NASA Astrophysics Data System (ADS)

    Aminu, M.; Matori, A. N.; Yusof, K. W.

    2014-02-01

The study describes a methodological approach based on the integrated use of a Geographic Information System (GIS) and the Analytic Network Process (ANP) of Multi Criteria Evaluation (MCE) to determine nature conservation and tourism development priorities among the highland areas. A set of criteria and indicators was defined to evaluate the highlands' biodiversity conservation and tourism development. The pairwise comparison technique was used to support the solution of a decision problem by evaluating possible alternatives from different perspectives. After the weights had been derived from the pairwise comparisons, the next step was to compute the unweighted supermatrix, the weighted supermatrix, and the limit matrix. The limit matrix was normalized to obtain the priorities, and the results were transferred into the GIS environment. The elements evaluated and ranked were represented by criterion maps. Map layers reflecting the opinions of the different experts involved were summed using the weighted overlay approach of GIS. Subsequently, sustainable tourism development scenarios were generated. The generation of scenarios highlighted the critical issues of the decision problem because it allows one to gradually narrow the problem down.
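For the pairwise-comparison step, priority weights can be derived from a reciprocal judgment matrix. The sketch below uses the common column-normalization approximation to the principal-eigenvector method used in ANP/AHP; the three-criterion judgment matrix is invented for illustration:

```python
def priority_weights(M):
    """Approximate priority weights of a reciprocal pairwise comparison
    matrix by averaging its normalized columns (an approximation to the
    principal-eigenvector method)."""
    n = len(M)
    col_sums = [sum(M[i][j] for i in range(n)) for j in range(n)]
    return [sum(M[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]

# hypothetical judgments: criterion A is 3x as important as B, 5x as C
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = priority_weights(M)
print([round(x, 3) for x in w])  # weights sum to 1, A ranked highest
```

In the full ANP these local weight vectors would populate the unweighted supermatrix before the limit matrix is computed.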

  7. Generation algorithm of craniofacial structure contour in cephalometric images

    NASA Astrophysics Data System (ADS)

    Mondal, Tanmoy; Jain, Ashish; Sardana, H. K.

    2010-02-01

Anatomical structure tracing on cephalograms is a significant way to obtain a cephalometric analysis. Computerized cephalometric analysis involves both manual and automatic approaches; the manual approach is limited in accuracy and repeatability. In this paper we have attempted to develop and test a novel method for automatic localization of craniofacial structure based on the edges detected in the region of interest. According to the grey-scale features of the different regions of cephalometric images, an algorithm for obtaining the tissue contour is put forward. Using edge detection with a specific threshold, an improved bidirectional contour tracing approach is proposed: after interactive selection of the starting edge pixels, the tracking process repeatedly searches for an edge pixel in the neighborhood of the previously found edge pixel to segment the image, and the craniofacial structures are then obtained. The effectiveness of the algorithm is demonstrated by the preliminary experimental results obtained with the proposed method.
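The neighborhood-search idea behind the tracing step can be sketched as follows. This is a minimal single-direction version (the paper's approach traces bidirectionally from the selected starting pixels), on an invented binary edge map:

```python
def trace_contour(edge, start):
    """Follow connected edge pixels from `start` in a binary edge map,
    repeatedly searching the 8-neighborhood of the last accepted pixel."""
    h, w = len(edge), len(edge[0])
    contour, visited = [start], {start}
    while True:
        y, x = contour[-1]
        nxt = None
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (0 <= ny < h and 0 <= nx < w and edge[ny][nx]
                        and (ny, nx) not in visited):
                    nxt = (ny, nx)
                    break
            if nxt:
                break
        if nxt is None:
            return contour  # no unvisited edge neighbor left
        visited.add(nxt)
        contour.append(nxt)

# toy edge map with a single connected contour
edge_map = [[0, 0, 0, 0],
            [1, 1, 1, 0],
            [0, 0, 1, 0],
            [0, 0, 1, 0]]
print(trace_contour(edge_map, (1, 0)))  # -> [(1, 0), (1, 1), (1, 2), (2, 2), (3, 2)]
```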

  8. Multivariate longitudinal data analysis with mixed effects hidden Markov models.

    PubMed

    Raffa, Jesse D; Dubin, Joel A

    2015-09-01

    Multiple longitudinal responses are often collected as a means to capture relevant features of the true outcome of interest, which is often hidden and not directly measurable. We outline an approach which models these multivariate longitudinal responses as generated from a hidden disease process. We propose a class of models which uses a hidden Markov model with separate but correlated random effects between multiple longitudinal responses. This approach was motivated by a smoking cessation clinical trial, where a bivariate longitudinal response involving both a continuous and a binomial response was collected for each participant to monitor smoking behavior. A Bayesian method using Markov chain Monte Carlo is used. Comparison of separate univariate response models to the bivariate response models was undertaken. Our methods are demonstrated on the smoking cessation clinical trial dataset, and properties of our approach are examined through extensive simulation studies. © 2015, The International Biometric Society.

  9. ATC simulation of helicopter IFR approaches into major terminal areas using RNAV, MLS, and CDTI

    NASA Technical Reports Server (NTRS)

    Tobias, L.; Lee, H. Q.; Peach, L. L.; Willett, F. M., Jr.; Obrien, P. J.

    1981-01-01

    The introduction of independent helicopter IFR routes at hub airports was investigated in a real time air traffic control system simulation involving a piloted helicopter simulator, computer generated air traffic, and air traffic controllers. The helicopter simulator was equipped to fly area navigation (RNAV) routes and microwave landing system approaches. Problems studied included: (1) pilot acceptance of the approach procedure and tracking accuracy; (2) ATC procedures for handling a mix of helicopter and fixed wing traffic; and (3) utility of the cockpit display of traffic information (CDTI) for the helicopter in the hub airport environment. Results indicate that the helicopter routes were acceptable to the subject pilots and were noninterfering with fixed wing traffic. Merging and spacing maneuvers using CDTI were successfully carried out by the pilots, but controllers had some reservations concerning the acceptability of the CDTI procedures.

  10. Culturally Tailored Depression/Suicide Prevention in Latino Youth: Community Perspectives.

    PubMed

    Ford-Paz, Rebecca E; Reinhard, Christine; Kuebbeler, Andrea; Contreras, Richard; Sánchez, Bernadette

    2015-10-01

    Latino adolescents are at elevated risk for depression and suicide compared to other ethnic groups. Project goals were to gain insight from community leaders about depression risk factors particular to Latino adolescents and generate innovative suggestions to improve cultural relevance of prevention interventions. This project utilized a CBPR approach to enhance cultural relevance, acceptability, and utility of the findings and subsequent program development. Two focus groups of youth and youth-involved Latino community leaders (n = 18) yielded three overarching themes crucial to a culturally tailored depression prevention intervention: (1) utilize a multipronged and sustainable intervention approach, (2) raise awareness about depression in culturally meaningful ways, and (3) promote Latino youth's social connection and cultural enrichment activities. Findings suggest that both adaptation of existing prevention programs and development of hybrid approaches may be necessary to reduce depression/suicide disparities for Latino youth. One such hybrid program informed by community stakeholders is described.

  11. A review of active learning approaches to experimental design for uncovering biological networks

    PubMed Central

    2017-01-01

    Various types of biological knowledge describe networks of interactions among elementary entities. For example, transcriptional regulatory networks consist of interactions among proteins and genes. Current knowledge about the exact structure of such networks is highly incomplete, and laboratory experiments that manipulate the entities involved are conducted to test hypotheses about these networks. In recent years, various automated approaches to experiment selection have been proposed. Many of these approaches can be characterized as active machine learning algorithms. Active learning is an iterative process in which a model is learned from data, hypotheses are generated from the model to propose informative experiments, and the experiments yield new data that is used to update the model. This review describes the various models, experiment selection strategies, validation techniques, and successful applications described in the literature; highlights common themes and notable distinctions among methods; and identifies likely directions of future research and open problems in the area. PMID:28570593
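The fit-select-query loop described above can be sketched generically. In the toy instantiation below the "network" is just a hidden threshold, the model is the interval of thresholds consistent with the data, and querying the most uncertain candidate reduces to bisection; all names and numbers are illustrative:

```python
def active_learning(candidates, oracle, fit, select, rounds):
    """Generic loop: learn a model, propose the most informative
    experiment, run it, and fold the result back into the data."""
    data = []
    for _ in range(rounds):
        model = fit(data)
        queried = {x for x, _ in data}
        x = select(model, [c for c in candidates if c not in queried])
        data.append((x, oracle(x)))
    return fit(data), data

def fit(data):
    # model = interval [lo, hi] of thresholds consistent with the labels
    lo, hi = 0, 16
    for x, y in data:
        if y:
            hi = min(hi, x)      # label 1 means threshold <= x
        else:
            lo = max(lo, x + 1)  # label 0 means threshold > x
    return (lo, hi)

def select(model, pool):
    # query the candidate closest to the middle of the uncertain interval
    lo, hi = model
    return min(pool, key=lambda c: abs(c - (lo + hi) // 2))

oracle = lambda x: int(x >= 7)  # hidden ground truth to be uncovered
model, data = active_learning(range(16), oracle, fit, select, rounds=4)
print(model)  # interval collapses onto the true threshold
```

Real systems in this literature replace the interval model with a network model and the bisection rule with an information-theoretic experiment score, but the loop structure is the same.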

  12. Translating science into the next generation meat quality program for Australian lamb.

    PubMed

    Pethick, D W; Ball, A J; Banks, R G; Gardner, G E; Rowe, J B; Jacob, R H

    2014-02-01

This paper introduces a series of papers in the form of a special edition that reports phenotypic analyses done in parallel with genotypic analyses for the Australian Sheep Industry Cooperative Research Centre (Sheep CRC) using data generated from the information nucleus flock (INF). This has allowed new knowledge to be gained of the genetic, environmental and management factors that affect the carcase and eating quality, visual appeal, odour and health attributes of Australian lamb meat. The research described involved close collaboration with commercial partners across the supply chain, in the sire breeding as well as the meat processing industries. This approach has enabled timely delivery and adoption of research results to industry in an unprecedented way and provides a good model for future research. © 2013.

  13. NASA's Involvement in Technology Development and Transfer: The Ohio Hybrid Bus Project

    NASA Technical Reports Server (NTRS)

    Viterna, Larry A.

    1997-01-01

    A government and industry cooperative is using advanced power technology in a city transit bus that will offer double the fuel economy, and reduce emissions to one tenth of government standards. The heart of the vehicle's power system is a natural gas fueled generator unit. Power from both the generator and an advanced energy storage system is provided to a variable speed electric motor attached to the rear drive axle. A unique aspect of the vehicle's design is its use of "super" capacitors for recovery of energy during braking. This is the largest vehicle ever built using this advanced energy recovery technology. This paper describes the project goals and approach, results of its system performance modeling, and the status of the development team's effort.

  14. Studies of uncontrolled air traffic patterns, phase 1

    NASA Technical Reports Server (NTRS)

    Baxa, E. G., Jr.; Scharf, L. L.; Ruedger, W. H.; Modi, J. A.; Wheelock, S. L.; Davis, C. M.

    1975-01-01

    The general aviation air traffic flow patterns at uncontrolled airports are investigated and analyzed and traffic pattern concepts are developed to minimize the midair collision hazard in uncontrolled airspace. An analytical approach to evaluate midair collision hazard probability as a function of traffic densities is established which is basically independent of path structure. Two methods of generating space-time interrelationships between terminal area aircraft are presented; one is a deterministic model to generate pseudorandom aircraft tracks, the other is a statistical model in preliminary form. Some hazard measures are presented for selected traffic densities. It is concluded that the probability of encountering a hazard should be minimized independently of any other considerations and that the number of encounters involving visible-avoidable aircraft should be maximized at the expense of encounters in other categories.

  15. Impact of Discrete Corrections in a Modular Approach for Trajectory Generation in Quadruped Robots

    NASA Astrophysics Data System (ADS)

    Pinto, Carla M. A.; Santos, Cristina P.; Rocha, Diana; Matos, Vítor

    2011-09-01

Online generation of trajectories in robots is a very complex task that involves the combination of different types of movements, i.e., distinct motor primitives. The latter are used to model complex behaviors in robots, such as locomotion on irregular terrain and obstacle avoidance. In this paper, we consider two motor primitives: rhythmic and discrete. We study the effect on the robots' gaits of superimposing the two motor primitives, considering two distinct types of coupling. Additionally, we simulate two scenarios, in which the discrete primitive is inserted either in all four limbs or in ipsilateral pairs of limbs. Numerical results show that the amplitude and frequency of the periodic solutions, corresponding to the gaits trot and pace, are almost constant for diffusive and synaptic couplings.
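The superposition of the two primitives can be illustrated with a simple generator: a sinusoid stands in for the rhythmic primitive and a first-order approach to an offset goal stands in for the discrete correction. This is only a schematic of the idea; the paper's primitives are produced by coupled nonlinear oscillators, and every parameter below is an assumption:

```python
import math

def trajectory(t, amp=1.0, freq=2.0, goal=0.5, tau=0.3, t_on=2.0):
    """Rhythmic primitive (sine of amplitude `amp`, frequency `freq` rad/s)
    plus a discrete primitive switched on at t_on that relaxes toward
    `goal` with time constant `tau`."""
    rhythmic = amp * math.sin(freq * t)
    discrete = goal * (1.0 - math.exp(-(t - t_on) / tau)) if t >= t_on else 0.0
    return rhythmic + discrete

# before t_on the output is purely rhythmic; afterwards the oscillation
# continues with the same amplitude and frequency around the new offset
samples = [trajectory(0.1 * k) for k in range(100)]
```

The observation that amplitude and frequency stay almost constant under the superposition corresponds here to the sine term being unchanged by the added offset.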

  16. Enhanced electrohydrodynamic force generation in a two-stroke cycle dielectric-barrier-discharge plasma actuator

    NASA Astrophysics Data System (ADS)

    Sato, Shintaro; Takahashi, Masayuki; Ohnishi, Naofumi

    2017-05-01

An approach for electrohydrodynamic (EHD) force production is proposed with a focus on a charge cycle on a dielectric surface. The cycle, consisting of positive-charging and neutralizing strokes, is completely different from the conventional methodology, which involves a negative-charging stroke, in that the dielectric surface charge remains constantly positive. The two-stroke charge cycle is realized by applying a DC voltage combined with repetitive pulses. Simulation results indicate that the negative pulse eliminates the surface charge accumulated during the constant-voltage phase, resulting in repetitive EHD force generation. The time-averaged EHD force increases almost linearly with increasing repetitive pulse frequency and becomes one order of magnitude larger than that driven by a sinusoidal voltage with the same peak-to-peak voltage.

  17. In Vivo and In Situ Detection of Macromolecular Free Radicals Using Immuno-Spin Trapping and Molecular Magnetic Resonance Imaging.

    PubMed

    Towner, Rheal A; Smith, Nataliya

    2018-05-20

In vivo free radical imaging in preclinical models of disease has become a reality. Free radicals have traditionally been characterized by electron spin resonance (ESR) or electron paramagnetic resonance (EPR) spectroscopy coupled with spin trapping. The disadvantage of the ESR/EPR approach is that spin adducts are short-lived due to biological reductive and/or oxidative processes. Immuno-spin trapping (IST) involves the use of an antibody that recognizes macromolecular 5,5-dimethyl-pyrroline-N-oxide (DMPO) spin adducts (anti-DMPO antibody), regardless of the oxidative/reductive state of trapped radical adducts. Recent Advances: The IST approach has been extended to an in vivo application that combines IST with molecular magnetic resonance imaging (mMRI). This combined IST-mMRI approach involves the use of a spin-trapping agent, DMPO, to trap free radicals in disease models, and administration of an mMRI probe, an anti-DMPO probe, which combines an antibody against DMPO-radical adducts and an MRI contrast agent, resulting in targeted free radical adduct detection. The combined IST-mMRI approach has been used in several rodent disease models, including diabetes, amyotrophic lateral sclerosis (ALS), gliomas, and septic encephalopathy. The advantage of this approach is that heterogeneous levels of trapped free radicals can be detected directly in vivo and in situ to pinpoint where free radicals are formed in different tissues. The approach can also be used to assess therapeutic agents that either scavenge or generate free radicals. Smaller probe constructs and radical identification approaches are being considered. The focus of this review is on the different applications that have been studied, advantages and limitations, and future directions. Antioxid. Redox Signal. 28, 1404-1415.

  18. Role of dietary bioactive natural products in estrogen receptor-positive breast cancer

    PubMed Central

    Bak, Min Ji; Das Gupta, Soumyasri; Wahler, Joseph; Suh, Nanjoo

    2016-01-01

    Estrogen receptor (ER)-positive breast cancer, including luminal-A and -B, is the most common type of breast cancer. Extended exposure to estrogen is associated with an increased risk of breast cancer. Both ER-dependent and ER-independent mechanisms have been implicated in estrogen-mediated carcinogenesis. The ER-dependent pathway involves cell growth and proliferation triggered by the binding of estrogen to the ER. The ER-independent mechanisms depend on the metabolism of estrogen to generate genotoxic metabolites, free radicals and reactive oxygen species to induce breast cancer. A better understanding of the mechanisms that drive ER-positive breast cancer will help optimize targeted approaches to prevent or treat breast cancer. A growing emphasis is being placed on alternative medicine and dietary approaches toward the prevention and treatment of breast cancer. Many natural products and bioactive compounds found in foods have been shown to inhibit breast carcinogenesis via inhibition of estrogen induced oxidative stress as well as ER signaling. This review summarizes the role of bioactive natural products that are involved in the prevention and treatment of estrogen-related and ER-positive breast cancer. PMID:27016037

  19. Factors associated with involvement in nonmetropolitan LGBTQ organizations: Proximity? Generativity? Minority stress? Social location?

    PubMed

    Paceley, Megan S; Oswald, Ramona Faith; Hardesty, Jennifer L

    2014-01-01

    Little is known about involvement in LGBTQ organizations. Factors associated with involvement in nonmetropolitan LGBTQ organizations were examined using logistic regression and survey data from 426 LGBTQ individuals residing in a nonmetropolitan region. Involvement was examined in five types of organizations (professional, social/recreational, religious, political, and community center/charity). The same model testing proximity, generativity, minority stress, and social location hypotheses was repeated for each organization type. Results demonstrate that the generativity hypothesis is most strongly supported. Indeed, emotional attachment to the LGBTQ community significantly increased the odds of involvement in every type of organization. However, the factors associated with involvement otherwise differed by organization type. Implications for organizational leaders are discussed.

  20. Broadening the interface bandwidth in simulation based training

    NASA Technical Reports Server (NTRS)

    Somers, Larry E.

    1989-01-01

    Currently most computer based simulations rely exclusively on computer generated graphics to create the simulation. When training is involved, the method almost exclusively used to display information to the learner is text displayed on the cathode ray tube. MICROEXPERT Systems is concentrating on broadening the communications bandwidth between the computer and user by employing a novel approach to video image storage combined with sound and voice output. An expert system is used to combine and control the presentation of analog video, sound, and voice output with computer based graphics and text. Researchers are currently involved in the development of several graphics based user interfaces for NASA, the U.S. Army, and the U.S. Navy. Here, the focus is on the human factors considerations, software modules, and hardware components being used to develop these interfaces.

  1. Phobos-Grunt ; Russian Sample Return Mission

    NASA Astrophysics Data System (ADS)

    Marov, M.

As an important milestone in Mars exploration, the new-generation space vehicle "Phobos-Grunt" is planned to be launched by the Russian Aviation and Space Agency. The project is optimized around a Phobos sample return mission and follow-up missions targeted at bodies of the Main asteroid belt, near-Earth objects (NEOs), and short-period comets. The principal constraint is the use of the "Soyuz-Fregat" launcher rather than the "Proton" to accomplish these challenging goals. The vehicle design incorporates innovative SEP technology involving electrojet engines, which allowed us to increase the missions' energetic capabilities significantly, as well as highly autonomous on-board systems. The basic criteria underlying the "Phobos-Grunt" mission scenario, together with its scientific objectives and rationale, involving Mars observations during the vehicle's insertion into Mars orbit and the Phobos approach manoeuvres, are discussed, and an opportunity for international cooperation is suggested.

  2. Bottom-Up and Top-Down Mechanisms of General Anesthetics Modulate Different Dimensions of Consciousness

    PubMed Central

    Mashour, George A.; Hudetz, Anthony G.

    2017-01-01

    There has been controversy regarding the precise mechanisms of anesthetic-induced unconsciousness, with two salient approaches that have emerged within systems neuroscience. One prominent approach is the “bottom up” paradigm, which argues that anesthetics suppress consciousness by modulating sleep-wake nuclei and neural circuits in the brainstem and diencephalon that have evolved to control arousal states. Another approach is the “top-down” paradigm, which argues that anesthetics suppress consciousness by modulating the cortical and thalamocortical circuits involved in the integration of neural information. In this article, we synthesize these approaches by mapping bottom-up and top-down mechanisms of general anesthetics to two distinct but inter-related dimensions of consciousness: level and content. We show how this explains certain empirical observations regarding the diversity of anesthetic drug effects. We conclude with a more nuanced discussion of how levels and contents of consciousness interact to generate subjective experience and what this implies for the mechanisms of anesthetic-induced unconsciousness. PMID:28676745

  4. Arabic sign language recognition based on HOG descriptor

    NASA Astrophysics Data System (ADS)

    Ben Jmaa, Ahmed; Mahdi, Walid; Ben Jemaa, Yousra; Ben Hamadou, Abdelmajid

    2017-02-01

We present in this paper a new approach for Arabic sign language (ArSL) alphabet recognition using hand gesture analysis. This analysis consists in extracting histogram of oriented gradients (HOG) features from a hand image and then using them to generate SVM models, which are used to recognize the ArSL alphabet in real time from hand gestures captured by a Microsoft Kinect camera. Our approach involves three steps: (i) hand detection and localization using a Microsoft Kinect camera, (ii) hand segmentation, and (iii) feature extraction and ArSL alphabet recognition. On each input image, first obtained using a depth sensor, we apply our method based on hand anatomy to segment the hand and eliminate all erroneous pixels. This approach is invariant to scale, rotation, and translation of the hand. Experimental results show the effectiveness of our new approach and reveal that the proposed ArSL system is able to recognize the ArSL alphabet with an accuracy of 90.12%.
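The HOG building block can be sketched as an orientation histogram over one cell; real HOG descriptors add a grid of cells and block normalization before the features are fed to the SVM. The patch and bin count below are invented for illustration:

```python
import math

def hog_cell(patch, bins=9):
    """Gradient-orientation histogram (unsigned, 0-180 degrees) for one
    cell of a grayscale patch, weighted by gradient magnitude."""
    h, w = len(patch), len(patch[0])
    hist = [0.0] * bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = patch[y][x + 1] - patch[y][x - 1]  # central differences
            gy = patch[y + 1][x] - patch[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(ang // (180.0 / bins)) % bins] += mag
    return hist

# a vertical intensity edge: all gradient energy falls in the first bin
patch = [[0, 0, 10, 10]] * 4
print(hog_cell(patch))
```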

  5. RapidNAM: generative manufacturing approach of nasoalveolar molding devices for presurgical cleft lip and palate treatment.

    PubMed

    Bauer, Franz Xaver; Schönberger, Markus; Gattinger, Johannes; Eblenkamp, Markus; Wintermantel, Erich; Rau, Andrea; Güll, Florian Dieter; Wolff, Klaus-Dietrich; Loeffelbein, Denys J

    2017-08-28

    Nasoalveolar molding (NAM) is an accepted treatment strategy in presurgical cleft therapy. The major drawbacks of the treatment listed in the literature relate to the duration of the treatment and the coordination of the required interdisciplinary team of therapists, parents, and patients. To overcome these limitations, we present the automated RapidNAM concept, which facilitates the design and manufacturing process of NAM devices and allows the virtual modification and subsequent manufacture of the devices in advance, with a growth prediction factor adapted to the patient's natural growth. The RapidNAM concept involves (i) the prediction of three trajectories that envelope the fragmented alveolar segments with the goal of mimicking a harmonic arch, (ii) the extrusion from the larger toward the smaller alveolar segment along the envelope curves toward the harmonic upper alveolar arch, and (iii) the generation of the NAM device with a ventilation hole, fixation pin, and fixation points for the nasal stents. A feasibility study of a vector-based approach was successfully conducted for unilateral and bilateral cleft lip and palate (CLP) patients. A comparison of the modified target models with the reference target models showed similar results. For further improvement, the number of landmarks used to modify the models was increased by a curve-based approach.

  6. An Improved Artificial Bee Colony-Based Approach for Zoning Protected Ecological Areas

    PubMed Central

    Shao, Jing; Yang, Lina; Peng, Ling; Chi, Tianhe; Wang, Xiaomeng

    2015-01-01

    China is facing ecological and environmental challenges as its urban growth rate continues to rise, and zoning protected ecological areas is recognized as an effective response measure. Zoning inherently involves both site attributes and aggregation attributes, and the combination of mathematical models and heuristic algorithms has proven advantageous. In this article, an improved artificial bee colony (IABC)-based approach is proposed for zoning protected ecological areas at a regional scale. Three main improvements were made: the first is the use of multiple strategies to generate an initial bee population of a specific quality and diversity, the second is an exploitation search procedure that generates neighbor solutions by combining “replace” and “alter” operations, and the third is a “swap” strategy that enables a local search around the iterative optimal solution. The IABC algorithm was verified using simulated data. It was then applied to define an optimum scheme of protected ecological areas for Sanya (in the Hainan province of China), and a reasonable solution was obtained. Finally, a comparison experiment with other methods (an agent-based land allocation model, ant colony optimization, and density slicing) was conducted and demonstrated that the IABC algorithm was more effective and efficient than the other methods. Through this study, we aimed to provide a scientifically sound, practical approach for zoning procedures. PMID:26394148
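
    The artificial bee colony heuristic underlying the IABC approach can be sketched as a plain ABC loop. This is a generic, hedged sketch: it minimizes a toy continuous objective and omits the paper's three improvements (seeded initialization, replace/alter neighbor generation, and the swap-based local search).

```python
import numpy as np

rng = np.random.default_rng(42)

def sphere(x):
    """Toy objective standing in for the zoning score; lower is better."""
    return float(np.sum(x ** 2))

def abc_minimize(f, dim=5, n_food=10, limit=20, iters=200, lo=-5.0, hi=5.0):
    """Plain artificial bee colony: employed, onlooker, and scout phases."""
    foods = rng.uniform(lo, hi, size=(n_food, dim))   # food sources = candidate solutions
    fit = np.array([f(x) for x in foods])
    trials = np.zeros(n_food, dtype=int)              # stagnation counters

    def try_neighbor(i):
        # Perturb one dimension toward/away from a random other source.
        k = rng.choice([j for j in range(n_food) if j != i])
        d = rng.integers(dim)
        cand = foods[i].copy()
        cand[d] += rng.uniform(-1, 1) * (foods[i, d] - foods[k, d])
        cand[d] = np.clip(cand[d], lo, hi)
        c = f(cand)
        if c < fit[i]:
            foods[i], fit[i], trials[i] = cand, c, 0  # greedy selection
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                 # employed bees
            try_neighbor(i)
        p = 1.0 / (1.0 + fit)                   # onlookers prefer better sources
        p = p / p.sum()
        for i in rng.choice(n_food, size=n_food, p=p):
            try_neighbor(i)
        worn = np.argmax(trials)                # scout abandons an exhausted source
        if trials[worn] > limit:
            foods[worn] = rng.uniform(lo, hi, size=dim)
            fit[worn] = f(foods[worn])
            trials[worn] = 0
    return foods[np.argmin(fit)], float(fit.min())

best_x, best_f = abc_minimize(sphere)
```

    For the zoning problem the solutions would be binary cell-selection maps and the objective would trade off site quality against spatial aggregation, but the phase structure is the same.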

  7. Production of gold nanoparticles by electrode-respiring Geobacter sulfurreducens biofilms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tanzil, Abid H.; Sultana, Sujala T.; Saunders, Steven R.

    2016-12-01

    Current chemical syntheses of nanoparticles (NPs) have had limited success due to the relatively high environmental cost of the harsh chemicals used, which require purification and size-selective fractionation. Therefore, biological approaches have received recent attention for their potential to overcome these obstacles as a benign synthetic route. The intrinsic nature of the biomolecules present in microorganisms has intrigued researchers to design bottom-up approaches to biosynthesize metal nanoparticles using microorganisms. Most of the literature has focused on NP synthesis using planktonic cells, while the use of biofilms is limited. The goal of this work was to synthesize gold nanoparticles (AuNPs) using electrode-respiring Geobacter sulfurreducens biofilms. We found that most of the AuNPs are generated in the extracellular matrix of Geobacter biofilms with an average particle size of 20 nm. The formation of AuNPs was verified using TEM, FTIR, and EDX. We also found that the extracellular substances extracted from electrode-respiring G. sulfurreducens biofilms can reduce Au3+ to AuNPs. It appears that reducing sugars were involved in the bioreduction and synthesis of AuNPs, and that amine groups acted as the major biomolecules involved in binding. This is the first demonstration of AuNP formation from the extracellular matrix of electrode-respiring biofilms.

  8. Physiological, Biochemical, and Molecular Mechanisms of Heat Stress Tolerance in Plants

    PubMed Central

    Hasanuzzaman, Mirza; Nahar, Kamrun; Alam, Md. Mahabub; Roychowdhury, Rajib; Fujita, Masayuki

    2013-01-01

    High temperature (HT) stress is a major environmental stress that limits plant growth, metabolism, and productivity worldwide. Plant growth and development involve numerous biochemical reactions that are sensitive to temperature. Plant responses to HT vary with the degree and duration of HT and the plant type. HT is now a major concern for crop production and approaches for sustaining high yields of crop plants under HT stress are important agricultural goals. Plants possess a number of adaptive, avoidance, or acclimation mechanisms to cope with HT situations. In addition, major tolerance mechanisms that employ ion transporters, proteins, osmoprotectants, antioxidants, and other factors involved in signaling cascades and transcriptional control are activated to offset stress-induced biochemical and physiological alterations. Plant survival under HT stress depends on the ability to perceive the HT stimulus, generate and transmit the signal, and initiate appropriate physiological and biochemical changes. HT-induced gene expression and metabolite synthesis also substantially improve tolerance. The physiological and biochemical responses to heat stress are active research areas, and the molecular approaches are being adopted for developing HT tolerance in plants. This article reviews the recent findings on responses, adaptation, and tolerance to HT at the cellular, organellar, and whole plant levels and describes various approaches being taken to enhance thermotolerance in plants. PMID:23644891

  9. An update on gain-of-function mutations in primary immunodeficiency diseases.

    PubMed

    Jhamnani, Rekha D; Rosenzweig, Sergio D

    2017-12-01

    Most primary immunodeficiencies described since 1952 were associated with loss-of-function defects. With the advent and popularization of unbiased next-generation sequencing diagnostic approaches followed by functional validation techniques, many gain-of-function mutations leading to immunodeficiency have also been identified. This review highlights updates on pathophysiology mechanisms and new therapeutic approaches for primary immunodeficiencies caused by gain-of-function mutations. The more recent developments related to gain-of-function primary immunodeficiencies, mostly involving increased infection susceptibility but also immune dysregulation and autoimmunity, were reviewed. Updates regarding pathophysiology mechanisms, different mutation types, clinical features, laboratory markers, and current and potential new treatments for patients with caspase recruitment domain family member 11, signal transducer and activator of transcription 1, signal transducer and activator of transcription 3, phosphatidylinositol-4,5-biphosphate 3-kinase catalytic 110, phosphatidylinositol-4,5-biphosphate 3-kinase regulatory subunit 1, chemokine C-X-C motif receptor 4, sterile α motif domain containing 9-like, and nuclear factor κ-B subunit 2 gain-of-function mutations are reviewed for each disease. With the identification of gain-of-function mutations as a cause of immunodeficiency, new genetic pathophysiology mechanisms have been unveiled, and new targeted therapeutic approaches can be explored as potential rescue treatments for these diseases.

  10. Spinning AdS loop diagrams: two point functions

    NASA Astrophysics Data System (ADS)

    Giombi, Simone; Sleight, Charlotte; Taronna, Massimo

    2018-06-01

    We develop a systematic approach to evaluating AdS loop amplitudes with spinning legs based on the spectral (or "split") representation of bulk-to-bulk propagators, which re-expresses loop diagrams in terms of spectral integrals and higher-point tree diagrams. In this work we focus on 2pt one-loop Witten diagrams involving totally symmetric fields of arbitrary mass and integer spin. As an application of this framework, we study the contribution to the anomalous dimension of higher-spin currents generated by bubble diagrams in higher-spin gauge theories on AdS.

  11. Design approaches to more energy efficient engines

    NASA Technical Reports Server (NTRS)

    Saunders, N. T.; Colladay, R. S.; Macioce, L. E.

    1978-01-01

    The status of NASA's Energy Efficient Engine Project, a cooperative government-industry effort aimed at advancing the technology base for the next generation of large turbofan engines for civil aircraft transports, is summarized. Results of recently completed studies are reviewed. These studies involved the selection of engine cycles and configurations that offer the potential for at least 12% lower fuel consumption than current engines and that are also economically attractive and environmentally acceptable. Emphasis is on the advancements required in component technologies and systems design concepts to permit future development of these more energy efficient engines.

  12. Community detection in complex networks using proximate support vector clustering

    NASA Astrophysics Data System (ADS)

    Wang, Feifan; Zhang, Baihai; Chai, Senchun; Xia, Yuanqing

    2018-03-01

    Community structure, one of the most attention-attracting properties of complex networks, has been a cornerstone in advances across various scientific branches. A number of tools have been involved in recent studies concentrating on community detection algorithms. In this paper, we propose a support vector clustering method based on a proximity graph, owing to which the introduced algorithm surpasses the traditional support vector approach in both accuracy and complexity. Results of extensive experiments undertaken on computer-generated networks and real-world data sets illustrate competent performance in comparison with the other counterparts.

  13. Microscale bioprocess optimisation.

    PubMed

    Micheletti, Martina; Lye, Gary J

    2006-12-01

    Microscale processing techniques offer the potential to speed up the delivery of new drugs to the market, reducing development costs and increasing patient benefit. These techniques have application across both the chemical and biopharmaceutical sectors. The approach involves the study of individual bioprocess operations at the microlitre scale using either microwell or microfluidic formats. In both cases the aim is to generate quantitative bioprocess information early on, so as to inform bioprocess design and speed translation to the manufacturing scale. Automation can enhance experimental throughput and will facilitate the parallel evaluation of competing biocatalyst and process options.

  14. A tensor approach to modeling of nonhomogeneous nonlinear systems

    NASA Technical Reports Server (NTRS)

    Yurkovich, S.; Sain, M.

    1980-01-01

    Model following control methodology plays a key role in numerous application areas. Cases in point include flight control systems and gas turbine engine control systems. Typical uses of such a design strategy involve the determination of nonlinear models which generate requested control and response trajectories for various commands. Linear multivariable techniques provide trim about these motions; and protection logic is added to secure the hardware from excursions beyond the specification range. This paper reports upon experience in developing a general class of such nonlinear models based upon the idea of the algebraic tensor product.

  15. Immunotherapy with myeloid cells for tolerance induction

    PubMed Central

    Rodriguez-García, Mercedes; Boros, Peter; Bromberg, Jonathan S.; Ochando, Jordi C.

    2013-01-01

    Purpose of review Understanding the interplay between myeloid dendritic cells and T cells under tolerogenic conditions, and whether their interactions induce the development of antigen-specific regulatory T cells (Tregs), is critical to uncovering the mechanisms involved in the induction of indefinite allograft survival. Recent findings Myeloid dendritic cell–T-cell interactions are seminal events that determine the outcome of the immune response, and multiple in-vitro protocols suggest the generation of tolerogenic myeloid dendritic cells that modulate T-cell responses and determine the outcome of the immune response to an allograft following adoptive transfer. We believe that identifying specific conditions that lead to the generation of tolerogenic myeloid dendritic cells and Tregs is critical for manipulating the immune response towards the development of transplantation tolerance. Summary We summarize recent findings regarding specific culture conditions that generate tolerogenic myeloid dendritic cells that induce T-cell hyporesponsiveness and Treg development, which represents a novel immunotherapeutic approach to promote the induction of indefinite graft survival. The interpretations presented here illustrate that different mechanisms govern the generation of tolerogenic myeloid dendritic cells, and we discuss the concomitant therapeutic implications. PMID:20616727

  16. Stochastic Modeling based on Dictionary Approach for the Generation of Daily Precipitation Occurrences

    NASA Astrophysics Data System (ADS)

    Panu, U. S.; Ng, W.; Rasmussen, P. F.

    2009-12-01

    The modeling of weather states (i.e., precipitation occurrences) is critical when the historical data are not long enough for the desired analysis. Stochastic models (e.g., the Markov chain and the Alternating Renewal Process (ARP)) of precipitation occurrence processes generally assume the existence of short-term temporal dependency between neighboring states while implying long-term independency (randomness) of states in precipitation records. Existing temporal-dependency models for the generation of precipitation occurrences are restricted either by a fixed-length memory (e.g., the order of a Markov chain model) or by the restriction to homogeneous states within segments (e.g., persistency of homogeneous states within dry/wet-spell lengths of an ARP). The modeling of variable segment lengths and states can be an arduous task, and a flexible modeling approach is required for the preservation of the various segmented patterns of precipitation data series. An innovative Dictionary approach has been developed in the field of genome pattern recognition for the identification of frequently occurring genome segments in DNA sequences. The genome segments delineate the biologically meaningful "words" (i.e., segments with specific patterns in a series of discrete states) that can be jointly modeled with variable lengths and states. A meaningful "word", in hydrology, can refer to a segment of precipitation occurrences comprising wet or dry states. Such flexibility provides a unique advantage over traditional stochastic models for the generation of precipitation occurrences. Three stochastic models, namely the alternating renewal process using a Geometric distribution, the second-order Markov chain model, and the Dictionary approach, have been assessed to evaluate their efficacy for the generation of daily precipitation sequences. 
Comparisons involved three guiding principles, namely (i) the ability of the models to preserve the short-term temporal dependency in data through the concepts of autocorrelation, average mutual information, and the Hurst exponent, (ii) the ability of the models to preserve the persistency within homogeneous dry/wet weather states through analysis of dry/wet-spell lengths between the observed and generated data, and (iii) the ability to assess the goodness-of-fit of the models through likelihood estimates (i.e., AIC and BIC). Thirty years of observed daily precipitation records from 10 Canadian meteorological stations were utilized for comparative analyses of the three models. In general, the Markov chain model performed well. The remaining models were found to be competitive with one another depending upon the scope and purpose of the comparison. Although the Markov chain model has a certain advantage in the generation of daily precipitation occurrences, the structural flexibility offered by the Dictionary approach in modeling the varied segment lengths of heterogeneous weather states provides a distinct and powerful advantage in the generation of precipitation sequences.
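
    Of the three models compared, the second-order Markov chain is the simplest to sketch. The following illustrative code (not the authors' implementation) fits wet/dry transition probabilities conditioned on the two previous days and generates a synthetic occurrence series:

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(7)

def fit_second_order(seq):
    """Estimate P(wet | previous two states) from a 0/1 (dry/wet) record."""
    counts = defaultdict(lambda: [0, 0])
    for a, b, c in zip(seq, seq[1:], seq[2:]):
        counts[(a, b)][c] += 1
    return {k: v[1] / sum(v) for k, v in counts.items()}

def generate(p_wet, n, start=(0, 0)):
    """Generate a synthetic occurrence series from the fitted transitions."""
    out = list(start)
    for _ in range(n - 2):
        prev = (out[-2], out[-1])
        p = p_wet.get(prev, 0.5)  # fall back for unseen two-day histories
        out.append(int(rng.random() < p))
    return out

# Synthetic "observed" record with persistent wet and dry spells.
obs, state = [], 0
for _ in range(3000):
    if rng.random() < 0.1:        # switch regime with probability 0.1
        state = 1 - state
    obs.append(state)

model = fit_second_order(obs)
synthetic = generate(model, 3000)
```

    The Dictionary approach generalizes this by conditioning on variable-length "words" of past states rather than a fixed two-day memory.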

  17. Optimization of rotating equipment in offshore wind farm

    NASA Astrophysics Data System (ADS)

    Okunade, O. A.

    2014-07-01

    The paper considered the improvement of rotating equipment in a wind farm and how this could maximise the farm's power capacity. It aimed to increase the capacity of electricity generation from a renewable source in the UK and contribute to the 15 per cent energy-consumption target set by the EU for electricity from renewable sources by 2020. With reference to a case study of a UK offshore wind farm, the paper presented a critique of the farm as a design basis for its optimization. It considered power production as the design situation, along with load cases and constraints, in order to reflect the characteristics and behaviour of a standard design. The scope, which considered the parts directly involved in power generation, covered the rotor blades and the impacts of the gearbox and generator on power generation. The scope did not, however, cover support structures such as the tower design. The approach of detailed data analysis of the blade under typical wind load conditions was supported by data from accepted design standards, relevant authorities, and professional bodies. The findings for the proposed model design showed at least a 3 per cent improvement over the existing electricity generation. They also indicated overall effects on climate change.

  18. Variations in the perceptions of peer and coach motivational climate.

    PubMed

    Vazou, Spiridoula

    2010-06-01

    This study examined (a) variations in the perceptions of peer- and coach-generated motivational climate within and between teams and (b) individual- and group-level factors that can account for these variations. Participants were 483 athletes between 12 and 16 years old. The results showed that perceptions of both peer- and coach-generated climate varied as a function of group-level variables, namely team success, coach's gender (except for peer ego-involving climate), and team type (only for coach ego-involving climate). Perceptions of peer- and coach-generated climate also varied as a function of individual-level variables, namely athletes' task and ego orientations, gender, and age (only for coach task-involving and peer ego-involving climate). Moreover, within-team variations in perceptions of peer- and coach-generated climate as a function of task and ego orientation levels were identified. Identifying and controlling the factors that influence perceptions of peer- and coach-generated climate may be important in strengthening task-involving motivational cues.

  19. The neuronal differentiation process involves a series of antioxidant proteins.

    PubMed

    Oh, J-E; Karlmark Raja, K; Shin, J-H; Hengstschläger, M; Pollak, A; Lubec, G

    2005-11-01

    Involvement of individual antioxidant proteins (AOXP) and antioxidants in the differentiation process has already been reported. A systematic search strategy for detecting differentially regulated AOXP in neuronal differentiation, however, has not been published so far. The aim of this study was to provide an analytical tool for identifying AOXP and to generate a differentiation-related AOXP expression pattern. The undifferentiated N1E-115 neuroblastoma cell line was switched into a neuronal phenotype by DMSO treatment and used for proteomic experiments: we used two-dimensional gel electrophoresis followed by unambiguous mass spectrometric (MALDI-TOF-TOF) identification of proteins to generate a map of AOXP. Sixteen AOXP were unambiguously determined across the two cell lines; catalase, thioredoxin domain-containing protein 4, and a hypothetical glutaredoxin/glutathione S-transferase C terminus-containing protein were detectable in the undifferentiated cells only. Five AOXP were observed in both undifferentiated and differentiated cells, and thioredoxin, thioredoxin-like protein p19, thioredoxin reductase 1, superoxide dismutases (Mn and Cu-Zn), glutathione synthetase, and glutathione S-transferases P1 and Mu1 were detected in differentiated cells exclusively. Herein a differential expression pattern is presented that reveals so far unpublished antioxidant principles involved in neuronal differentiation by a protein chemical approach, unambiguously identifying AOXP. This finding not only shows the concomitant determination of AOXP but also serves as an analytical tool and forms the basis for the design of future studies addressing AOXP and differentiation per se.

  20. Entropy generation method to quantify thermal comfort.

    PubMed

    Boregowda, S C; Tiwari, S N; Chaturvedi, S K

    2001-12-01

    The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and quantify thermal comfort. The approach involves the development of an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite-element-based KSU human thermal computer model is utilized as a "computational environmental chamber" to conduct a series of simulations examining human thermal responses to different environmental conditions. The outputs from the simulation, which include human thermal responses, and input data consisting of environmental conditions are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The OTCI values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values; the PMV values are generated by substituting into the regression equation the same air temperatures and vapor pressures used in the computer simulation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. 
However, an experimental study is needed in the future to fully establish the validity of the OTCI formula and the model. One practical application of this index is that it could be integrated into thermal control systems to develop human-centered environmental control systems for potential use in aircraft, mass transit vehicles, intelligent building systems, and space vehicles.
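
    The abstract does not give the OTCI formula, but the underlying second-law idea can be illustrated with the entropy generated when metabolic heat Q flows from the skin at temperature T_skin to an environment at T_env, S_gen = Q(1/T_env - 1/T_skin): the colder the environment relative to the body, the more entropy the combined human-environment system generates. The function below is an illustrative sketch, not the paper's model, which couples a full human thermal simulation:

```python
def entropy_generation(q_watts, t_skin_k, t_env_k):
    """Entropy generation rate (W/K) for heat flow q_watts from the body
    surface at t_skin_k to surroundings at t_env_k (second law: positive
    whenever heat flows down the temperature gradient)."""
    return q_watts * (1.0 / t_env_k - 1.0 / t_skin_k)

# ~100 W of metabolic heat loss, skin at 33 C, comfortable room at 22 C
s_comfort = entropy_generation(100.0, 306.15, 295.15)
# the same heat loss into a cold 5 C environment
s_cold = entropy_generation(100.0, 306.15, 278.15)
```

    An entropy-based comfort index would map larger S_gen values to greater thermal discomfort.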

  1. Entropy generation method to quantify thermal comfort

    NASA Technical Reports Server (NTRS)

    Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.

    2001-01-01

    The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and quantify thermal comfort. The approach involves the development of an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite-element-based KSU human thermal computer model is utilized as a "computational environmental chamber" to conduct a series of simulations examining human thermal responses to different environmental conditions. The outputs from the simulation, which include human thermal responses, and input data consisting of environmental conditions are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The OTCI values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values; the PMV values are generated by substituting into the regression equation the same air temperatures and vapor pressures used in the computer simulation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. 
However, an experimental study is needed in the future to fully establish the validity of the OTCI formula and the model. One practical application of this index is that it could be integrated into thermal control systems to develop human-centered environmental control systems for potential use in aircraft, mass transit vehicles, intelligent building systems, and space vehicles.

  2. Effective progression of nuclear magnetic resonance-detected fragment hits.

    PubMed

    Eaton, Hugh L; Wyss, Daniel F

    2011-01-01

    Fragment-based drug discovery (FBDD) has become increasingly popular over the last decade as an alternative lead generation tool to HTS approaches. Several compounds that originated from a fragment-based approach have now progressed into the clinic, demonstrating the utility of this emerging field. While fragment hit identification has become much more routine and may involve different screening approaches, the efficient progression of fragment hits into quality lead series may still present a major bottleneck for the broadly successful application of FBDD. In our laboratory, we have extensive experience in fragment-based NMR screening (SbN) and the subsequent iterative progression of fragment hits using structure-assisted chemistry. To maximize impact, we have applied this approach strategically to early- and high-priority targets, and to those struggling for leads. Its application has yielded a clinical candidate for BACE1 and lead series in about one third of the SbN/FBDD projects. In this chapter, we give an overview of our strategy and focus our discussion on NMR-based FBDD approaches. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. The application of an industry level participatory ergonomics approach in developing MSD interventions.

    PubMed

    Tappin, D C; Vitalis, A; Bentley, T A

    2016-01-01

    Participatory ergonomics projects are traditionally applied within one organisation. In this study, a participative approach was applied across the New Zealand meat processing industry, involving multiple organisations and geographical regions. The purpose was to develop interventions to reduce musculoskeletal disorder (MSD) risk. This paper considers the value of an industry level participatory ergonomics approach in achieving this. The main rationale for a participative approach included the need for industry credibility, and to generate MSD interventions that address industry level MSD risk factors. An industry key stakeholder group became the primary vehicle for formal participation. The study resulted in an intervention plan that included the wider work system and industry practices. These interventions were championed across the industry by the key stakeholder group and have extended beyond the life of the study. While this approach helped to meet the study aim, the existence of an industry-supported key stakeholder group and a mandate for the initiative are important prerequisites for success. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. An Approach for Dynamic Grids

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Liou, Meng-Sing; Hindman, Richard G.

    1994-01-01

    An approach is presented for the generation of two-dimensional, structured, dynamic grids. The grid motion may be due to the motion of the boundaries of the computational domain or to the adaptation of the grid to the transient, physical solution. A time-dependent grid is computed through the time integration of the grid speeds which are computed from a system of grid speed equations. The grid speed equations are derived from the time-differentiation of the grid equations so as to ensure that the dynamic grid maintains the desired qualities of the static grid. The grid equations are the Euler-Lagrange equations derived from a variational statement for the grid. The dynamic grid method is demonstrated for a model problem involving boundary motion, an inviscid flow in a converging-diverging nozzle during startup, and a viscous flow over a flat plate with an impinging shock wave. It is shown that the approach is more accurate for transient flows than an approach in which the grid speeds are computed using a finite difference with respect to time of the grid. However, the approach requires significantly more computational effort.
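
    The idea of obtaining a time-dependent grid by integrating grid speeds, rather than by finite-differencing grid positions, can be illustrated with a one-dimensional analogue (not the paper's variational grid equations): nodes at fixed computational coordinates xi follow a boundary moving as L(t), with node speeds dx/dt = xi * dL/dt integrated by explicit Euler.

```python
import numpy as np

def evolve_grid(n=11, t_end=1.0, dt=1e-3):
    """1D analogue of a dynamic grid: node positions are obtained by time
    integration of node speeds so the mesh follows a boundary moving as
    L(t) = 1 + 0.5*t (illustrative only; the paper derives grid speeds
    from Euler-Lagrange grid equations)."""
    xi = np.linspace(0.0, 1.0, n)          # fixed computational coordinates
    l_dot = 0.5                            # boundary speed dL/dt
    x = xi * 1.0                           # node positions at t = 0, L(0) = 1
    steps = int(round(t_end / dt))
    for _ in range(steps):                 # explicit Euler on dx/dt = xi * dL/dt
        x = x + dt * xi * l_dot
    x_exact = xi * (1.0 + l_dot * t_end)   # nodes tracking the exact boundary
    return x, x_exact

x_num, x_exact = evolve_grid()
```

    Because the speeds come from differentiating the grid equations, the integrated grid inherits the static grid's distribution; differencing positions in time instead introduces the temporal error the paper reports for transient flows.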

  5. A new generation of cancer genome diagnostics for routine clinical use: overcoming the roadblocks to personalized cancer medicine.

    PubMed

    Heuckmann, J M; Thomas, R K

    2015-09-01

    The identification of 'druggable' kinase gene alterations has revolutionized cancer treatment in the last decade by providing new and successfully targetable drug targets. Thus, genotyping tumors for matching the right patients with the right drugs have become a clinical routine. Today, advances in sequencing technology and computational genome analyses enable the discovery of a constantly growing number of genome alterations relevant for clinical decision making. As a consequence, several technological approaches have emerged in order to deal with these rapidly increasing demands for clinical cancer genome analyses. Here, we describe challenges on the path to the broad introduction of diagnostic cancer genome analyses and the technologies that can be applied to overcome them. We define three generations of molecular diagnostics that are in clinical use. The latest generation of these approaches involves deep and thus, highly sensitive sequencing of all therapeutically relevant types of genome alterations-mutations, copy number alterations and rearrangements/fusions-in a single assay. Such approaches therefore have substantial advantages (less time and less tissue required) over PCR-based methods that typically have to be combined with fluorescence in situ hybridization for detection of gene amplifications and fusions. Since these new technologies work reliably on routine diagnostic formalin-fixed, paraffin-embedded specimens, they can help expedite the broad introduction of personalized cancer therapy into the clinic by providing comprehensive, sensitive and accurate cancer genome diagnoses in 'real-time'. © The Author 2015. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  6. A systems-level approach for metabolic engineering of yeast cell factories.

    PubMed

    Kim, Il-Kwon; Roldão, António; Siewers, Verena; Nielsen, Jens

    2012-03-01

The generation of novel yeast cell factories for production of high-value industrial biotechnological products relies on three metabolic engineering principles: design, construction, and analysis. In the last two decades, strong efforts have been put into developing faster and more efficient strategies and/or technologies for each one of these principles. For design and construction, three major strategies are described in this review: (1) rational metabolic engineering; (2) inverse metabolic engineering; and (3) evolutionary strategies. Independent of the selected strategy, the process of designing yeast strains involves five decision points: (1) choice of product, (2) choice of chassis, (3) identification of target genes, (4) regulating the expression level of target genes, and (5) network balancing of the target genes. At the construction level, several molecular biology tools have been developed through the concept of synthetic biology and applied for the generation of novel, engineered yeast strains. For comprehensive and quantitative analysis of constructed strains, systems biology tools employing a multi-omics approach are commonly used. These can reveal key information about the biological system, for example the identification of genetic regulatory mechanisms and competitive pathways, thereby assisting the in silico design of metabolic engineering strategies for improving strain performance. Examples of how systems and synthetic biology brought yeast metabolic engineering closer to industrial biotechnology are described in this review, and these examples should demonstrate the potential of a systems-level approach for fast and efficient generation of yeast cell factories. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  7. International Approaches for Nuclear Waste Disposal in Geological Formations: Report on Fifth Worldwide Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faybishenko, Boris; Birkholzer, Jens; Persoff, Peter

    2016-08-01

An important issue for present and future generations is the final disposal of spent nuclear fuel. Over the past forty years, the development of technologies to isolate spent nuclear fuel (SNF), other high-level nuclear waste (HLW) generated at nuclear power plants and from production of defense materials, and low- and intermediate-level nuclear waste (LILW) in underground rock and sediments has proven to be a challenging undertaking. Finding an appropriate solution for the disposal of nuclear waste is an important issue for protection of the environment and public health, and it is a prerequisite for the future of nuclear power. The purpose of a deep geological repository for nuclear waste is to provide future generations with protection against any harmful release of radioactive material, even after memory of the repository may have been lost, and regardless of the technical knowledge of future generations. The results of a wide variety of investigations on the development of technology for radioactive waste isolation from 19 countries were published in the First Worldwide Review in 1991 (Witherspoon, 1991). The results of investigations from 26 countries were published in the Second Worldwide Review in 1996 (Witherspoon, 1996). The results from 32 countries were summarized in the Third Worldwide Review in 2001 (Witherspoon and Bodvarsson, 2001). The last compilation had results from 24 countries assembled in the Fourth Worldwide Review (WWR) on radioactive waste isolation (Witherspoon and Bodvarsson, 2006). Since publication of the last report in 2006, radioactive waste disposal approaches have continued to evolve, and there have been major developments in a number of national geological disposal programs. Significant experience has been obtained both in preparing and reviewing cases for the operational and long-term safety of proposed and operating repositories.
Disposal of radioactive waste is a complex issue, not only because of the nature of the waste, but also because of the detailed regulatory structure for dealing with radioactive waste, the variety of stakeholders involved, and (in some cases) the number of regulatory entities involved.

  8. Autonomous perception and decision making in cyber-physical systems

    NASA Astrophysics Data System (ADS)

    Sarkar, Soumik

    2011-07-01

The cyber-physical system (CPS) is a relatively new interdisciplinary technology area that includes the general class of embedded and hybrid systems. CPSs require integration of computation and physical processes that involves the aspects of physical quantities such as time, energy and space during information processing and control. The physical space is the source of information and the cyber space makes use of the generated information to make decisions. This dissertation proposes an overall architecture of autonomous perception-based decision & control of complex cyber-physical systems. Perception involves the recently developed framework of Symbolic Dynamic Filtering for abstraction of the physical world in the cyber space. For example, under this framework, sensor observations from a physical entity are discretized temporally and spatially to generate blocks of symbols, also called words, that form a language. A grammar of a language is the set of rules that determine the relationships among words to build sentences. Subsequently, a physical system is conjectured to be a linguistic source that is capable of generating a specific language. The proposed technology is validated on various (experimental and simulated) case studies that include health monitoring of aircraft gas turbine engines, detection and estimation of fatigue damage in polycrystalline alloys, and parameter identification. Control of complex cyber-physical systems involves distributed sensing, computation, control as well as complexity analysis. A novel statistical mechanics-inspired complexity analysis approach is proposed in this dissertation. In such a scenario of networked physical systems, the distribution of physical entities determines the underlying network topology and the interaction among the entities forms the abstract cyber space.
It is envisioned that the general contributions, made in this dissertation, will be useful for potential application areas such as smart power grids and buildings, distributed energy systems, advanced health care procedures and future ground and air transportation systems.
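    The symbolization step described above can be sketched as follows. This is a minimal illustration, assuming a synthetic signal, a four-symbol alphabet, and a uniform partition; the actual dissertation may use different partitioning schemes and parameters.

    ```python
    import numpy as np

    # Toy sensor signal; in practice this would be, e.g., gas turbine data.
    rng = np.random.default_rng(0)
    signal = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.1 * rng.standard_normal(200)

    def symbolize(x, n_symbols=4):
        """Uniformly partition the signal range into an alphabet of symbols."""
        edges = np.linspace(x.min(), x.max(), n_symbols + 1)[1:-1]
        return np.digitize(x, edges)  # integer symbols 0 .. n_symbols-1

    symbols = symbolize(signal)

    # Empirical symbol-to-symbol transition frequencies approximate the
    # "grammar" of the linguistic source conjectured for the physical system.
    counts = np.zeros((4, 4))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    transition = counts / np.where(row_sums == 0, 1.0, row_sums)
    ```

    The resulting transition matrix is the cyber-space abstraction of the physical entity on which anomaly detection or health monitoring can then operate.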

  9. Decomposition approach of the nitrogen generation process: empirical study on the Shimabara Peninsula in Japan.

    PubMed

    Fujii, Hidemichi; Nakagawa, Kei; Kagabu, Makoto

    2016-11-01

    Groundwater nitrate pollution is one of the most prevalent water-related environmental problems worldwide. The objective of this study is to identify the determinants of nitrogen pollutant changes with a focus on the nitrogen generation process. The novelty of our research framework is to cost-effectively identify the factors involved in nitrogen pollutant generation using public data. This study focuses on three determinant factors: (1) nitrogen intensity changes, (2) structural changes, and (3) scale changes. This study empirically analyses three sectors, including crop production, farm animals, and the household, on the Shimabara Peninsula in Japan. Our results show that the nitrogen supply from crop production sectors has decreased because the production has been scaled down and shifted towards lower nitrogen intensive crops. In the farm animal sector, the nitrogen supply has also been successfully reduced due to scaling-down efforts. Households have decreased the nitrogen supply by diffusion of integrated septic tank and sewerage systems.

  10. Rotating rake design for unique measurement of fan-generated spinning acoustic modes

    NASA Technical Reports Server (NTRS)

    Konno, Kevin E.; Hausmann, Clifford R.

    1993-01-01

    In light of the current emphasis on noise reduction in subsonic aircraft design, NASA has been actively studying the source and propagation of noise generated by subsonic fan engines. NASA/LeRC has developed and tested a unique method of accurately measuring these spinning acoustic modes generated by an experimental fan. This mode measuring method is based on the use of a rotating microphone rake. Testing was conducted in the 9 x 15 Low-speed Wind Tunnel. The rotating rake was tested with the Advanced Ducted Propeller (ADP) model. This memorandum discusses the design and performance of the motor/drive system for the fan-synchronized rotating acoustic rake. This novel motor/drive design approach is now being adapted for additional acoustic mode studies in new test rigs as baseline data for the future design of active noise control for subsonic fan engines. Included in this memorandum are the research requirements, motor/drive specifications, test performance results, and a description of the controls and software involved.

  11. Generation of intervention strategy for a genetic regulatory network represented by a family of Markov Chains.

    PubMed

    Berlow, Noah; Pal, Ranadip

    2011-01-01

    Genetic Regulatory Networks (GRNs) are frequently modeled as Markov Chains providing the transition probabilities of moving from one state of the network to another. The inverse problem of inference of the Markov Chain from noisy and limited experimental data is an ill-posed problem and often generates multiple model possibilities instead of a unique one. In this article, we address the issue of intervention in a genetic regulatory network represented by a family of Markov Chains. The purpose of intervention is to alter the steady state probability distribution of the GRN, as the steady states are considered to be representative of the phenotypes. We consider robust stationary control policies with best expected behavior. The extreme computational complexity involved in the search for robust stationary control policies is mitigated by using a sequential approach to control policy generation and utilizing computationally efficient techniques for updating the stationary probability distribution of a Markov chain following a rank one perturbation.
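    The stationary-distribution computation underlying such intervention analysis can be sketched as follows. The four-state transition matrix and the intervened row are hypothetical illustrations, not values from the article, and the solver shown is a direct linear solve rather than the article's efficient rank-one update technique.

    ```python
    import numpy as np

    # Hypothetical 4-state chain; each state could represent a GRN phenotype.
    # Rows sum to 1 (transition probabilities out of each state).
    P = np.array([
        [0.5, 0.3, 0.1, 0.1],
        [0.2, 0.5, 0.2, 0.1],
        [0.1, 0.2, 0.6, 0.1],
        [0.1, 0.1, 0.2, 0.6],
    ])

    def stationary_distribution(P):
        """Solve pi P = pi subject to sum(pi) = 1 as a linear system."""
        n = P.shape[0]
        A = np.vstack([P.T - np.eye(n), np.ones(n)])
        b = np.zeros(n + 1)
        b[-1] = 1.0
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pi

    pi = stationary_distribution(P)

    # An intervention that replaces one row of P is a rank-one perturbation:
    # here state 2 is forced to transition preferentially toward state 0.
    P_new = P.copy()
    P_new[2] = [0.7, 0.1, 0.1, 0.1]
    pi_new = stationary_distribution(P_new)
    ```

    Recomputing the stationary distribution from scratch after every candidate intervention is what the sequential, rank-one-update approach in the article is designed to avoid.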

  12. A Window Into Clinical Next-Generation Sequencing-Based Oncology Testing Practices.

    PubMed

    Nagarajan, Rakesh; Bartley, Angela N; Bridge, Julia A; Jennings, Lawrence J; Kamel-Reid, Suzanne; Kim, Annette; Lazar, Alexander J; Lindeman, Neal I; Moncur, Joel; Rai, Alex J; Routbort, Mark J; Vasalos, Patricia; Merker, Jason D

    2017-12-01

    - Detection of acquired variants in cancer is a paradigm of precision medicine, yet little has been reported about clinical laboratory practices across a broad range of laboratories. - To use College of American Pathologists proficiency testing survey results to report on the results from surveys on next-generation sequencing-based oncology testing practices. - College of American Pathologists proficiency testing survey results from more than 250 laboratories currently performing molecular oncology testing were used to determine laboratory trends in next-generation sequencing-based oncology testing. - These presented data provide key information about the number of laboratories that currently offer or are planning to offer next-generation sequencing-based oncology testing. Furthermore, we present data from 60 laboratories performing next-generation sequencing-based oncology testing regarding specimen requirements and assay characteristics. The findings indicate that most laboratories are performing tumor-only targeted sequencing to detect single-nucleotide variants and small insertions and deletions, using desktop sequencers and predesigned commercial kits. Despite these trends, a diversity of approaches to testing exists. - This information should be useful to further inform a variety of topics, including national discussions involving clinical laboratory quality systems, regulation and oversight of next-generation sequencing-based oncology testing, and precision oncology efforts in a data-driven manner.

  13. Living in "survival mode:" Intergenerational transmission of trauma from the Holodomor genocide of 1932-1933 in Ukraine.

    PubMed

    Bezo, Brent; Maggi, Stefania

    2015-06-01

    Qualitative methodology was used to investigate the intergenerational impact of the 1932-1933 Holodomor genocide on three generations in 15 Ukrainian families. Each family, residing in Ukraine, consisted of a first generation survivor, a second generation adult child and a third generation adult grandchild of the same line. The findings show that the Holodomor, a genocide that claimed millions of lives by forced starvation, still exerts substantial effects on generations born decades later. Specifically, thematic analysis of the 45 semi-structured, in-depth interviews, done between July and November 2010, revealed that a constellation of emotions, inner states and trauma-based coping strategies emerged in the survivors during the genocide period and were subsequently transmitted into the second and third generations. This constellation, summarized by participants as living in "survival mode," included horror, fear, mistrust, sadness, shame, anger, stress and anxiety, decreased self-worth, stockpiling of food, reverence for food, overemphasis on food and overeating, inability to discard unneeded items, an indifference toward others, social hostility and risky health behaviours. Since both the family and community-society were found to be involved in trauma transmission, the findings highlight the importance of multi-framework approaches for studying and healing collective trauma. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Potential of Micro Hydroelectric Generator Embedded at 30,000 PE Effluent Discharge of Sewerage Treatment Plant

    NASA Astrophysics Data System (ADS)

    Che Munaaim, M. A.; Razali, N.; Ayob, A.; Hamidin, N.; Othuman Mydin, M. A.

    2018-03-01

    A micro hydroelectric generator is an energy conversion device that generates electricity from the potential (motion) energy of flowing water. In this research, a micro hydroelectric generator is embedded at the continuous-flow effluent discharge point of a domestic sewerage treatment plant (STP). This research evaluates the potential of electricity generation from a micro hydroelectric generator attached to a 30,000 PE sewerage treatment plant. The power output obtained from the electrical power conversion calculation is used to assess the feasibility of this system and its ability to provide electrical energy, which can reduce the electricity bill, especially for the pumping system. An overview of the system in practical application, with consideration of the payback period, is summarized. The ultimate aim of the whole application is a self-sustaining electrical ecosystem for the internal use of the STP, powered by its own flowing water, in support of sustainable engineering, renewable energy and an energy-efficient approach. The results show that the output power obtained by calculation, 1.58 kW, is lower than the expected output power (12 kW) and falls below the range of micro hydro power (5 kW - 100 kW). It is also observed that the estimated payback period is long, i.e. 7 years to recoup the return on investment. A head of 4.5 m and above, with a flow maintained at a minimum of 0.05 m3/s, is required in the selected plant in order to achieve a feasible power output. In conclusion, the flowing water (potential energy) of the wastewater treatment process, especially at the effluent discharge point of the STP, can be harvested for electricity generation by embedding a micro hydroelectric generator.
However, the selected STP needs a minimum 4.5 m head with 0.05 m3/s of continuously flowing water to make harvesting feasible.
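    The reported output can be checked against the standard hydropower relation P = ρgQHη, using the head and flow stated in the abstract; the overall efficiency η below is an assumed illustrative value, since the abstract does not report one.

    ```python
    # Hydropower conversion: P = rho * g * Q * H * eta
    rho = 1000.0   # water density, kg/m^3
    g = 9.81       # gravitational acceleration, m/s^2
    Q = 0.05       # volumetric flow rate from the abstract, m^3/s
    H = 4.5        # net head from the abstract, m
    eta = 0.72     # assumed overall turbine/generator efficiency (illustrative)

    P_watts = rho * g * Q * H * eta
    print(f"{P_watts / 1000:.2f} kW")  # about 1.59 kW, near the reported 1.58 kW
    ```

    At this scale the result sits below the conventional 5-100 kW micro hydro band, consistent with the abstract's conclusion that the selected plant is marginal for harvesting.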

  15. The Role of Hands-On Science Labs in Engaging the Next Generation of Space Explorers

    NASA Astrophysics Data System (ADS)

    Williams, Teresa A. J.

    2002-01-01

    Each country participating in the International Space Station (ISS) recognizes the importance of educating the coming generation about space and its opportunities. In 2001 the St. James School in downtown Houston, Texas was approached with a proposal to renovate an unused classroom and become involved with the "GLOBE" Program and other Internet-based international learning resources. This inner-city school willingly agreed to the program based on "hands-on" learning. One month after the room conversion, with ten computer terminals donated by area businesses and connectivity to the internet established, the students immediately began using the "Global Learning and Observations to Benefit the Environment (GLOBE)" program and the International Space Station (ISS) Program educational resources. The "GLOBE" program involves numerous scientific and technical agencies studying the Earth, who make it their goal to provide educational resources to an international community of K-12 scientists. This project was conceived as a successor to the "Interactive Elementary Space Museum for the New Millennium," a space museum in a school corridor, without the same type of budget. The laboratory is a collaboration involving area businesses, volunteers from the NASA/Johnson Space Center ISS Outreach Program, and students. This paper will outline the planning and operation of the school science laboratory project from the point of view of the school's interest and involvement and assess its success to date. It will consider the lessons learned by the participating school administrations in the management of the process and discuss some of the issues that can both promote and discourage school participation in such projects.

  16. Disentangling patient and public involvement in healthcare decisions: why the difference matters.

    PubMed

    Fredriksson, Mio; Tritter, Jonathan Q

    2017-01-01

    Patient and public involvement has become an integral aspect of many developed health systems and is judged to be an essential driver for reform. However, little attention has been paid to the distinctions between patients and the public, and the views of patients are often seen to encompass those of the general public. Using an ideal-type approach, we analyse crucial distinctions between patient involvement and public involvement using examples from Sweden and England. We highlight that patients have sectional interests as health service users, in contrast to citizens, who engage as a public policy agent reflecting societal interests. Patients draw on experiential knowledge, focus on output legitimacy and performance accountability, aim at typical representativeness, and seek a direct responsiveness to individual needs and preferences. In contrast, the public contributes collective perspectives generated from diversity, and centres on input legitimacy achieved through statistical representativeness, democratic accountability and indirect responsiveness to general citizen preferences. Thus, using patients as proxies for the public fails to achieve the intended goals and benefits of involvement. We conclude that understanding and measuring the impact of patient and public involvement can only develop with the application of a clearer comprehension of the differences. © 2016 Foundation for the Sociology of Health & Illness.

  17. Naturally derived and synthetic scaffolds for skeletal muscle reconstruction☆

    PubMed Central

    Wolf, Matthew T.; Dearth, Christopher L.; Sonnenberg, Sonya B.; Loboa, Elizabeth G.; Badylak, Stephen F.

    2017-01-01

    Skeletal muscle tissue has an inherent capacity for regeneration following injury. However, severe trauma, such as volumetric muscle loss, overwhelms these natural muscle repair mechanisms, prompting the search for a tissue engineering/regenerative medicine approach to promote functional skeletal muscle restoration. A desirable approach involves a bioscaffold that simultaneously acts as an inductive microenvironment and as a cell/drug delivery vehicle to encourage muscle ingrowth. Both biologically active, naturally derived materials (such as extracellular matrix) and carefully engineered synthetic polymers have been developed to provide such a muscle regenerative environment. Next generation naturally derived/synthetic “hybrid materials” would combine the advantageous properties of these materials to create an optimal platform for cell/drug delivery and possess inherent bioactive properties. Advances in scaffolds for skeletal muscle tissue engineering are reviewed herein. PMID:25174309

  18. Methods for the Scientific Study of Discrimination and Health: An Ecosocial Approach

    PubMed Central

    2012-01-01

    The scientific study of how discrimination harms health requires theoretically grounded methods. At issue is how discrimination, as one form of societal injustice, becomes embodied inequality and is manifested as health inequities. As clarified by ecosocial theory, methods must address the lived realities of discrimination as an exploitative and oppressive societal phenomenon operating at multiple levels and involving myriad pathways across both the life course and historical generations. An integrated embodied research approach hence must consider (1) the structural level—past and present de jure and de facto discrimination; (2) the individual level—issues of domains, nativity, and use of both explicit and implicit discrimination measures; and (3) how current research methods likely underestimate the impact of racism on health. PMID:22420803

  19. Reconstructing biochemical pathways from time course data.

    PubMed

    Srividhya, Jeyaraman; Crampin, Edmund J; McSharry, Patrick E; Schnell, Santiago

    2007-03-01

    Time series data on biochemical reactions reveal transient behavior, away from chemical equilibrium, and contain information on the dynamic interactions among reacting components. However, this information can be difficult to extract using conventional analysis techniques. We present a new method to infer biochemical pathway mechanisms from time course data using a global nonlinear modeling technique to identify the elementary reaction steps which constitute the pathway. The method involves the generation of a complete dictionary of polynomial basis functions based on the law of mass action. Using these basis functions, there are two approaches to model construction, namely the general-to-specific and the specific-to-general approach. We demonstrate that our new methodology reconstructs the chemical reaction steps and connectivity of the glycolytic pathway of Lactococcus lactis from time course experimental data.
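    The dictionary-generation step can be sketched as follows, assuming a hypothetical three-species pathway and reactions up to second order; the species names and the maximum order are illustrative, not the article's actual setup.

    ```python
    from itertools import combinations_with_replacement

    # Species of a hypothetical three-component pathway.
    species = ["S1", "S2", "S3"]

    def mass_action_dictionary(species, max_order=2):
        """Enumerate candidate mass-action rate terms up to a given reaction
        order. Each term is a monomial in the species concentrations, because
        the law of mass action makes elementary rates products of
        concentrations (e.g. S1*S2 for a bimolecular step)."""
        terms = []
        for order in range(1, max_order + 1):
            for combo in combinations_with_replacement(species, order):
                terms.append("*".join(combo))
        return terms

    basis = mass_action_dictionary(species)
    # Each d[X]/dt is then fitted as a sparse linear combination of these
    # monomials; model selection prunes terms either general-to-specific
    # (start full, delete) or specific-to-general (start empty, add).
    ```

    For three species and second order this yields nine candidate monomials; the fitted nonzero coefficients identify the elementary reaction steps and hence the pathway connectivity.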

  20. Strategic science for eating disorders research and policy impact.

    PubMed

    Roberto, Christina A; Brownell, Kelly D

    2017-03-01

    Scientific research often fails to have relevance and impact because scientists do not engage policy makers and influencers in the process of identifying information needs and generating high priority questions. To address this scholarship-policy gap, we have developed a model of Strategic Science. This research approach involves working with policy makers and influencers to craft research questions that will answer important and timely policy-related questions. The goal is to create tighter links between research and policy and ensure findings are communicated efficiently to change agents best positioned to apply the research to policy debates. In this article, we lay out a model for Strategic Science and describe how this approach may help advance policy research and action for eating disorders. © 2017 Wiley Periodicals, Inc.

  1. Challenges and opportunities for improving food quality and nutrition through plant biotechnology.

    PubMed

    Francis, David; Finer, John J; Grotewold, Erich

    2017-04-01

    Plant biotechnology has been around since the advent of humankind, resulting in tremendous improvements in plant cultivation through crop domestication, breeding and selection. The emergence of transgenic approaches involving the introduction of defined DNA sequences into plants by humans has rapidly changed the surface of our planet by further expanding the gene pool used by plant breeders for plant improvement. Transgenic approaches in food plants have raised concerns on the merits, social implications, ecological risks and true benefits of plant biotechnology. The recently acquired ability to precisely edit plant genomes by modifying native genes without introducing new genetic material offers new opportunities to rapidly exploit natural variation, create new variation and incorporate changes with the goal to generate more productive and nutritious plants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Light, heat, action: neural control of fruit fly behaviour

    PubMed Central

    Owald, David; Lin, Suewei; Waddell, Scott

    2015-01-01

    The fruit fly Drosophila melanogaster has emerged as a popular model to investigate fundamental principles of neural circuit operation. The sophisticated genetics and small brain permit a cellular resolution understanding of innate and learned behavioural processes. Relatively recent genetic and technical advances provide the means to specifically and reproducibly manipulate the function of many fly neurons with temporal resolution. The same cellular precision can also be exploited to express genetically encoded reporters of neural activity and cell-signalling pathways. Combining these approaches in living behaving animals has great potential to generate a holistic view of behavioural control that transcends the usual molecular, cellular and systems boundaries. In this review, we discuss these approaches with particular emphasis on the pioneering studies and those involving learning and memory. PMID:26240426

  3. Results of a transparent expert consultation on patient and public involvement in palliative care research.

    PubMed

    Daveson, Barbara A; de Wolf-Linder, Susanne; Witt, Jana; Newson, Kirstie; Morris, Carolyn; Higginson, Irene J; Evans, Catherine J

    2015-12-01

    Support and evidence for patient, unpaid caregiver and public involvement in research (user involvement) are growing. Consensus on how best to involve users in palliative care research is lacking. To determine an optimal user-involvement model for palliative care research. We hosted a consultation workshop using expert presentations, discussion and nominal group technique to generate recommendations and consensus on agreement of importance. A total of 35 users and 32 researchers were approached to attend the workshop, which included break-out groups and a ranking exercise. Descriptive statistical analysis to establish consensus and highlight divergence was applied. Qualitative analysis of discussions was completed to aid interpretation of findings. Participants involved in palliative care research were invited to a global research institute, UK. A total of 12 users and 5 researchers participated. Users wanted their involvement to be more visible, including during dissemination, with a greater emphasis on the difference their involvement makes. Researchers wanted to improve productivity, relevance and quality through involvement. Users and researchers agreed that an optimal model should consist of (a) early involvement to ensure meaningful involvement and impact and (b) diverse virtual and face-to-face involvement methods to ensure flexibility. For involvement in palliative care research to succeed, early and flexible involvement is required. Researchers should advertise opportunities for involvement and promote impact of involvement via dissemination plans. Users should prioritise adding value to research through enhancing productivity, quality and relevance. More research is needed not only to inform implementation and ensure effectiveness but also to investigate the cost-effectiveness of involvement in palliative care research. © The Author(s) 2015.

  4. Results of a transparent expert consultation on patient and public involvement in palliative care research

    PubMed Central

    Daveson, Barbara A; de Wolf-Linder, Susanne; Witt, Jana; Newson, Kirstie; Morris, Carolyn; Higginson, Irene J; Evans, Catherine J

    2015-01-01

    Background: Support and evidence for patient, unpaid caregiver and public involvement in research (user involvement) are growing. Consensus on how best to involve users in palliative care research is lacking. Aim: To determine an optimal user-involvement model for palliative care research. Design: We hosted a consultation workshop using expert presentations, discussion and nominal group technique to generate recommendations and consensus on agreement of importance. A total of 35 users and 32 researchers were approached to attend the workshop, which included break-out groups and a ranking exercise. Descriptive statistical analysis to establish consensus and highlight divergence was applied. Qualitative analysis of discussions was completed to aid interpretation of findings. Setting/participants: Participants involved in palliative care research were invited to a global research institute, UK. Results: A total of 12 users and 5 researchers participated. Users wanted their involvement to be more visible, including during dissemination, with a greater emphasis on the difference their involvement makes. Researchers wanted to improve productivity, relevance and quality through involvement. Users and researchers agreed that an optimal model should consist of (a) early involvement to ensure meaningful involvement and impact and (b) diverse virtual and face-to-face involvement methods to ensure flexibility. Conclusion: For involvement in palliative care research to succeed, early and flexible involvement is required. Researchers should advertise opportunities for involvement and promote impact of involvement via dissemination plans. Users should prioritise adding value to research through enhancing productivity, quality and relevance. More research is needed not only to inform implementation and ensure effectiveness but also to investigate the cost-effectiveness of involvement in palliative care research. PMID:25931336

  5. A rapid and low noise switch from RANS to WMLES on curvilinear grids with compressible flow solvers

    NASA Astrophysics Data System (ADS)

    Deck, Sébastien; Weiss, Pierre-Elie; Renard, Nicolas

    2018-06-01

    A turbulent inflow for a rapid and low noise switch from RANS to Wall-Modelled LES on curvilinear grids with compressible flow solvers is presented. It can be embedded within the computational domain in practical applications with WMLES grids around three-dimensional geometries in a flexible zonal hybrid RANS/LES modelling context. It relies on a physics-motivated combination of Zonal Detached Eddy Simulation (ZDES) as the WMLES technique together with a Dynamic Forcing method processing the fluctuations caused by a Zonal Immersed Boundary Condition describing roughness elements. The performance in generating a physically-sound turbulent flow field with the proper mean skin friction and turbulent profiles after a short relaxation length is equivalent to more common inflow methods thanks to the generation of large-scale streamwise vorticity by the roughness elements. Comparisons in a low Mach-number zero-pressure-gradient flat-plate turbulent boundary layer up to Reθ = 6 100 reveal that the pressure field is dominated by the spurious noise caused by the synthetic turbulence methods (Synthetic Eddy Method and White Noise injection), contrary to the new low-noise approach which may be used to obtain the low-frequency component of wall pressure and reproduce its intermittent nature. The robustness of the method is tested in the flow around a three-element airfoil with WMLES in the upper boundary layer near the trailing edge of the main element. In spite of the very short relaxation distance allowed, self-sustainable resolved turbulence is generated in the outer layer with significantly less spurious noise than with the approach involving White Noise. The ZDES grid count for this latter test case is more than two orders of magnitude lower than the Wall-Resolved LES requirement and a unique mesh is involved, which is much simpler than some multiple-mesh strategies devised for WMLES or turbulent inflow.

  6. AgMIP: Next Generation Models and Assessments

    NASA Astrophysics Data System (ADS)

    Rosenzweig, C.

    2014-12-01

    Next steps in developing next-generation crop models fall into several categories: significant improvements in simulation of important crop processes and responses to stress; extension from simplified crop models to complex cropping systems models; and scaling up from site-based models to landscape, national, continental, and global scales. Crop processes that require major leaps in understanding and simulation in order to narrow uncertainties around how crops will respond to changing atmospheric conditions include genetics; carbon, temperature, water, and nitrogen; ozone; and nutrition. The field of crop modeling has been built on a single crop-by-crop approach. It is now time to create a new paradigm, moving from 'crop' to 'cropping system.' A first step is to set up the simulation technology so that modelers can rapidly incorporate multiple crops within fields, and multiple crops over time. Then the response of these more complex cropping systems can be tested under different sustainable intensification management strategies utilizing the updated simulation environments. Model improvements for diseases, pests, and weeds include developing process-based models for important diseases, frameworks for coupling air-borne diseases to crop models, gathering significantly more data on crop impacts, and enabling the evaluation of pest management strategies. Most smallholder farming in the world involves integrated crop-livestock systems that cannot be represented by crop modeling alone. Thus, next-generation cropping system models need to include key linkages to livestock. Livestock linkages to be incorporated include growth and productivity models for grasslands and rangelands as well as the usual annual crops. There are several approaches for scaling up, including use of gridded models and development of simpler quasi-empirical models for landscape-scale analysis. 
On the assessment side, AgMIP is leading a community process for coordinated contributions to IPCC AR6 that involves the key modeling groups from around the world including North America, Europe, South America, Sub-Saharan Africa, South Asia, East Asia, and Australia and Oceania. This community process will lead to mutually agreed protocols for coordinated global and regional assessments.

  7. A hybrid deep learning approach to predict malignancy of breast lesions using mammograms

    NASA Astrophysics Data System (ADS)

    Wang, Yunzhi; Heidari, Morteza; Mirniaharikandehei, Seyedehnafiseh; Gong, Jing; Qian, Wei; Qiu, Yuchen; Zheng, Bin

    2018-03-01

    Applying deep learning technology to the medical imaging informatics field has recently attracted extensive research interest. However, the limited size of medical image datasets often reduces the performance and robustness of deep learning based computer-aided detection and/or diagnosis (CAD) schemes. In an attempt to address this technical challenge, this study aims to develop and evaluate a new hybrid deep learning based CAD approach to predict the likelihood that a breast lesion detected on a mammogram is malignant. In this approach, a deep Convolutional Neural Network (CNN) was first pre-trained using the ImageNet dataset and served as a feature extractor. A pseudo-color Region of Interest (ROI) method was used to generate ROIs with RGB channels from the mammographic images as input to the pre-trained deep network. The transferred CNN features from different layers of the CNN were then obtained and a linear support vector machine (SVM) was trained for the prediction task. Applied to a dataset of 301 suspicious breast lesions with a leave-one-case-out validation method, the areas under the ROC curves (AUC) were 0.762 and 0.792 for the traditional CAD scheme and the proposed deep learning based CAD scheme, respectively. An ensemble classifier that combines the classification scores generated by the two schemes yielded an improved AUC value of 0.813. The study results demonstrate the feasibility and potentially improved performance of applying a new hybrid deep learning approach to develop a CAD scheme using a relatively small dataset of medical images.
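    The score-level ensemble step lends itself to a compact illustration. The sketch below uses synthetic scores and labels (not the study's 301-lesion dataset) and a rank-sum formulation of the AUC; the equal weighting `w = 0.5` is an assumption, since the abstract does not state how the two schemes' scores were combined.

    ```python
    def auc(scores, labels):
        """Area under the ROC curve via the rank-sum (Mann-Whitney) formulation."""
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    def ensemble(scores_a, scores_b, w=0.5):
        """Combine two CAD schemes' malignancy scores by weighted averaging."""
        return [w * a + (1 - w) * b for a, b in zip(scores_a, scores_b)]

    # Synthetic example: labels (1 = malignant), scores from two hypothetical schemes.
    labels   = [1, 1, 1, 0, 0, 0]
    scheme_a = [0.9, 0.6, 0.4, 0.5, 0.3, 0.2]   # stand-in for the traditional CAD scheme
    scheme_b = [0.7, 0.8, 0.5, 0.4, 0.6, 0.1]   # stand-in for the deep-feature + SVM scheme
    combined = ensemble(scheme_a, scheme_b)
    print(round(auc(combined, labels), 3))  # → 0.889
    ```

    The same `auc` function would score each scheme individually, which is how the 0.762/0.792 versus 0.813 comparison in the abstract is made.
    
    
    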

  8. Identification of rare genetic variants in Italian patients with dementia by targeted gene sequencing.

    PubMed

    Bartoletti-Stella, Anna; Baiardi, Simone; Stanzani-Maserati, Michelangelo; Piras, Silvia; Caffarra, Paolo; Raggi, Alberto; Pantieri, Roberta; Baldassari, Sara; Caporali, Leonardo; Abu-Rumeileh, Samir; Linarello, Simona; Liguori, Rocco; Parchi, Piero; Capellari, Sabina

    2018-06-01

    Genetics is intricately involved in the etiology of neurodegenerative dementias. The incidence of monogenic dementia among all neurodegenerative forms is unknown due to the lack of systematic studies and of patient/clinician access to extensive diagnostic procedures. In this study, we conducted targeted sequencing in 246 clinically heterogeneous patients, mainly with early-onset and/or familial neurodegenerative dementia, using a custom-designed next-generation sequencing panel covering 27 genes known to harbor mutations that can cause different types of dementia, in addition to the detection of C9orf72 repeat expansions. Forty-nine patients (19.9%) carried known pathogenic or novel, likely pathogenic, variants, involving both common (presenilin 1, presenilin 2, C9orf72, and granulin) and rare (optineurin, serpin family I member 1 and protein kinase cyclic adenosine monophosphate (cAMP)-dependent type I regulatory subunit beta) dementia-associated genes. Our results support the use of extended next-generation sequencing panels as a quick, accurate, and cost-effective method for diagnosis in clinical practice. This approach could have a significant impact on the proportion of tested patients, especially among those with an early disease onset. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Systems Biology for Smart Crops and Agricultural Innovation: Filling the Gaps between Genotype and Phenotype for Complex Traits Linked with Robust Agricultural Productivity and Sustainability

    PubMed Central

    Pathak, Rajesh Kumar; Gupta, Sanjay Mohan; Gaur, Vikram Singh; Pandey, Dinesh

    2015-01-01

    In recent years, rapid developments in several omics platforms and next generation sequencing technology have generated a huge amount of biological data about plants. Systems biology aims to develop and use well-organized and efficient algorithms, data structures, visualization, and communication tools for the integration of these biological data, with the goal of computational modeling and simulation. It studies crop plant systems by systematically perturbing them; checking the gene, protein, and informational pathway responses; integrating these data; and finally, formulating mathematical models that describe the structure of the system and its response to individual perturbations. Consequently, systems biology approaches, such as integrative and predictive ones, hold immense potential for understanding the molecular mechanisms of agriculturally important complex traits linked to agricultural productivity. This has led to the identification of some key genes and proteins involved in networks of pathways underlying input use efficiency, biotic and abiotic stress resistance, photosynthesis efficiency, root, stem and leaf architecture, and nutrient mobilization. The developments in the above fields have made it possible to design smart crops with superior agronomic traits through genetic manipulation of key candidate genes. PMID:26484978

  10. Active Tube-Shaped Actuator with Embedded Square Rod-Shaped Ionic Polymer-Metal Composites for Robotic-Assisted Manipulation

    PubMed Central

    Liu, Jiayu; Zhu, Denglin; Chen, Hualing

    2018-01-01

    This paper reports a new technique involving the design, fabrication, and characterization of an ionic polymer-metal composite- (IPMC-) embedded active tube, which can achieve multidegree-of-freedom (MDOF) bending motions desirable in many applications, such as a manipulator and an active catheter. Traditional strip-type IPMC actuators, however, are limited to generating one-dimensional bending motion. In this paper, we therefore develop an approach that involves molding or integrating rod-shaped IPMC actuators into a soft silicone rubber structure to create an active tube. We modified the Nafion solution casting method and developed a complete fabrication process for rod-shaped IPMCs with square cross sections and four insulated electrodes on the surface. The silicone gel was cured at a suitable temperature to form a flexible tube using molds fabricated by 3D printing technology. By applying differential voltages to the four electrodes of each rod-shaped IPMC actuator, MDOF bending motions of the active tube can be generated. Experimental results show that such IPMC-embedded tube designs can be used for developing robotic-assisted manipulation. PMID:29770160
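    A toy sketch of the differential-drive idea: assuming, purely for illustration, a linear mapping from the four electrode voltages of one rod to a 2D bending vector (opposite electrode pairs drive orthogonal axes). The real electromechanical response of an IPMC is far more complex than this.

    ```python
    def bending_direction(v_north, v_south, v_east, v_west):
        """Toy mapping from the four electrode voltages of one rod-shaped IPMC
        to a 2D bending vector: differential drive on opposite electrode pairs."""
        return (v_east - v_west, v_north - v_south)

    # Equal differential on both pairs -> bending along the diagonal.
    print(bending_direction(1.5, 0.0, 1.5, 0.0))  # → (1.5, 1.5)
    ```
    
    
    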

  11. Active Tube-Shaped Actuator with Embedded Square Rod-Shaped Ionic Polymer-Metal Composites for Robotic-Assisted Manipulation.

    PubMed

    Wang, Yanjie; Liu, Jiayu; Zhu, Denglin; Chen, Hualing

    2018-01-01

    This paper reports a new technique involving the design, fabrication, and characterization of an ionic polymer-metal composite- (IPMC-) embedded active tube, which can achieve multidegree-of-freedom (MDOF) bending motions desirable in many applications, such as a manipulator and an active catheter. Traditional strip-type IPMC actuators, however, are limited to generating one-dimensional bending motion. In this paper, we therefore develop an approach that involves molding or integrating rod-shaped IPMC actuators into a soft silicone rubber structure to create an active tube. We modified the Nafion solution casting method and developed a complete fabrication process for rod-shaped IPMCs with square cross sections and four insulated electrodes on the surface. The silicone gel was cured at a suitable temperature to form a flexible tube using molds fabricated by 3D printing technology. By applying differential voltages to the four electrodes of each rod-shaped IPMC actuator, MDOF bending motions of the active tube can be generated. Experimental results show that such IPMC-embedded tube designs can be used for developing robotic-assisted manipulation.

  12. Legal approaches regarding health-care decisions involving minors: implications for next-generation sequencing

    PubMed Central

    Sénécal, Karine; Thys, Kristof; Vears, Danya F; Van Assche, Kristof; Knoppers, Bartha M; Borry, Pascal

    2016-01-01

    The development of next-generation sequencing (NGS) technologies is revolutionizing medical practice, facilitating more accurate, sophisticated and cost-effective genetic testing. NGS is already being implemented in the clinic, assisting diagnosis and management of disorders with a strong heritable component. Although considerable attention has been paid to issues regarding return of incidental or secondary findings, matters of consent are less well explored. This is particularly important for the use of NGS in minors. Recent guidelines addressing genomic testing and screening of children and adolescents have suggested that as ‘young children' lack decision-making capacity, decisions about testing must be conducted by a surrogate, namely their parents. This prompts consideration of the age at which minors can provide lawful consent to health-care interventions, and consequently NGS performed for diagnostic purposes. Here, we describe the existing legal approaches regarding the rights of minors to consent to health-care interventions, including how laws in the 28 Member States of the European Union and in Canada consider competent minors, and then apply this to the context of NGS. There is considerable variation in the rights afforded to minors across countries. Many legal systems determine that minors would be allowed, or may even be required, to make decisions about interventions such as NGS. However, minors are often considered as one single homogeneous population who always require parental consent, rather than recognizing that there are different categories of ‘minors' and that capacity to consent, or to be involved in discussions and the decision-making process, is a spectrum rather than a hurdle. PMID:27302841

  13. Legal approaches regarding health-care decisions involving minors: implications for next-generation sequencing.

    PubMed

    Sénécal, Karine; Thys, Kristof; Vears, Danya F; Van Assche, Kristof; Knoppers, Bartha M; Borry, Pascal

    2016-11-01

    The development of next-generation sequencing (NGS) technologies is revolutionizing medical practice, facilitating more accurate, sophisticated and cost-effective genetic testing. NGS is already being implemented in the clinic, assisting diagnosis and management of disorders with a strong heritable component. Although considerable attention has been paid to issues regarding return of incidental or secondary findings, matters of consent are less well explored. This is particularly important for the use of NGS in minors. Recent guidelines addressing genomic testing and screening of children and adolescents have suggested that as 'young children' lack decision-making capacity, decisions about testing must be conducted by a surrogate, namely their parents. This prompts consideration of the age at which minors can provide lawful consent to health-care interventions, and consequently NGS performed for diagnostic purposes. Here, we describe the existing legal approaches regarding the rights of minors to consent to health-care interventions, including how laws in the 28 Member States of the European Union and in Canada consider competent minors, and then apply this to the context of NGS. There is considerable variation in the rights afforded to minors across countries. Many legal systems determine that minors would be allowed, or may even be required, to make decisions about interventions such as NGS. However, minors are often considered as one single homogeneous population who always require parental consent, rather than recognizing that there are different categories of 'minors' and that capacity to consent, or to be involved in discussions and the decision-making process, is a spectrum rather than a hurdle.

  14. Quantification of CO2 generation in sedimentary basins through carbonate/clays reactions with uncertain thermodynamic parameters

    NASA Astrophysics Data System (ADS)

    Ceriotti, G.; Porta, G. M.; Geloni, C.; Dalla Rosa, M.; Guadagnini, A.

    2017-09-01

    We develop a methodological framework and mathematical formulation which yields estimates of the uncertainty associated with the amounts of CO2 generated by Carbonate-Clays Reactions (CCR) in large-scale subsurface systems, to assist characterization of the main features of this geochemical process. Our approach couples a one-dimensional compaction model, providing the dynamics of the evolution of porosity, temperature and pressure along the vertical direction, with a chemical model able to quantify the partial pressure of CO2 resulting from mineral and pore water interaction. The modeling framework we propose allows (i) estimating the depth at which the source of gases is located and (ii) quantifying the amount of CO2 generated, based on the mineralogy of the sediments involved in the basin formation process. A distinctive objective of the study is the quantification of the way the uncertainty affecting chemical equilibrium constants propagates to model outputs, i.e., the flux of CO2. These parameters are considered as key sources of uncertainty in our modeling approach because the temperature and pressure distributions associated with deep burial depths typically fall outside the range of validity of commonly employed geochemical databases and software. We also analyze the impact of the relative abundance of primary phases in the sediments on the activation of CCR processes. As a test bed, we consider a computational study where pressure and temperature conditions are representative of those observed in real sedimentary formations. Our results are conducive to the probabilistic assessment of (i) the characteristic pressure and temperature at which CCR leads to generation of CO2 in sedimentary systems, and (ii) the order of magnitude of the CO2 generation rate that can be associated with CCR processes.
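    The propagation of an uncertain equilibrium constant to the CO2 output can be sketched as a Monte Carlo loop. The forward model below is a deliberately trivial stand-in (the paper couples a 1D compaction model with a chemical equilibrium model); only the propagation pattern is illustrated, and every parameter value here is invented.

    ```python
    import math
    import random

    def pco2_from_logk(log_k, temperature_k):
        """Toy forward model: CO2 partial pressure (bar) from an equilibrium
        constant and temperature. Purely illustrative, not the paper's model."""
        return math.exp(log_k) * (temperature_k / 400.0)

    def propagate(log_k_mean, log_k_sd, temperature_k, n=20000, seed=1):
        """Monte Carlo propagation of an uncertain log K to the CO2 output:
        sample log K, push each sample through the forward model, summarize."""
        rng = random.Random(seed)
        samples = [pco2_from_logk(rng.gauss(log_k_mean, log_k_sd), temperature_k)
                   for _ in range(n)]
        mean = sum(samples) / n
        var = sum((s - mean) ** 2 for s in samples) / (n - 1)
        return mean, math.sqrt(var)

    mean, sd = propagate(log_k_mean=-2.0, log_k_sd=0.3, temperature_k=420.0)
    print(f"P_CO2 ≈ {mean:.3f} ± {sd:.3f} bar")
    ```

    With a lognormal output like this toy model, the analytic mean exp(μ + σ²/2)·(T/400) provides a sanity check on the sampled estimate.
    
    
    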

  15. Cirrus cloud model parameterizations: Incorporating realistic ice particle generation

    NASA Technical Reports Server (NTRS)

    Sassen, Kenneth; Dodd, G. C.; Starr, David OC.

    1990-01-01

    Recent cirrus cloud modeling studies have involved the application of a time-dependent, two-dimensional Eulerian model, with generalized cloud microphysical parameterizations drawn from experimental findings. For computing the ice versus vapor phase changes, the ice mass content is linked to the maintenance of a relative humidity with respect to ice (RHI) of 105 percent; ice growth occurs both through the introduction of new particles and the growth of existing particles. In a simplified cloud model designed to investigate the basic role of various physical processes in the growth and maintenance of cirrus clouds, these parametric relations are justifiable. In comparison, the one-dimensional cloud microphysical model recently applied to evaluating the nucleation and growth of ice crystals in cirrus clouds explicitly treated populations of haze and cloud droplets, and ice crystals. Although these two modeling approaches are clearly incompatible, the goal of the present numerical study is to develop a parametric treatment of new ice particle generation, on the basis of detailed microphysical model findings, for incorporation into improved cirrus growth models. One example is the relation between temperature and the relative humidity required to generate ice crystals from ammonium sulfate haze droplets, whose probability of freezing through the homogeneous nucleation mode is a combined function of time and droplet molality, volume, and temperature. As an example of this approach, the results of cloud microphysical simulations are presented showing the rather narrow domain in the temperature/humidity field where new ice crystals can be generated. The microphysical simulations point out the need for detailed CCN studies at cirrus altitudes and haze droplet measurements within cirrus clouds, and also suggest that a relatively simple treatment of ice particle generation, which includes cloud chemistry, can be incorporated into cirrus cloud growth models.

  16. 40 CFR 92.303 - General provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) Averaging involves the generation of credits by a manufacturer or remanufacturer for use by that same... applicable emission standard, subject to the provisions of this subpart. (e) Banking involves the generation...) Trading involves the sale of banked credits for use in certification of new locomotives and new locomotive...

  17. Synthesis and optimization of four bar mechanism with six design parameters

    NASA Astrophysics Data System (ADS)

    Jaiswal, Ankur; Jawale, H. P.

    2018-04-01

    Function generation is the synthesis of a mechanism for a specific task; it becomes particularly complex when synthesizing more than five precision points for the coupler, and it is then prone to large structural error. A methodology for arriving at a higher-precision solution is to use optimization techniques. The work presented herein considers methods for optimizing the structural error in a closed kinematic chain with a single degree of freedom, for generating functions such as log(x), e^x, tan(x), and sin(x) with five precision points. The Freudenstein-Chebyshev equation is used to develop the five-point synthesis of the mechanism. An extended formulation is proposed, and the results are verified against existing results in the literature. Optimization of the structural error is carried out using a least-squares approach. A comparative structural-error analysis is presented for the error optimized through the least-squares method and through the extended Freudenstein-Chebyshev method.
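    A minimal sketch of the least-squares step, assuming the common Freudenstein formulation K1·cos φ − K2·cos ψ + K3 = cos(φ − ψ) with Chebyshev-spaced precision points and linear mappings from x to the input angle φ and from f(x) to the output angle ψ. The angle ranges below are illustrative choices, not the paper's; with five points and three unknowns, the normal equations give the least-squares K and the residuals measure structural error.

    ```python
    import math

    def chebyshev_points(lo, hi, n):
        """Chebyshev spacing of n precision points on [lo, hi]."""
        return [0.5 * (lo + hi) - 0.5 * (hi - lo) * math.cos(math.pi * (2 * j - 1) / (2 * n))
                for j in range(1, n + 1)]

    def solve3(a, b):
        """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
        m = [row[:] + [rhs] for row, rhs in zip(a, b)]
        for col in range(3):
            piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
            m[col], m[piv] = m[piv], m[col]
            for r in range(col + 1, 3):
                f = m[r][col] / m[col][col]
                for c in range(col, 4):
                    m[r][c] -= f * m[col][c]
        x = [0.0] * 3
        for r in (2, 1, 0):
            x[r] = (m[r][3] - sum(m[r][c] * x[c] for c in range(r + 1, 3))) / m[r][r]
        return x

    def synthesize(func, x_range, phi_range, psi_range, n=5):
        """Least-squares fit of the Freudenstein coefficients K1, K2, K3 over
        n precision points, via the normal equations of the overdetermined system."""
        xs = chebyshev_points(*x_range, n)
        ys = [func(x) for x in xs]
        y_lo, y_hi = func(x_range[0]), func(x_range[1])
        phis = [phi_range[0] + (x - x_range[0]) / (x_range[1] - x_range[0])
                * (phi_range[1] - phi_range[0]) for x in xs]
        psis = [psi_range[0] + (y - y_lo) / (y_hi - y_lo)
                * (psi_range[1] - psi_range[0]) for y in ys]
        rows = [[math.cos(p), -math.cos(q), 1.0] for p, q in zip(phis, psis)]
        rhs = [math.cos(p - q) for p, q in zip(phis, psis)]
        ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
        atb = [sum(r[i] * v for r, v in zip(rows, rhs)) for i in range(3)]
        k = solve3(ata, atb)
        residuals = [sum(r[i] * k[i] for i in range(3)) - v for r, v in zip(rows, rhs)]
        return k, residuals

    # Illustrative synthesis for y = log(x) on [1, 2] with assumed crank-angle ranges.
    k, res = synthesize(math.log, (1.0, 2.0),
                        (math.radians(30), math.radians(120)),
                        (math.radians(40), math.radians(130)))
    print("K =", [round(v, 4) for v in k])
    print("max structural-error residual =", max(abs(e) for e in res))
    ```

    The link lengths of the four-bar follow from K1, K2, K3 in the usual way once a scale is fixed; the residuals are what the least-squares step minimizes.
    
    
    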

  18. Antihypertensive Properties of Plant-Based Prebiotics

    PubMed Central

    Yeo, Siok-Koon; Ooi, Lay-Gaik; Lim, Ting-Jin; Liong, Min-Tze

    2009-01-01

    Hypertension is one of the major risk factors for cardiovascular disease. Although various drugs for its treatment have been synthesized, the occurring side effects have generated the need for natural interventions for the treatment and prevention of hypertension. Dietary intervention such as the administration of prebiotics has been seen as a highly acceptable approach. Prebiotics are indigestible food ingredients that bypass digestion and reach the lower gut as substrates for indigenous microflora. Most of the prebiotics used as food adjuncts, such as inulin, fructooligosaccharides, dietary fiber and gums, are derived from plants. Experimental evidence from recent studies has suggested that prebiotics are capable of reducing and preventing hypertension. This paper will discuss some of the mechanisms involved, the evidence generated from both in-vitro experiments and in-vivo trials and some controversial findings that are raised. PMID:20111692

  19. Novel mutants of Erwinia carotovora subsp. carotovora defective in the production of plant cell wall degrading enzymes generated by Mu transpososome-mediated insertion mutagenesis.

    PubMed

    Laasik, Eve; Ojarand, Merli; Pajunen, Maria; Savilahti, Harri; Mäe, Andres

    2005-02-01

    As the regulation of the main virulence factors of Erwinia carotovora subsp. carotovora, the extracellular enzymes that degrade the plant cell wall, is only rudimentarily understood, we performed a genetic screen to identify novel candidate genes involved in the process. Initially, we used a Mu transpososome-mediated mutagenesis approach to generate a comprehensive transposon insertion mutant library of ca. 10000 clones, and screened the clones for the loss of extracellular enzyme production. Extracellular enzyme production was abolished by mutations in the chromosomal helEcc, trkAEcc, yheLEcc, glsEcc, igaAEcc and cysQEcc genes. The findings reported here demonstrate that we have isolated six new representatives belonging to the pool of genes that modulate the production of virulence factors in E. carotovora.

  20. Quantization of systems with temporally varying discretization. II. Local evolution moves

    NASA Astrophysics Data System (ADS)

    Höhn, Philipp A.

    2014-10-01

    Several quantum gravity approaches and field theory on an evolving lattice involve a discretization changing dynamics generated by evolution moves. Local evolution moves in variational discrete systems (1) are a generalization of the Pachner evolution moves of simplicial gravity models, (2) update only a small subset of the dynamical data, (3) change the number of kinematical and physical degrees of freedom, and (4) generate a dynamical (or canonical) coarse graining or refining of the underlying discretization. To systematically explore such local moves and their implications in the quantum theory, this article suitably expands the quantum formalism for global evolution moves, constructed in Paper I [P. A. Höhn, "Quantization of systems with temporally varying discretization. I. Evolving Hilbert spaces," J. Math. Phys. 55, 083508 (2014); e-print arXiv:1401.6062 [gr-qc

  1. Time- and Cost-Efficient Identification of T-DNA Insertion Sites through Targeted Genomic Sequencing

    PubMed Central

    Lepage, Étienne; Zampini, Éric; Boyle, Brian; Brisson, Normand

    2013-01-01

    Forward genetic screens enable the unbiased identification of genes involved in biological processes. In Arabidopsis, several mutant collections are publicly available, which greatly facilitates such practice. Most of these collections were generated by agrotransformation of a T-DNA at random sites in the plant genome. However, precise mapping of T-DNA insertion sites in mutants isolated from such screens is a laborious and time-consuming task. Here we report a simple, low-cost and time-efficient approach to precisely map T-DNA insertions simultaneously in many different mutants. By combining sequence capture, next-generation sequencing and 2D-PCR pooling, we developed a new method that allowed the rapid localization of T-DNA insertion sites in 55 out of 64 mutant plants isolated in a screen for gyrase inhibition hypersensitivity. PMID:23951038
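    The 2D pooling logic can be sketched abstractly: pooling an n × m plate by rows and columns requires only n + m pooled reactions instead of n·m individual ones, and a single positive well is recovered at the intersection of the positive row pool and the positive column pool. This is a schematic of the idea only, not the authors' protocol.

    ```python
    def pool_2d(grid):
        """Row and column pools of a plate: a pool is positive if any well in it is."""
        rows = [any(r) for r in grid]
        cols = [any(c) for c in zip(*grid)]
        return rows, cols

    def locate(rows, cols):
        """Candidate wells: intersections of positive row and column pools.
        Unambiguous only when a single row and a single column are positive."""
        return [(i, j) for i, hit_r in enumerate(rows) if hit_r
                for j, hit_c in enumerate(cols) if hit_c]

    # 8x8 plate with one well carrying the T-DNA junction of interest: 16 pools, not 64.
    plate = [[False] * 8 for _ in range(8)]
    plate[2][5] = True
    rows, cols = pool_2d(plate)
    print(locate(rows, cols))  # → [(2, 5)]
    ```

    With multiple positives per plate the intersections become ambiguous (two positives can yield four candidates), which is why pooled designs pair this step with confirmatory per-well PCR.
    
    
    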

  2. Pleiotrophin-induced endothelial cell migration is regulated by xanthine oxidase-mediated generation of reactive oxygen species.

    PubMed

    Tsirmoula, Sotiria; Lamprou, Margarita; Hatziapostolou, Maria; Kieffer, Nelly; Papadimitriou, Evangelia

    2015-03-01

    Pleiotrophin (PTN) is a heparin-binding growth factor that induces cell migration through binding to its receptor protein tyrosine phosphatase beta/zeta (RPTPβ/ζ) and integrin alpha v beta 3 (αvβ3). In the present work, we studied the effect of PTN on the generation of reactive oxygen species (ROS) in human endothelial cells and the involvement of ROS in PTN-induced cell migration. Exogenous PTN significantly increased ROS levels in a concentration- and time-dependent manner in both human endothelial and prostate cancer cells, while knockdown of endogenous PTN expression in prostate cancer cells significantly down-regulated ROS production. Suppression of RPTPβ/ζ through genetic and pharmacological approaches, or inhibition of c-src kinase activity, abolished PTN-induced ROS generation. A synthetic peptide that blocks the PTN-αvβ3 interaction abolished PTN-induced ROS generation, suggesting that αvβ3 is also involved. The latter was confirmed in CHO cells that do not express β3 or over-express wild-type β3 or mutant β3Y773F/Y785F. PTN increased ROS generation in cells expressing wild-type β3 but not in cells not expressing or expressing mutant β3. Phosphoinositide 3-kinase (PI3K) or ERK1/2 inhibition suppressed PTN-induced ROS production, suggesting that ROS production lies downstream of PI3K or ERK1/2 activation by PTN. Finally, ROS scavenging and xanthine oxidase inhibition completely abolished both PTN-induced ROS generation and cell migration, while NADPH oxidase inhibition had no effect. Collectively, these data suggest that xanthine oxidase-mediated ROS production is required for PTN-induced cell migration through the cell membrane functional complex of αvβ3 and RPTPβ/ζ and activation of the c-src, PI3K and ERK1/2 kinases. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Buckling Load Calculations of the Isotropic Shell A-8 Using a High-Fidelity Hierarchical Approach

    NASA Technical Reports Server (NTRS)

    Arbocz, Johann; Starnes, James H.

    2002-01-01

    As a step towards developing a new design philosophy, one that moves away from the traditional empirical approach used today in design towards a science-based design technology approach, a test series of 7 isotropic shells carried out by Arbocz and Babcock at Caltech is used. It is shown how the hierarchical approach to buckling load calculations proposed by Arbocz et al. can be used to perform an approach often called 'high fidelity analysis', where the uncertainties involved in a design are simulated by refined and accurate numerical methods. The Delft Interactive Shell DEsign COde (short, DISDECO) is employed for this hierarchical analysis to provide an accurate prediction of the critical buckling load of the given shell structure. This value is used later as a reference to establish the accuracy of the Level-3 buckling load predictions. As a final step in the hierarchical analysis approach, the critical buckling load and the estimated imperfection sensitivity of the shell are verified by conducting an analysis using a sufficiently refined finite element model with one of the current generation two-dimensional shell analysis codes with the advanced capabilities needed to represent both geometric and material nonlinearities.

  4. On a High-Fidelity Hierarchical Approach to Buckling Load Calculations

    NASA Technical Reports Server (NTRS)

    Arbocz, Johann; Starnes, James H.; Nemeth, Michael P.

    2001-01-01

    As a step towards developing a new design philosophy, one that moves away from the traditional empirical approach used today in design towards a science-based design technology approach, a recent test series of 5 composite shells carried out by Waters at NASA Langley Research Center is used. It is shown how the hierarchical approach to buckling load calculations proposed by Arbocz et al. can be used to perform an approach often called "high fidelity analysis", where the uncertainties involved in a design are simulated by refined and accurate numerical methods. The Delft Interactive Shell DEsign COde (short, DISDECO) is employed for this hierarchical analysis to provide an accurate prediction of the critical buckling load of the given shell structure. This value is used later as a reference to establish the accuracy of the Level-3 buckling load predictions. As a final step in the hierarchical analysis approach, the critical buckling load and the estimated imperfection sensitivity of the shell are verified by conducting an analysis using a sufficiently refined finite element model with one of the current generation two-dimensional shell analysis codes with the advanced capabilities needed to represent both geometric and material nonlinearities.

  5. Testing the TPF Interferometry Approach before Launch

    NASA Technical Reports Server (NTRS)

    Serabyn, Eugene; Mennesson, Bertrand

    2006-01-01

    One way to directly detect nearby extra-solar planets is via their thermal infrared emission, and with this goal in mind, both NASA and ESA are investigating cryogenic infrared interferometers. Common to both agencies' approaches to faint off-axis source detection near bright stars is the use of a rotating nulling interferometer, such as the Terrestrial Planet Finder interferometer (TPF-I), or Darwin. In this approach, the central star is nulled, while the emission from off-axis sources is transmitted and modulated by the rotation of the off-axis fringes. Because of the high contrasts involved, and the novelty of the measurement technique, it is essential to gain experience with this technique before launch. Here we describe a simple ground-based experiment that can test the essential aspects of the TPF signal measurement and image reconstruction approaches by generating a rotating interferometric baseline within the pupil of a large single-aperture telescope. This approach can mimic potential space-based interferometric configurations, and allow the extraction of signals from off-axis sources using the same algorithms proposed for the space-based missions. This approach should thus allow for testing of the applicability of proposed signal extraction algorithms for the detection of single and multiple near-neighbor companions...

  6. Precision Muon Physics

    NASA Astrophysics Data System (ADS)

    Hertzog, David

    2013-04-01

    The worldwide, vibrant experimental program involving precision measurements with muons will be presented. Recent achievements in this field have greatly improved our knowledge of fundamental parameters: the Fermi constant (lifetime), the weak-nucleon pseudoscalar coupling (μp capture), the Michel decay parameters, and the proton charge radius (Lamb shift). The charged-lepton-flavor-violating decay μ->eγ sets new physics limits. Updated Standard Model theory evaluations of the muon anomalous magnetic moment have increased the significance of the deviation with respect to experiment beyond 3 σ. Next-generation experiments are mounting, with ambitious sensitivity goals approaching 10^-17 for the muon-to-electron conversion search and a 0.14 ppm determination of g-2. The broad physics reach of these efforts involves the atomic, nuclear and particle physics communities. I will select from recent work and outline the most important efforts that are in preparation.

  7. Analysis of enamel development using murine model systems: approaches and limitations

    PubMed Central

    Pugach, Megan K.; Gibson, Carolyn W.

    2014-01-01

    A primary goal of enamel research is to understand and potentially treat or prevent enamel defects related to amelogenesis imperfecta (AI). Rodents are ideal models to assist our understanding of how enamel is formed because they are easily genetically modified, and their continuously erupting incisors display all stages of enamel development and mineralization. While numerous methods have been developed to generate and analyze genetically modified rodent enamel, it is crucial to understand the limitations and challenges associated with these methods in order to draw appropriate conclusions that can be applied translationally to AI patient care. We have highlighted methods involved in generating and analyzing rodent enamel and potential approaches to overcoming limitations of these methods: (1) generating transgenic, knockout, and knockin mouse models, and (2) analyzing rodent enamel mineral density and functional properties (structure and mechanics) of mature enamel. There is a need for a standardized workflow to analyze enamel phenotypes in rodent models so that investigators can compare data from different studies. These methods include analyses of gene and protein expression, histology of developing enamel, enamel pigment, degree of mineralization, enamel structure, and mechanical properties. Standardization of these methods with regard to stage of enamel development and sample preparation is crucial, and ideally investigators can use correlative and complementary techniques with the understanding that developing mouse enamel is dynamic and complex. PMID:25278900

  8. Identifying and understanding the concerns of business: a systematic approach to the development of the Australian WorkHealth Program - Arthritis.

    PubMed

    Reavley, Nicola; Livingston, Jenni; Buchbinder, Rachelle; Osborne, Richard

    2012-07-01

    The aim of the Australian WorkHealth Program - Arthritis was to develop and test an education program designed to minimise risk of arthritis and prevent or reduce absenteeism and presenteeism. The objective of the current study was to use a wide-ranging, multifaceted and interactive approach to engage with stakeholders in order to inform the content and delivery of the intervention. Methods used to inform program design included a concept mapping workshop, interviews, surveys, a steering committee and an industry advisory group. Engaging with a wide range of stakeholders in multiple ways early in program development allowed for the comparison and verification of data to obtain a better overall picture of the needs of participants. It also offered the opportunity to share 'ownership' of the program with stakeholders by generating a program that was tailored to their ethos and needs. The stakeholder engagement process was instrumental in building commitment to the program and establishing an overarching model of action. Interview and survey data indicated that awareness of arthritis was low and musculoskeletal disorders more generally were of greater concern. It was agreed that programs should be relevant, evidence-based, involve senior management education, incorporate a business case, and involve tailored implementation and marketing strategies. The qualitative preparatory phase as well as all the engagement work was key to informing program design. The approach taken in this study has the potential to inform a wide range of workplace interventions. Engaging with a wide range of stakeholders in multiple ways from program inception allowed for the comparison and verification of information to permit the generation of a model of intervention that had the highest possible chance of success. It offered the opportunity to not only define program content and implementation processes, but to build genuine 'ownership' of the program.

  9. Intergenerational Relationship Quality, Gender, and Grandparent Involvement

    ERIC Educational Resources Information Center

    Barnett, Melissa A.; Scaramella, Laura V.; Neppl, Tricia K.; Ontai, Lenna; Conger, Rand D.

    2010-01-01

    This prospective, intergenerational study (N = 181) considered how parent (G1, Generation 1) and child (G2, Generation 2) relationship quality during adolescence and adulthood is associated with G1's level of involvement with their 3- to 4-year-old grandchildren (G3, Generation 3). Path model analyses indicated different patterns of results for…

  10. Mapping the Views of Adolescent Health Stakeholders.

    PubMed

    Ewan, Lindsay A; McLinden, Daniel; Biro, Frank; DeJonckheere, Melissa; Vaughn, Lisa M

    2016-01-01

    Health research that includes youth and family stakeholders increases the contextual relevance of findings, which can benefit both the researchers and stakeholders involved. The goal of this study was to identify youth and family adolescent health priorities and to explore strategies to address these concerns. Stakeholders identified important adolescent health concerns, perceptions of which were then explored using concept mapping. Concept mapping is a mixed-method participatory research approach that invites input from various stakeholders. In response to prompts, stakeholders suggested ways to address the identified health conditions. Adolescent participants then sorted the statements into groups based on content similarity and rated the statements for importance and feasibility. Multidimensional scaling and cluster analysis were then applied to create the concept maps. Stakeholders identified sexually transmitted infections (STIs) and obesity as the health conditions they considered most important. The concept map for STIs identified 7 clusters: General sex education, support and empowerment, testing and treatment, community involvement and awareness, prevention and protection, parental involvement in sex education, and media. The obesity concept map portrayed 8 clusters: Healthy food choices, obesity education, support systems, clinical and community involvement, community support for exercise, physical activity, nutrition support, and nutrition education. Ratings were generally higher for importance than for feasibility. The concept maps demonstrate stakeholder-driven ideas about approaches to target STIs and obesity in this context. Strategies at multiple social ecological levels were emphasized. The concept maps can be used to generate discussion regarding these topics and to identify interventions. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
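
    The analysis pipeline named in this record (multidimensional scaling followed by cluster analysis of sorted statements) can be sketched as follows. The sort data, the use of classical (Torgerson) MDS, and the choice of three clusters are assumptions for illustration, not the study's actual data or settings.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Participants each sort statements into piles; co-occurrence counts become a
# similarity matrix, classical MDS places statements in 2-D, and hierarchical
# clustering groups them into candidate clusters.  Sort data are invented.
sorts = [
    [0, 0, 1, 1, 2, 2],   # participant 1: statement index -> pile id
    [0, 0, 0, 1, 2, 2],   # participant 2
    [0, 1, 1, 1, 2, 2],   # participant 3
]
n = 6
sim = np.zeros((n, n))
for piles in sorts:
    for i in range(n):
        for j in range(n):
            sim[i, j] += piles[i] == piles[j]
dist = len(sorts) - sim                  # dissimilarity: how often sorted apart

# classical (Torgerson) MDS to 2 dimensions
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (dist ** 2) @ J           # double-centred squared distances
vals, vecs = np.linalg.eigh(B)           # eigenvalues in ascending order
coords = vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0.0))

# hierarchical clustering of the 2-D map into three clusters
labels = fcluster(linkage(coords, method="ward"), t=3, criterion="maxclust")
```

    In the study itself, cluster counts and labels were chosen and interpreted with the stakeholders; the code only shows the mechanical core of the method.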

  11. Viewing Generativity and Social Capital as Underlying Factors of Parent Involvement

    ERIC Educational Resources Information Center

    Stevens, Sharon; Patel, Nimisha

    2015-01-01

    Parent involvement in education is a multifaceted support that has many well-documented benefits for students of all ages. Parent involvement is also a common expression of generativity as defined in Erik Erikson's theory of psychosocial development. The activities parents engage in during their children's educational pursuits, as well as their…

  12. Abiotic Stress Responses and Microbe-Mediated Mitigation in Plants: The Omics Strategies

    PubMed Central

    Meena, Kamlesh K.; Sorty, Ajay M.; Bitla, Utkarsh M.; Choudhary, Khushboo; Gupta, Priyanka; Pareek, Ashwani; Singh, Dhananjaya P.; Prabha, Ratna; Sahu, Pramod K.; Gupta, Vijai K.; Singh, Harikesh B.; Krishanani, Kishor K.; Minhas, Paramjit S.

    2017-01-01

    Abiotic stresses are the foremost limiting factors for agricultural productivity. Crop plants need to cope with the adverse external pressure created by environmental and edaphic conditions with their intrinsic biological mechanisms, failing which their growth, development, and productivity suffer. Microorganisms, the most natural inhabitants of diverse environments, exhibit enormous metabolic capabilities to mitigate abiotic stresses. Since microbial interactions with plants are an integral part of the living ecosystem, they are believed to be the natural partners that modulate local and systemic mechanisms in plants to offer defense under adverse external conditions. Plant-microbe interactions comprise complex mechanisms within the plant cellular system. Biochemical, molecular and physiological studies are paving the way in understanding the complex but integrated cellular processes. Under the continuous pressure of increasing climatic alterations, it now becomes more imperative to define and interpret plant-microbe relationships in terms of protection against abiotic stresses. At the same time, it also becomes essential to generate deeper insights into the stress-mitigating mechanisms in crop plants for their translation into higher productivity. Multi-omics approaches comprising genomics, transcriptomics, proteomics, metabolomics and phenomics integrate studies on the interaction of plants with microbes and their external environment and generate multi-layered information that can answer what is happening in real time within the cells. Integration, analysis and deciphering of these big data can lead to outcomes with a significant chance of implementation in the field. This review summarizes abiotic stress responses in plants in terms of biochemical and molecular mechanisms, followed by the microbe-mediated stress mitigation phenomenon. We describe the role of multi-omics approaches in generating multi-pronged information to provide a better understanding of plant–microbe interactions that modulate cellular mechanisms in plants under extreme external conditions and help to mitigate abiotic stresses. Vigilant amalgamation of these high-throughput approaches supports a higher level of knowledge generation about root-level mechanisms involved in the alleviation of abiotic stresses in organisms. PMID:28232845

  14. Nanotools for Neuroscience and Brain Activity Mapping

    PubMed Central

    Alivisatos, A. Paul; Andrews, Anne M.; Boyden, Edward S.; Chun, Miyoung; Church, George M.; Deisseroth, Karl; Donoghue, John P.; Fraser, Scott E.; Lippincott-Schwartz, Jennifer; Looger, Loren L.; Masmanidis, Sotiris; McEuen, Paul L.; Nurmikko, Arto V.; Park, Hongkun; Peterka, Darcy S.; Reid, Clay; Roukes, Michael L.; Scherer, Axel; Schnitzer, Mark; Sejnowski, Terrence J.; Shepard, Kenneth L.; Tsao, Doris; Turrigiano, Gina; Weiss, Paul S.; Xu, Chris; Yuste, Rafael; Zhuang, Xiaowei

    2013-01-01

    Neuroscience is at a crossroads. Great effort is being invested into deciphering specific neural interactions and circuits. At the same time, there exist few general theories or principles that explain brain function. We attribute this disparity, in part, to limitations in current methodologies. Traditional neurophysiological approaches record the activities of one neuron or a few neurons at a time. Neurochemical approaches focus on single neurotransmitters. Yet, there is an increasing realization that neural circuits operate at emergent levels, where the interactions between hundreds or thousands of neurons, utilizing multiple chemical transmitters, generate functional states. Brains function at the nanoscale, so tools to study brains must ultimately operate at this scale, as well. Nanoscience and nanotechnology are poised to provide a rich toolkit of novel methods to explore brain function by enabling simultaneous measurement and manipulation of activity of thousands or even millions of neurons. We and others refer to this goal as the Brain Activity Mapping Project. In this Nano Focus, we discuss how recent developments in nanoscale analysis tools and in the design and synthesis of nanomaterials have generated optical, electrical, and chemical methods that can readily be adapted for use in neuroscience. These approaches represent exciting areas of technical development and research. Moreover, unique opportunities exist for nanoscientists, nanotechnologists, and other physical scientists and engineers to contribute to tackling the challenging problems involved in understanding the fundamentals of brain function. PMID:23514423

  15. An Open Source Simulation Model for Soil and Sediment Bioturbation

    PubMed Central

    Schiffers, Katja; Teal, Lorna Rachel; Travis, Justin Mark John; Solan, Martin

    2011-01-01

    Bioturbation is one of the most widespread forms of ecological engineering and has significant implications for the structure and functioning of ecosystems, yet our understanding of the processes involved in biotic mixing remains incomplete. One reason is that, despite their value and utility, most mathematical models currently applied to bioturbation data tend to neglect aspects of the natural complexity of bioturbation in favour of mathematical simplicity. At the same time, the abstract nature of these approaches restricts their use to a limited range of users. Here, we contend that a movement towards process-based modelling can improve both the representation of the mechanistic basis of bioturbation and the intuitiveness of modelling approaches. In support of this initiative, we present an open source modelling framework that explicitly simulates particle displacement, together with a worked example to facilitate application and further development. The framework combines the advantages of rule-based lattice models with the application of parameterisable probability density functions to generate mixing on the lattice. Model parameters can be fitted to experimental data and describe particle displacement at the spatial and temporal scales at which bioturbation data are routinely collected. By using the same model structure across species, but generating species-specific parameters, a generic understanding of species-specific bioturbation behaviour can be achieved. An application to a case study and a comparison with a commonly used model attest to the predictive power of the approach. PMID:22162997
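
    A minimal sketch of the general idea, not the published framework: tracer particles on a one-dimensional sediment lattice are displaced each time step by draws from a parameterisable displacement distribution, here a Gaussian whose standard deviation stands in for a species-specific mixing parameter.

```python
import random

# Each time step, every tracer particle's depth changes by a Gaussian draw;
# abs() reflects particles at the sediment surface (depth 0), and min() caps
# them at the bottom of the modelled column.  All parameters are assumed.
def mix(depths, steps, sigma, max_depth, rng):
    for _ in range(steps):
        depths = [min(max_depth, abs(z + rng.gauss(0.0, sigma))) for z in depths]
    return depths

rng = random.Random(42)
start = [0.0] * 1000                 # tracer layer deposited at the surface
mixed = mix(start, steps=50, sigma=0.2, max_depth=10.0, rng=rng)
mean_depth = sum(mixed) / len(mixed)
```

    Fitting `sigma` (and, in richer versions, the shape of the displacement distribution) to observed tracer profiles is the species-specific parameterisation step the abstract describes.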

  17. Waste Management in the Circular Economy. The Case of Romania.

    NASA Astrophysics Data System (ADS)

    Iuga, Anca N.

    2016-11-01

    Applying the principles of sustainable development in Romania involves a new approach to waste management, one that uses the basic concepts of the circular economy to weigh proposed projects in this area accurately, taking into account existing environmental resources and zero-waste objectives. The paper is focused on: quantitative and qualitative measures of waste prevention in Romania, the changing status of waste when it is sold as a product, mechanisms for paying for treatment and/or disposal that discourage waste generation, and the use of financial resources obtained from secondary raw materials to improve the efficiency of waste management.

  18. The Application of Nanoparticles in Gene Therapy and Magnetic Resonance Imaging

    PubMed Central

    HERRANZ, FERNANDO; ALMARZA, ELENA; RODRÍGUEZ, IGNACIO; SALINAS, BEATRIZ; ROSELL, YAMILKA; DESCO, MANUEL; BULTE, JEFF W.; RUIZ-CABELLO, JESÚS

    2012-01-01

    The combination of nanoparticles, gene therapy, and medical imaging has given rise to a new field known as gene theranostics, in which a nanobioconjugate is used to diagnose and treat the disease. The process generally involves binding between a vector carrying the genetic information and a nanoparticle, which provides the signal for imaging. The synthesis of this probe generates a synergic effect, enhancing the efficiency of gene transduction and imaging contrast. We discuss the latest approaches in the synthesis of nanoparticles for magnetic resonance imaging, gene therapy strategies, and their conjugation and in vivo application. PMID:21484943

  19. Perspective: Proteomic approach to detect biomarkers of human growth hormone

    PubMed Central

    Ding, Juan; List, Edward O.; Okada, Shigeru; Kopchick, John J.

    2009-01-01

    Several serum biomarkers for recombinant human growth hormone (rhGH) have been established; however, none, alone or in combination, has generated a specific, sensitive, and reproducible 'kit' for the detection of rhGH abuse. Thus, the search for additional GH-specific biomarkers continues. In this review, we focus on the use of proteomics in general and 2-dimensional electrophoresis (2-DE) in particular for the discovery of new GH-induced serum biomarkers. Also, we review some of the protocols involved in 2-DE. Finally, the possibility of using tissues other than blood for biomarker discovery is discussed. PMID:19501004

  20. 2D spatially controlled polymer micro patterning for cellular behavior studies

    NASA Astrophysics Data System (ADS)

    Dinca, V.; Palla-Papavlu, A.; Paraico, I.; Lippert, T.; Wokaun, A.; Dinescu, M.

    2011-04-01

    A simple and effective method to functionalize glass surfaces that enables polymer micropatterning and subsequent spatially controlled adhesion of cells is reported in this paper. The method involves the application of laser-induced forward transfer (LIFT) to achieve polymer patterning in a single step onto cell-repellent substrates (i.e. polyethylene glycol (PEG)). This approach was used to produce micron-sized polyethyleneimine (PEI) patterns alternating with cell-repellent areas. The focus of this work is the ability of SH-SY5Y human neuroblastoma cells to orient, migrate, and produce organized cellular arrangements on laser-generated PEI patterns.

  1. A new scanning electron microscopy approach to image aerogels at the nanoscale

    NASA Astrophysics Data System (ADS)

    Solá, F.; Hurwitz, F.; Yang, J.

    2011-04-01

    A new scanning electron microscopy (SEM) technique to image poorly electrically conductive aerogels is presented. The process can be performed by non-expert SEM users. We show that negative charging effects on aerogels can be minimized significantly by introducing dry nitrogen gas close to the region of interest. The process involves the local recombination of accumulated negative charges with positive ions generated by ionization processes. This new technique made possible the acquisition of images of aerogels with pores down to approximately 3 nm in diameter using a positively biased Everhart-Thornley (ET) detector.

  2. Child disaster mental health interventions, part II

    PubMed Central

    Pfefferbaum, Betty; Sweeton, Jennifer L.; Newman, Elana; Varma, Vandana; Noffsinger, Mary A.; Shaw, Jon A.; Chrisman, Allan K.; Nitiéma, Pascal

    2014-01-01

    This review summarizes current knowledge on the timing of child disaster mental health intervention delivery, the settings for intervention delivery, the expertise of providers, and therapeutic approaches. Studies have been conducted on interventions delivered during all phases of disaster management, from pre-event through many months post-event. Many interventions were administered in schools, which offer access to large numbers of children. Providers included mental health professionals and school personnel. Studies described individual and group interventions, some with parent involvement. The next generation of interventions and studies should be based on an empirical analysis of a number of key areas. PMID:26295009

  3. KSC-04PD-1996

    NASA Technical Reports Server (NTRS)

    2004-01-01

    KENNEDY SPACE CENTER, FLA. Astronaut Leland Melvin involves students at Ronald E. McNair High School in Atlanta, a NASA Explorer School, during a presentation. He accompanied KSC Deputy Director Dr. Woodrow Whitlow Jr., who was visiting the school to share the Vision for Space Exploration with the next generation of explorers. Whitlow talked with students about our destiny as explorers, NASA's stepping-stone approach to exploring Earth, the Moon, Mars and beyond, how space impacts our lives, and how people and machines rely on each other in space. Melvin talked about the importance of teamwork and what it takes for mission success.

  4. Current state of cartilage tissue engineering

    PubMed Central

    Tuli, Richard; Li, Wan-Ju; Tuan, Rocky S

    2003-01-01

    Damage to cartilage is of great clinical consequence given the tissue's limited intrinsic potential for healing. Current treatments for cartilage repair are less than satisfactory, and rarely restore full function or return the tissue to its native normal state. The rapidly emerging field of tissue engineering holds great promise for the generation of functional cartilage tissue substitutes. The general approach involves a biocompatible, structurally and mechanically sound scaffold, with an appropriate cell source, which is loaded with bioactive molecules that promote cellular differentiation and/or maturation. This review highlights aspects of current progress in cartilage tissue engineering. PMID:12932283

  5. A long PCR–based approach for DNA enrichment prior to next-generation sequencing for systematic studies1

    PubMed Central

    Uribe-Convers, Simon; Duke, Justin R.; Moore, Michael J.; Tank, David C.

    2014-01-01

    • Premise of the study: We present an alternative approach for molecular systematic studies that combines long PCR and next-generation sequencing. Our approach can be used to generate templates from any DNA source for next-generation sequencing. Here we test our approach by amplifying complete chloroplast genomes, and we present a set of 58 potentially universal primers for angiosperms to do so. Additionally, this approach is likely to be particularly useful for nuclear and mitochondrial regions. • Methods and Results: Chloroplast genomes of 30 species across angiosperms were amplified to test our approach. Amplification success varied depending on whether PCR conditions were optimized for a given taxon. To further test our approach, some amplicons were sequenced on an Illumina HiSeq 2000. • Conclusions: Although here we tested this approach by sequencing plastomes, long PCR amplicons could be generated using DNA from any genome, expanding the possibilities of this approach for molecular systematic studies. PMID:25202592

  6. Taxi drivers' views on risky driving behavior in Tehran: a qualitative study using a social marketing approach.

    PubMed

    Shams, Mohsen; Shojaeizadeh, Davoud; Majdzadeh, Reza; Rashidian, Arash; Montazeri, Ali

    2011-05-01

    The use of the social marketing approach for public health issues is increasing. This approach uses marketing concepts borrowed from the principles of commercial marketing to promote beneficial health behaviors. In this qualitative study, four focus groups involving 42 participants were used in consumer research to explore taxi drivers' views on the driving situation and the determinants of risky driving behaviors in Tehran, as well as to gather their ideas for developing a social marketing program to reduce risky driving behaviors among taxi drivers in Tehran, Iran. Participants were asked to respond to questions that would guide the development of a marketing mix, or four Ps (product, price, place and promotion). The discussions determined that the program product should involve avoiding risky driving behaviors through increased attention to driving. They pointed out that developing and communicating with a well-designed persuasive message meant to draw their attention to driving could affect their driving behaviors. In addition, participants identified price, place and promotion strategies. They offered suggestions for marketing nonrisky driving to the target audience. The focus group discussions generated important insights into the values and the motivations that affect consumers' decisions to adopt the product. The focus group guided the development of a social marketing program to reduce risky driving behaviors in taxi drivers in Tehran, Iran. Copyright © 2010 Elsevier Ltd. All rights reserved.

  7. Improving Upon String Methods for Transition State Discovery.

    PubMed

    Chaffey-Millar, Hugh; Nikodem, Astrid; Matveev, Alexei V; Krüger, Sven; Rösch, Notker

    2012-02-14

    Transition state discovery via application of string methods has been researched on two fronts. The first front involves development of a new string method, named the Searching String method, while the second one aims at estimating transition states from a discretized reaction path. The Searching String method has been benchmarked against a number of previously existing string methods and the Nudged Elastic Band method. The developed methods have led to a reduction in the number of gradient calls required to optimize a transition state, as compared to existing methods. The Searching String method reported here places new beads on a reaction pathway at the midpoint between existing beads, such that the resolution of the path discretization in the region containing the transition state grows exponentially with the number of beads. This approach leads to favorable convergence behavior and generates more accurate estimates of transition states from which convergence to the final transition states occurs more readily. Several techniques for generating improved estimates of transition states from a converged string or nudged elastic band have been developed and benchmarked on 13 chemical test cases. Optimization approaches for string methods, and pitfalls therein, are discussed.
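
    The midpoint bead-placement idea described in this record can be illustrated on a toy one-dimensional surface; the energy function and insertion rule below are simplified assumptions, not the published algorithm.

```python
# Starting from the two path endpoints, repeatedly insert a new bead at the
# midpoint of the segment whose midpoint has the highest energy, so the path
# discretization refines fastest around the barrier top.
def energy(x):
    return -(x - 0.7) ** 2          # toy surface with its maximum at x = 0.7

def refine(path, n_new):
    path = sorted(path)
    for _ in range(n_new):
        mids = [(a + b) / 2 for a, b in zip(path, path[1:])]
        best = max(range(len(mids)), key=lambda i: energy(mids[i]))
        path.insert(best + 1, mids[best])
    return path

path = refine([0.0, 1.0], n_new=6)
# bead spacing shrinks fastest near the barrier at x = 0.7
```

    After six insertions the smallest segments bracket x = 0.7, giving a refined estimate of the transition state from which a subsequent optimization would converge more readily.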

  8. Meaningful questions: The acquisition of auxiliary inversion in a connectionist model of sentence production.

    PubMed

    Fitz, Hartmut; Chang, Franklin

    2017-09-01

    Nativist theories have argued that language involves syntactic principles which are unlearnable from the input children receive. A paradigm case of these innate principles is the structure dependence of auxiliary inversion in complex polar questions (Chomsky, 1968, 1975, 1980). Computational approaches have focused on the properties of the input in explaining how children acquire these questions. In contrast, we argue that messages are structured in a way that supports structure dependence in syntax. We demonstrate this approach within a connectionist model of sentence production (Chang, 2009) which learned to generate a range of complex polar questions from a structured message without positive exemplars in the input. The model also generated different types of error in development that were similar in magnitude to those in children (e.g., auxiliary doubling, Ambridge, Rowland, & Pine, 2008; Crain & Nakayama, 1987). Through model comparisons we trace how meaning constraints and linguistic experience interact during the acquisition of auxiliary inversion. Our results suggest that auxiliary inversion rules in English can be acquired without innate syntactic principles, as long as it is assumed that speakers who ask complex questions express messages that are structured into multiple propositions. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Challenges in engineering osteochondral tissue grafts with hierarchical structures.

    PubMed

    Gadjanski, Ivana; Vunjak-Novakovic, Gordana

    2015-01-01

    A major hurdle in treating osteochondral (OC) defects is the different healing abilities of the two tissues involved: articular cartilage and subchondral bone. Biomimetic approaches to OC-construct engineering, based on recapitulation of the biological principles of tissue development and regeneration, have the potential to provide new treatments and advance fundamental studies of OC tissue repair. This review of the state of the art in hierarchical OC tissue graft engineering is focused on tissue engineering approaches designed to recapitulate the native milieu of cartilage and bone development. These biomimetic systems are discussed with relevance to bioreactor cultivation of clinically sized, anatomically shaped human cartilage/bone constructs with physiologic stratification and mechanical properties. The utility of engineered OC tissue constructs is evaluated for their use as grafts in regenerative medicine and as high-fidelity models in biological research. A major challenge in engineering OC tissues is to generate a functionally integrated, stratified cartilage-bone structure starting from a single population of mesenchymal cells, while incorporating perfusable vasculature into the bone and at the bone-cartilage interface. To this end, new generations of advanced scaffolds and bioreactors, implementation of mechanical loading regimens, and harnessing of the inflammatory responses of the host will likely drive further progress.

  10. Development of spectral indices for roofing material condition status detection using field spectroscopy and WorldView-3 data

    NASA Astrophysics Data System (ADS)

    Samsudin, Sarah Hanim; Shafri, Helmi Z. M.; Hamedianfar, Alireza

    2016-04-01

    Status observations of roofing material degradation are constantly evolving due to urban feature heterogeneities. Although advanced classification techniques have been introduced to improve within-class impervious surface classifications, these techniques involve complex processing and high computation times. This study integrates field spectroscopy and satellite multispectral remote sensing data to generate degradation status maps of concrete and metal roofing materials. Field spectroscopy data were used as bases for selecting suitable bands for spectral index development because of the limited number of multispectral bands. Mapping methods for roof degradation status were established for metal and concrete roofing materials by developing the normalized difference concrete condition index (NDCCI) and the normalized difference metal condition index (NDMCI). Results indicate that the accuracies achieved using the spectral indices are higher than those obtained using supervised pixel-based classification. The NDCCI generated an accuracy of 84.44%, whereas the support vector machine (SVM) approach yielded an accuracy of 73.06%. The NDMCI obtained an accuracy of 94.17% compared with 62.5% for the SVM approach. These findings support the suitability of the developed spectral index methods for determining roof degradation statuses from satellite observations in heterogeneous urban environments.
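    The NDCCI and NDMCI follow the familiar normalized-difference form used throughout remote sensing. A minimal sketch, assuming hypothetical reflectance values; the actual WorldView-3 band pairs were selected from the field-spectroscopy analysis and are not reproduced here:

```python
def normalized_difference(band_a, band_b):
    """Generic normalized-difference spectral index: (a - b) / (a + b).

    Both NDCCI and NDMCI are indices of this form; the specific band
    pairs come from the study's field-spectroscopy band selection."""
    return (band_a - band_b) / (band_a + band_b)

# Hypothetical per-pixel reflectances for two bands (illustrative only).
new_roof = normalized_difference(0.62, 0.31)
degraded_roof = normalized_difference(0.40, 0.38)
```

    Index values range over [-1, 1]; condition classes are separated by thresholding the index, with the study's thresholds for concrete and metal roofs replacing the illustrative comparison above.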

  11. Use of mRNA expression signatures to discover small molecule inhibitors of skeletal muscle atrophy

    PubMed Central

    Adams, Christopher M.; Ebert, Scott M.; Dyle, Michael C.

    2017-01-01

    Purpose of review Here, we discuss a recently developed experimental strategy for discovering small molecules with potential to prevent and treat skeletal muscle atrophy. Recent findings Muscle atrophy involves and requires widespread changes in skeletal muscle gene expression, which generate complex but measurable patterns of positive and negative changes in skeletal muscle mRNA levels (a.k.a. mRNA expression signatures of muscle atrophy). Many bioactive small molecules generate their own characteristic mRNA expression signatures, and by identifying small molecules whose signatures approximate mirror images of muscle atrophy signatures, one may identify small molecules with potential to prevent and/or reverse muscle atrophy. Unlike a conventional drug discovery approach, this strategy does not rely on a predefined molecular target but rather exploits the complexity of muscle atrophy to identify small molecules that counter the entire spectrum of pathological changes in atrophic muscle. We discuss how this strategy has been used to identify two natural compounds, ursolic acid and tomatidine, that reduce muscle atrophy and improve skeletal muscle function. Summary Discovery strategies based on mRNA expression signatures can elucidate new approaches for preserving and restoring muscle mass and function. PMID:25807353

  12. Use of mRNA expression signatures to discover small molecule inhibitors of skeletal muscle atrophy.

    PubMed

    Adams, Christopher M; Ebert, Scott M; Dyle, Michael C

    2015-05-01

    Here, we discuss a recently developed experimental strategy for discovering small molecules with potential to prevent and treat skeletal muscle atrophy. Muscle atrophy involves and requires widespread changes in skeletal muscle gene expression, which generate complex but measurable patterns of positive and negative changes in skeletal muscle mRNA levels (a.k.a. mRNA expression signatures of muscle atrophy). Many bioactive small molecules generate their own characteristic mRNA expression signatures, and by identifying small molecules whose signatures approximate mirror images of muscle atrophy signatures, one may identify small molecules with potential to prevent and/or reverse muscle atrophy. Unlike a conventional drug discovery approach, this strategy does not rely on a predefined molecular target but rather exploits the complexity of muscle atrophy to identify small molecules that counter the entire spectrum of pathological changes in atrophic muscle. We discuss how this strategy has been used to identify two natural compounds, ursolic acid and tomatidine, that reduce muscle atrophy and improve skeletal muscle function. Discovery strategies based on mRNA expression signatures can elucidate new approaches for preserving and restoring muscle mass and function.
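    The core signature-matching idea, scoring compounds by how strongly their mRNA signatures anticorrelate with the atrophy signature, can be sketched with toy data. The signatures, compound names, and the use of Pearson correlation below are illustrative assumptions, not the authors' actual screening procedure:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length expression signatures."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy mRNA signatures: positive = up-regulated in atrophy, negative = down.
atrophy = [1.2, -0.8, 0.5, -1.5, 0.9]
compounds = {
    "mirror_image_compound": [-1.0, 0.7, -0.4, 1.3, -0.8],  # near-mirror of atrophy
    "neutral_compound": [0.1, 0.2, -0.1, 0.0, 0.3],
}
# Rank compounds by how strongly their signatures anticorrelate with atrophy.
scores = {name: -pearson(atrophy, sig) for name, sig in compounds.items()}
best = max(scores, key=scores.get)
```

    A compound whose signature is close to a mirror image of the atrophy signature scores near +1 and becomes a candidate for preventing or reversing atrophy.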

  13. Non-invasive determination of glucose directly in raw fruits using a continuous flow system based on microdialysis sampling and amperometric detection at an integrated enzymatic biosensor.

    PubMed

    Vargas, E; Ruiz, M A; Campuzano, S; Reviejo, A J; Pingarrón, J M

    2016-03-31

    A non-destructive, rapid and simple-to-use sensing method for the direct determination of glucose in non-processed fruits is described. The strategy involved on-line microdialysis sampling coupled with a continuous flow system with amperometric detection at an enzymatic biosensor. Apart from the direct determination of glucose in fruit juices and blended fruits, this work describes for the first time the successful application of an enzymatic biosensor-based electrochemical approach to the non-invasive determination of glucose in raw fruits. The methodology correlates, through a previous calibration set-up, the amperometric signal generated from glucose in non-processed fruits with its content in % (w/w). Comparison of the results obtained using the proposed approach in different fruits with those provided by another method, involving the same commercial biosensor as amperometric detector in stirred solutions, indicated no significant differences. Moreover, in comparison with other available methodologies, this microdialysis-coupled continuous flow system amperometric biosensor-based procedure features straightforward sample preparation, low cost, reduced assay time (sampling rate of 7 h(-1)) and ease of automation. Copyright © 2016 Elsevier B.V. All rights reserved.
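    The calibration step, mapping the amperometric signal to glucose content in % (w/w), can be sketched as an ordinary least-squares line. The current and glucose values below are hypothetical placeholders, not the paper's calibration data:

```python
def fit_line(x, y):
    """Ordinary least-squares calibration line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b  # intercept a, slope b

# Hypothetical calibration: biosensor current (nA) vs glucose standards (% w/w).
currents = [10.0, 20.0, 30.0, 40.0]
glucose = [1.0, 2.1, 2.9, 4.0]
a, b = fit_line(currents, glucose)
estimate = a + b * 25.0  # glucose content predicted for a 25 nA fruit reading
```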

  14. Bioinformatics tools for the analysis of NMR metabolomics studies focused on the identification of clinically relevant biomarkers.

    PubMed

    Puchades-Carrasco, Leonor; Palomino-Schätzlein, Martina; Pérez-Rambla, Clara; Pineda-Lucena, Antonio

    2016-05-01

    Metabolomics, a systems biology approach focused on the global study of the metabolome, offers a tremendous potential in the analysis of clinical samples. Among other applications, metabolomics enables mapping of biochemical alterations involved in the pathogenesis of diseases, and offers the opportunity to noninvasively identify diagnostic, prognostic and predictive biomarkers that could translate into early therapeutic interventions. Particularly, metabolomics by Nuclear Magnetic Resonance (NMR) has the ability to simultaneously detect and structurally characterize an abundance of metabolic components, even when their identities are unknown. Analysis of the data generated using this experimental approach requires the application of statistical and bioinformatics tools for the correct interpretation of the results. This review focuses on the different steps involved in the metabolomics characterization of biofluids for clinical applications, ranging from the design of the study to the biological interpretation of the results. Particular emphasis is devoted to the specific procedures required for the processing and interpretation of NMR data with a focus on the identification of clinically relevant biomarkers. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  15. Statistical Engineering in Air Traffic Management Research

    NASA Technical Reports Server (NTRS)

    Wilson, Sara R.

    2015-01-01

    NASA is working to develop an integrated set of advanced technologies to enable efficient arrival operations in high-density terminal airspace for the Next Generation Air Transportation System. This integrated arrival solution is being validated and verified in laboratories and transitioned to a field prototype for an operational demonstration at a major U.S. airport. Within NASA, this is a collaborative effort between Ames and Langley Research Centers involving a multi-year iterative experimentation process. Designing and analyzing a series of sequential batch computer simulations and human-in-the-loop experiments across multiple facilities and simulation environments involves a number of statistical challenges. Experiments conducted in separate laboratories typically have different limitations and constraints, and can take different approaches with respect to the fundamental principles of statistical design of experiments. This often makes it difficult to compare results from multiple experiments and incorporate findings into the next experiment in the series. A statistical engineering approach is being employed within this project to support risk-informed decision making and maximize the knowledge gained within the available resources. This presentation describes a statistical engineering case study from NASA, highlights statistical challenges, and discusses areas where existing statistical methodology is adapted and extended.

  16. Fluid-structure interaction involving large deformations: 3D simulations and applications to biological systems

    NASA Astrophysics Data System (ADS)

    Tian, Fang-Bao; Dai, Hu; Luo, Haoxiang; Doyle, James F.; Rousseau, Bernard

    2014-02-01

    Three-dimensional fluid-structure interaction (FSI) involving large deformations of flexible bodies is common in biological systems, but accurate and efficient numerical approaches for modeling such systems are still scarce. In this work, we report a successful case of combining an existing immersed-boundary flow solver with a nonlinear finite-element solid-mechanics solver specifically for three-dimensional FSI simulations. This method represents a significant enhancement over similar methods previously available. Based on the Cartesian grid, the viscous incompressible flow solver can handle boundaries of large displacements with simple mesh generation. The solid-mechanics solver has separate subroutines for analyzing general three-dimensional bodies and thin-walled structures composed of frames, membranes, and plates. Both geometric nonlinearity associated with large displacements and material nonlinearity associated with large strains are incorporated in the solver. The FSI is achieved through a strong coupling and partitioned approach. We perform several validation cases, and the results may be used to expand the currently limited database of FSI benchmark studies. Finally, we demonstrate the versatility of the present method by applying it to the aerodynamics of elastic wings of insects and the flow-induced vocal fold vibration.

  17. A simple method for finding explicit analytic transition densities of diffusion processes with general diploid selection.

    PubMed

    Song, Yun S; Steinrücken, Matthias

    2012-03-01

    The transition density function of the Wright-Fisher diffusion describes the evolution of population-wide allele frequencies over time. This function has important practical applications in population genetics, but finding an explicit formula under a general diploid selection model has remained a difficult open problem. In this article, we develop a new computational method to tackle this classic problem. Specifically, our method explicitly finds the eigenvalues and eigenfunctions of the diffusion generator associated with the Wright-Fisher diffusion with recurrent mutation and arbitrary diploid selection, thus allowing one to obtain an accurate spectral representation of the transition density function. Simplicity is one of the appealing features of our approach. Although our derivation involves somewhat advanced mathematical concepts, the resulting algorithm is quite simple and efficient, only involving standard linear algebra. Furthermore, unlike previous approaches based on perturbation, which is applicable only when the population-scaled selection coefficient is small, our method is nonperturbative and is valid for a broad range of parameter values. As a by-product of our work, we obtain the rate of convergence to the stationary distribution under mutation-selection balance.
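    The spectral representation can be illustrated in miniature on a two-state continuous-time chain, where the generator's eigenvalues (0 and -(a+b)) give the transition function in closed form, analogous to the eigenfunction expansion of the diffusion generator. The two-state model is a toy analogue, not the paper's actual eigenfunction computation:

```python
from math import exp

def two_state_transition(a, b, t):
    """Transition probabilities at time t for a two-state continuous-time
    chain with generator Q = [[-a, a], [b, -b]], written via the spectral
    decomposition of Q: eigenvalue 0 contributes the stationary distribution,
    eigenvalue -(a+b) contributes an exponentially decaying term."""
    lam = a + b                       # spectral gap; also the convergence rate
    pi0, pi1 = b / lam, a / lam       # stationary distribution
    decay = exp(-lam * t)
    return [[pi0 + pi1 * decay, pi1 * (1 - decay)],
            [pi0 * (1 - decay), pi1 + pi0 * decay]]
```

    As in the paper's by-product result, the rate of convergence to stationarity is read off from the leading nonzero eigenvalue, here -(a+b).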

  18. A Simple Method for Finding Explicit Analytic Transition Densities of Diffusion Processes with General Diploid Selection

    PubMed Central

    Song, Yun S.; Steinrücken, Matthias

    2012-01-01

    The transition density function of the Wright–Fisher diffusion describes the evolution of population-wide allele frequencies over time. This function has important practical applications in population genetics, but finding an explicit formula under a general diploid selection model has remained a difficult open problem. In this article, we develop a new computational method to tackle this classic problem. Specifically, our method explicitly finds the eigenvalues and eigenfunctions of the diffusion generator associated with the Wright–Fisher diffusion with recurrent mutation and arbitrary diploid selection, thus allowing one to obtain an accurate spectral representation of the transition density function. Simplicity is one of the appealing features of our approach. Although our derivation involves somewhat advanced mathematical concepts, the resulting algorithm is quite simple and efficient, only involving standard linear algebra. Furthermore, unlike previous approaches based on perturbation, which is applicable only when the population-scaled selection coefficient is small, our method is nonperturbative and is valid for a broad range of parameter values. As a by-product of our work, we obtain the rate of convergence to the stationary distribution under mutation–selection balance. PMID:22209899

  19. Fluid–structure interaction involving large deformations: 3D simulations and applications to biological systems

    PubMed Central

    Tian, Fang-Bao; Dai, Hu; Luo, Haoxiang; Doyle, James F.; Rousseau, Bernard

    2013-01-01

    Three-dimensional fluid–structure interaction (FSI) involving large deformations of flexible bodies is common in biological systems, but accurate and efficient numerical approaches for modeling such systems are still scarce. In this work, we report a successful case of combining an existing immersed-boundary flow solver with a nonlinear finite-element solid-mechanics solver specifically for three-dimensional FSI simulations. This method represents a significant enhancement over similar methods previously available. Based on the Cartesian grid, the viscous incompressible flow solver can handle boundaries of large displacements with simple mesh generation. The solid-mechanics solver has separate subroutines for analyzing general three-dimensional bodies and thin-walled structures composed of frames, membranes, and plates. Both geometric nonlinearity associated with large displacements and material nonlinearity associated with large strains are incorporated in the solver. The FSI is achieved through a strong coupling and partitioned approach. We perform several validation cases, and the results may be used to expand the currently limited database of FSI benchmark studies. Finally, we demonstrate the versatility of the present method by applying it to the aerodynamics of elastic wings of insects and the flow-induced vocal fold vibration. PMID:24415796

  20. Prioritizing Genes Related to Nicotine Addiction Via a Multi-source-Based Approach.

    PubMed

    Liu, Xinhua; Liu, Meng; Li, Xia; Zhang, Lihua; Fan, Rui; Wang, Ju

    2015-08-01

    Nicotine has a broad impact on both the central and peripheral nervous systems. Over the past decades, an increasing number of genes potentially involved in nicotine addiction have been identified by different technical approaches. However, the molecular mechanisms underlying nicotine addiction remain largely unknown. In this situation, prioritizing the candidate genes for further investigation is becoming increasingly important. In this study, we present a multi-source-based gene prioritization approach for nicotine addiction that utilizes the vast amount of information generated by nicotine addiction studies over the past years. In this approach, we first collected and curated genes from studies in four categories, i.e., genetic association analysis, genetic linkage analysis, high-throughput gene/protein expression analysis, and literature search of single gene/protein-based studies. Based on these resources, the genes were scored and a weight value was determined for each category. Finally, the genes were ranked by their combined scores, and 220 genes were selected as the prioritized nicotine addiction-related genes. Evaluation suggested that the prioritized genes are promising targets for further analysis and replication studies.

  1. A community development approach to deal with public drug use in Box Hill.

    PubMed

    Rogers, Neil; Anderson, Warren

    2007-01-01

    The use of alcohol and other drugs in public space is an issue that generates much heat in public discourse and in the media. Too often, the responses called for to reduce problems of public amenity involve punitive policing and other measures that aim to engineer (mostly) young people out of these public spaces. Often local retailers are a key stakeholder group calling loudest for punitive action. In this Harm Reduction Digest, Rogers and Anderson describe a community development approach taken to address these problems in Box Hill in the City of Whitehorse, near Melbourne. This approach, which aimed to develop 'bridging social capital' between retailers and other stakeholders in the area, appears to have been effective in reducing harm associated with public drug use. Moreover, these changes have become institutionalised and the approach has been expanded to address other public amenity problems in the area. It is a clear example of how drug-related harm can be reduced by grass-roots networks of local councils, business people, law enforcement, and health and welfare service providers.

  2. Poster — Thur Eve — 69: Computational Study of DVH-guided Cancer Treatment Planning Optimization Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghomi, Pooyan Shirvani; Zinchenko, Yuriy

    2014-08-15

    Purpose: To compare methods of incorporating Dose Volume Histogram (DVH) curves into treatment planning optimization. Method: The performance of three methods, namely the conventional Mixed Integer Programming (MIP) model, a convex moment-based constrained optimization approach, and an unconstrained convex moment-based penalty approach, was compared using anonymized data from a prostate cancer patient. Three plans were generated using the corresponding optimization models. Four Organs at Risk (OARs) and one tumor were involved in the treatment planning. The OARs and tumor were discretized into a total of 50,221 voxels, and the number of beamlets was 943. We used the commercially available optimization software Gurobi and Matlab to solve the models. Plan comparison was done by recording the model runtime followed by visual inspection of the resulting dose volume histograms. Conclusion: We demonstrate the effectiveness of the moment-based approaches in replicating a set of prescribed DVH curves. The unconstrained convex moment-based penalty approach is concluded to have the greatest potential to reduce the computational effort, and holds promise of a substantial computational speed-up.
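    The objects being compared can be sketched: a cumulative DVH computed from per-voxel doses, and an unconstrained penalty on dose moments as a convex surrogate for DVH constraints. The moment orders and targets below are illustrative assumptions, not the study's formulation:

```python
def dvh(doses, dose_grid):
    """Cumulative DVH: fraction of voxels receiving at least each dose level."""
    n = len(doses)
    return [sum(1 for d in doses if d >= level) / n for level in dose_grid]

def moment_penalty(doses, target_moments, orders=(1, 2)):
    """Unconstrained penalty on the first few generalized dose moments,
    a convex surrogate for DVH constraints (weights and moment orders here
    are illustrative, not those used in the study)."""
    n = len(doses)
    pen = 0.0
    for k, target in zip(orders, target_moments):
        moment = (sum(d ** k for d in doses) / n) ** (1.0 / k)
        pen += (moment - target) ** 2
    return pen
```

    The penalty is zero exactly when the plan's dose moments match the targets derived from the prescribed DVH curves, which is what makes it usable inside an unconstrained convex optimizer.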

  3. [Medical doctors driving technological innovation: questions about and innovation management approaches to incentive structures for lead users].

    PubMed

    Bohnet-Joschko, Sabine; Kientzler, Fionn

    2010-01-01

    Management science defines user-generated innovations as open innovation and lead user innovation. The medical technology industry finds user-generated innovations profitable and even indispensable. Innovative medical doctors as lead users need medical technology innovations in order to improve patient care. Their motivation to innovate is mostly intrinsic, but innovations may also involve extrinsic motivators such as gain in reputation or monetary incentives. Medical doctors' innovative activities often take place in hospitals and are thus embedded in the hospital's organisational setting. Hospitals find it difficult to gain short-term profits from in-house generated innovations and sometimes hesitate to support them. Strategic investment in medical doctors' innovative activities may be profitable for hospitals in the long run if innovations provide first-mover competitive advantages. Industry co-operations with innovative medical doctors offer opportunities but also carry potential risks. Innovative ideas generated by expert users may result in even higher complexity of medical devices; this could cause mistakes when the devices are applied by less specialised users and thus affect patient safety. Innovations that yield benefits for patients, medical doctors, hospitals and the medical technology industry can be advanced by offering adequate support for knowledge transfer and co-operation models.

  4. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions

    PubMed Central

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which make it possible to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions that more faithfully reflect the underlying neural mechanisms, while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions for investigating neuropsychological disorders within this approach. Though further efforts are required to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage. PMID:27468262

  5. First non-OEM steam-generator replacement in US a success

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendsbee, P.M.; Lees, M.D.; Smith, J.C.

    1994-04-01

    In selecting replacements for major powerplant components, a fresh approach can be advantageous--even when complex nuclear components are involved. This was the experience at Unit 2 of Millstone nuclear station, which features an 870-MW pressurized-water reactor (PWR) with two nuclear recirculating steam generators. The unit began operation in 1975. In the early 1980s, pitting problems surfaced in the steam generator tubing; by the mid-1980s, tube corrosion had reached an unacceptable level. Virtually all of the 17,000 tubes in the two units were deteriorating, with 2500 plugged and 5000 sleeved. Several new problems also were identified, including secondary-side circumferential cracking of the Alloy 600 tubing near the tubesheet face, and deterioration of the carbon steel egg-crate tube supports. Despite improvements to primary and secondary steam-generator water chemistry, including almost complete copper removal from the condensate and feedwater loops, Northeast Utilities (NU) was unable to completely control degradation of the tube bundles. The utility decided in 1987 that full replacement was the most viable alternative. NU made a bold move, selecting a supplier other than the original equipment manufacturer (OEM).

  6. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions.

    PubMed

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which make it possible to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions that more faithfully reflect the underlying neural mechanisms, while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions for investigating neuropsychological disorders within this approach. Though further efforts are required to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage.

  7. Information Superiority generated through proper application of Geoinformatics

    NASA Astrophysics Data System (ADS)

    Teichmann, F.

    2012-04-01

    Information management, and especially geoscience information delivery, is a very delicate task. If it is carried out successfully, geoscientific data will provide the main foundation of Information Superiority. However, improper implementation of geodata generation, assimilation, distribution or storage will not only waste valuable resources like manpower or money, but could also give rise to crucial deficiencies in knowledge and might lead to extremely harmful disasters or wrong decisions. Comprehensive Approach, Effects-Based Operations and Network Enabled Capabilities are the current buzz terms in the security regime. However, they also apply to various interdisciplinary tasks like catastrophe relief missions, civil task operations, or even day-to-day business operations where geoscience data are used. Based on experience in the application of geoscience data for defence applications, the following procedure, or tool box, for generating geodata should lead to the desired information superiority: 1. Understand and analyse the mission, the task and the environment for which the geodata are needed. 2. Carry out an Information Exchange Requirement analysis between the user or customer and the geodata provider. 3. Implement current interoperability standards and a coherent metadata structure. 4. Execute innovative data generation, data provision, data assimilation and data storage. 5. Apply a cost-effective and reasonable data life cycle. 6. Implement IT security by focusing on the three pillars of Integrity, Availability and Confidentiality of the critical data. 7. Draft and execute a service level agreement or a memorandum of understanding between the involved parties. 8. Execute a Continuous Improvement Cycle. These ideas from the IT world should be transferred into the geoscience community and applied in a wide range of scenarios.
A standardized approach to generating, providing, handling, distributing and storing geodata can reduce costs, strengthen the ties between service customer and geodata provider, and improve the contribution geoscience can make to achieving information superiority for decision makers.

  8. Chaos theory perspective for industry clusters development

    NASA Astrophysics Data System (ADS)

    Yu, Haiying; Jiang, Minghui; Li, Chengzhang

    2016-03-01

    Industry clusters have been strong performers in the economic development of most developing countries. The contributions of industrial clusters are recognized to include the promotion of regional business and the alleviation of economic and social costs. There is no doubt that globalization is pushing clusters to accelerate the competitiveness of economic activities. Accordingly, many ideas and concepts have been put forward to illustrate evolution tendencies, stimulate cluster development, and avoid industrial cluster recession. Chaos theory is introduced to explain the inherent relationships among features within industry clusters. A preferred life-cycle approach is proposed for the analysis of industrial cluster recession. Lyapunov exponents and the Wolf model are presented for chaotic identification and examination. A case study of Tianjin, China, has verified the model's effectiveness. The investigations indicate that the approaches perform well in explaining chaotic properties in industrial clusters, demonstrating industrial cluster evolution, solving empirical issues, and generating corresponding strategies.
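    A positive largest Lyapunov exponent is the standard diagnostic of chaos that such an analysis relies on. A minimal illustration on the logistic map, averaging log|f'(x)| along an orbit; this is a simple stand-in for the Wolf algorithm, which instead estimates the exponent from measured time-series data:

```python
from math import log

def logistic_lyapunov(r, x0=0.4, n_transient=500, n_iter=5000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log|f'(x)| with f'(x) = r*(1-2x).
    Positive values indicate chaos; negative values indicate a stable
    attractor."""
    x = x0
    for _ in range(n_transient):          # discard transient behavior
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        total += log(abs(r * (1 - 2 * x)))
    return total / n_iter

# r = 4.0 is fully chaotic (exponent near ln 2); r = 2.5 has a stable fixed point.
```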

  9. Editorial: Cognitive Architectures, Model Comparison and AGI

    NASA Astrophysics Data System (ADS)

    Lebiere, Christian; Gonzalez, Cleotilde; Warwick, Walter

    2010-12-01

    Cognitive Science and Artificial Intelligence share compatible goals of understanding and possibly generating broadly intelligent behavior. In order to determine if progress is made, it is essential to be able to evaluate the behavior of complex computational models, especially those built on general cognitive architectures, and compare it to benchmarks of intelligent behavior such as human performance. Significant methodological challenges arise, however, when trying to extend approaches used to compare model and human performance from tightly controlled laboratory tasks to complex tasks involving more open-ended behavior. This paper describes a model comparison challenge built around a dynamic control task, the Dynamic Stocks and Flows. We present and discuss distinct approaches to evaluating performance and comparing models. Lessons drawn from this challenge are discussed in light of the challenge of using cognitive architectures to achieve Artificial General Intelligence.

  10. Computational methods for yeast prion curing curves.

    PubMed

    Ridout, Martin S

    2008-10-01

    If the chemical guanidine hydrochloride is added to a dividing culture of yeast cells in which some of the protein Sup35p is in its prion form, the proportion of cells that carry replicating units of the prion, termed propagons, decreases gradually over time. Stochastic models to describe this process of 'curing' have been developed in earlier work. The present paper investigates the use of numerical methods of Laplace transform inversion to calculate curing curves and contrasts this with an alternative, more direct, approach that involves numerical integration. Transform inversion is found to provide a much more efficient computational approach that allows different models to be investigated with minimal programming effort. The method is used to investigate the robustness of the curing curve to changes in the assumed distribution of cell generation times. Matlab code is available for carrying out the calculations.
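    One standard inversion scheme of the kind discussed is Gaver-Stehfest, which needs only real-valued evaluations of the transform; the particular algorithm and the actual propagon-model transform used in the paper are not reproduced here:

```python
from math import factorial, log

def stehfest_invert(F, t, N=12):
    """Gaver-Stehfest numerical inversion of a Laplace transform F(s) at
    time t: f(t) ~ (ln 2 / t) * sum_k V_k * F(k * ln 2 / t), with N even.
    Only real evaluations of F are required, which is part of what makes
    transform inversion an efficient route to curves like these."""
    ln2t = log(2.0) / t
    total = 0.0
    for k in range(1, N + 1):
        # Standard Stehfest weight V_k
        v = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            v += (j ** (N // 2) * factorial(2 * j)
                  / (factorial(N // 2 - j) * factorial(j) * factorial(j - 1)
                     * factorial(k - j) * factorial(2 * j - k)))
        v *= (-1) ** (k + N // 2)
        total += v * F(k * ln2t)
    return ln2t * total

# Sanity check against a known pair: F(s) = 1/(s + 1)  <->  f(t) = exp(-t).
```

    Once the transform of the curing curve is available from the stochastic model, evaluating the curve at any time point reduces to a handful of transform evaluations, which is what makes it easy to explore different model variants with minimal programming effort.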

  11. Guidelines for adapting cognitive stimulation therapy to other cultures

    PubMed Central

    Aguirre, Elisa; Spector, Aimee; Orrell, Martin

    2014-01-01

    Cognitive stimulation therapy (CST) has been shown to be a useful and cost-effective intervention that improves the cognition and quality of life of people with mild to moderate dementia. It is increasing in popularity in the UK and worldwide, and a number of research teams have examined its effectiveness in other contexts and cultures. However, clear evidence-based guidelines for the cultural modification of the intervention are still needed. This article describes a community-based developmental approach to adapting CST to different cultures, following the five phases of the formative method for adapting psychotherapy (FMAP), an approach that involves collaborating with service users as a first step to generate and support ideas for therapy adaptation. Examples based on clinical and practical experience are presented, along with suggestions for applying these changes in different cultural contexts. PMID:25061282

  12. Portfolio optimization in enhanced index tracking with goal programming approach

    NASA Astrophysics Data System (ADS)

    Siew, Lam Weng; Jaaman, Saiful Hafizah Hj.; Ismail, Hamizun bin

    2014-09-01

    Enhanced index tracking is a popular form of passive fund management in the stock market. It aims to generate excess return over the market index without purchasing all of the stocks that make up the index, which can be done by establishing an optimal portfolio that maximizes the mean return and minimizes the risk. The objective of this paper is to determine the portfolio composition and performance using a goal programming approach to enhanced index tracking, and to compare the result with the market index. Goal programming is a branch of multi-objective optimization that can handle decision problems involving the two competing goals of enhanced index tracking: a trade-off between maximizing the mean return and minimizing the risk. The results of this study show that the optimal portfolio obtained with the goal programming approach is able to outperform the Malaysian market index, the FTSE Bursa Malaysia Kuala Lumpur Composite Index, through a higher mean return and a lower risk without purchasing all the stocks in the index.
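    As an illustration of the weighted goal programming idea (not the paper's actual model or Bursa Malaysia data), the two goals can be encoded as deviation penalties and minimized over the portfolio simplex; a coarse grid search stands in for an LP solver:

```python
# Weighted goal programming for a toy 3-stock tracking portfolio.
# Returns/risks below are made-up illustration values, not market data.
mean_return = [0.10, 0.07, 0.12]   # expected annual returns (hypothetical)
risk        = [0.20, 0.10, 0.30]   # stand-alone risk proxies (hypothetical)

RETURN_GOAL = 0.09   # match-or-beat target for portfolio mean return
RISK_GOAL   = 0.15   # stay-below target for portfolio risk

def goal_score(w):
    """Sum of goal deviations: shortfall below the return goal
    plus excess above the risk goal (the paper's two goals)."""
    r = sum(wi * ri for wi, ri in zip(w, mean_return))
    q = sum(wi * qi for wi, qi in zip(w, risk))
    under_return = max(0.0, RETURN_GOAL - r)
    over_risk = max(0.0, q - RISK_GOAL)
    return under_return + over_risk

# Coarse search over the simplex w1+w2+w3 = 1, wi >= 0 (step 0.01).
best_w, best_s = None, float("inf")
steps = 100
for i in range(steps + 1):
    for j in range(steps + 1 - i):
        w = (i / steps, j / steps, (steps - i - j) / steps)
        s = goal_score(w)
        if s < best_s:
            best_w, best_s = w, s
```

In a real implementation the deviation variables would be decision variables of a linear program, which scales to the full index constituent list; the grid search only makes the trade-off visible in a few lines.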

  13. Multiphase mean curvature flows with high mobility contrasts: A phase-field approach, with applications to nanowires

    NASA Astrophysics Data System (ADS)

    Bretin, Elie; Danescu, Alexandre; Penuelas, José; Masnou, Simon

    2018-07-01

    The structure of many multiphase systems is governed by an energy that penalizes the area of interfaces between phases weighted by surface tension coefficients. However, interface evolution laws depend also on interface mobility coefficients. Having in mind some applications where highly contrasted or even degenerate mobilities are involved, for which classical phase field models are inapplicable, we propose a new effective phase field approach to approximate multiphase mean curvature flows with mobilities. The key aspect of our model is to incorporate the mobilities not in the phase field energy (which is conventionally the case) but in the metric which determines the gradient flow. We show the consistency of such an approach by a formal analysis of the sharp interface limit. We also propose an efficient numerical scheme which allows us to illustrate the advantages of the model on various examples, such as the wetting of droplets on solid surfaces or the simulation of nanowire growth generated by the so-called vapor-liquid-solid method.
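    The paper's multiphase mobility-metric formulation is beyond a short snippet, but the underlying mechanism of phase-field mean curvature motion can be sketched with a minimal two-phase Allen-Cahn step, in which a circular inclusion shrinks by curvature; grid size, interface width, and mobility below are illustrative:

```python
# Minimal two-phase Allen-Cahn sketch: u_t = m * (lap(u) - W'(u)/eps^2),
# with double-well W(u) = u^2 (1-u)^2. A disk of u=1 shrinks by curvature.
# This is a generic phase-field illustration, not the paper's multiphase
# mobility-metric formulation.
n, eps, m, dt = 48, 2.0, 1.0, 0.2   # grid size, interface width, mobility, step

u = [[0.0] * n for _ in range(n)]
c = n // 2
for i in range(n):
    for j in range(n):
        if (i - c) ** 2 + (j - c) ** 2 <= 12 ** 2:
            u[i][j] = 1.0

def dW(v):
    """W'(u) for the double well W(u) = u^2 (1-u)^2."""
    return 2.0 * v * (1.0 - v) * (1.0 - 2.0 * v)

def step(u):
    new = [row[:] for row in u]
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            lap = u[i+1][j] + u[i-1][j] + u[i][j+1] + u[i][j-1] - 4.0 * u[i][j]
            new[i][j] = u[i][j] + dt * m * (lap - dW(u[i][j]) / eps ** 2)
    return new

mass0 = sum(map(sum, u))
for _ in range(120):
    u = step(u)
mass1 = sum(map(sum, u))   # shrinking disk => total phase fraction decreases
```

The disk's radius obeys dR/dt ≈ -m/R in the sharp-interface limit, which is exactly the mobility-weighted mean curvature flow the paper generalizes to many phases.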

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardenas, Rosa E.; Stewart, Kenneth D.; Cowgill, Donald F.

    The authors developed an approach for accurately quantifying the helium content of a gas mixture that also contains hydrogen and methane, using commercially available getters. They performed a systematic study of how both H2 and CH4 can be removed simultaneously from the mixture using two SAES St 172® getters operating at different temperatures; the remaining He within the gas mixture can then be measured directly using a capacitance manometer. The optimum combination involved operating one getter at 650°C to decompose the methane and the second at 110°C to remove the hydrogen. This approach eliminated the need to reactivate the getters between measurements, thereby enabling multiple measurements to be made within a short time interval, with accuracy better than 1%. The authors anticipate that such an approach will be particularly useful for quantifying the He-3 in mixtures that include tritium, tritiated methane, and helium-3, since the presence of tritiated methane, generated by tritium activity, often complicates such measurements.

  15. Understanding social forces involved in diabetes outcomes: a systems science approach to quality-of-life research.

    PubMed

    Lounsbury, David W; Hirsch, Gary B; Vega, Chawntel; Schwartz, Carolyn E

    2014-04-01

    The field of quality-of-life (QOL) research would benefit from learning about and integrating systems science approaches that model how social forces interact dynamically with health and affect the course of chronic illnesses. Our purpose is to describe the systems science mindset and to illustrate the utility of a system dynamics approach to promoting QOL research in chronic disease, using diabetes as an example. We build a series of causal loop diagrams incrementally, introducing new variables and their dynamic relationships at each stage. These causal loop diagrams demonstrate how a common set of relationships among these variables can generate different disease and QOL trajectories for people with diabetes and also lead to a consideration of non-clinical (psychosocial and behavioral) factors that can have implications for program design and policy formulation. The policy implications of the causal loop diagrams are discussed, and empirical next steps to validate the diagrams and quantify the relationships are described.
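    A causal loop diagram becomes quantitative once its stocks and flows are integrated numerically. The sketch below is a hypothetical three-stock loop (glycemic burden, self-management, QOL) with invented coefficients, not the authors' model; it only illustrates the system dynamics mechanics of coupled balancing and reinforcing loops:

```python
# Toy system-dynamics model inspired by the paper's causal-loop diagrams.
# Stocks, flows, and every coefficient here are hypothetical illustrations:
# a balancing loop (self-management lowers glycemic burden) coupled to a
# reinforcing one (high burden erodes QOL, which erodes self-management).
def simulate(years=10.0, dt=0.05):
    burden, selfmgmt, qol = 0.6, 0.5, 0.7   # initial stock levels in [0, 1]
    history = []
    steps = round(years / dt)
    for _ in range(steps):
        d_burden = 0.3 * (1.0 - selfmgmt) - 0.4 * burden      # inflow - outflow
        d_self   = 0.25 * qol - 0.2 * burden * selfmgmt
        d_qol    = 0.3 * (1.0 - burden) - 0.3 * qol
        burden   += dt * d_burden
        selfmgmt += dt * d_self
        qol      += dt * d_qol
        # keep stocks in their natural [0, 1] range
        burden, selfmgmt, qol = (min(1.0, max(0.0, x))
                                 for x in (burden, selfmgmt, qol))
        history.append((burden, selfmgmt, qol))
    return history

traj = simulate()
```

Validating such a model, as the authors propose as a next step, means replacing the invented coefficients with empirically estimated ones and comparing simulated trajectories with observed QOL data.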

  16. Graph representation of hepatic vessel based on centerline extraction and junction detection

    NASA Astrophysics Data System (ADS)

    Zhang, Xing; Tian, Jie; Deng, Kexin; Li, Xiuli; Yang, Fei

    2012-02-01

    In the area of computer-aided diagnosis (CAD), segmentation and analysis of hepatic vessels are a prerequisite for hepatic disease diagnosis and surgery planning. For liver surgery planning, it is crucial to provide the surgeon with a patient-individual three-dimensional representation of the liver along with its vasculature and lesions. The representation allows an exploration of the vascular anatomy and the measurement of vessel diameters, followed by intra-patient registration, as well as the analysis of the shape and volume of vascular territories. In this paper, we present an approach for generating a hepatic vessel graph based on centerline extraction and junction detection. The proposed approach involves the following concepts and methods: 1) flux-driven automatic centerline extraction; 2) junction detection on the centerline using hollow sphere filtering; 3) graph representation of the hepatic vessels based on the centerlines and junctions. The approach is evaluated on contrast-enhanced liver CT datasets to demonstrate its applicability and effectiveness.
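    The flux-driven centerline extraction and hollow-sphere junction filtering are beyond a snippet, but the final graph-construction step can be sketched: centerline points whose connectivity degree differs from 2 become nodes (endpoints and junctions), and the degree-2 chains between them become edges. A 2-D, 4-connected example stands in for the 3-D pipeline:

```python
# Sketch of the graph-construction step: given centerline pixels, points
# whose 4-neighbour degree != 2 become graph nodes and the degree-2 chains
# between them become edges. A 2-D stand-in for the paper's 3-D pipeline.
def centerline_to_graph(pixels):
    S = set(pixels)

    def nbrs(p):
        x, y = p
        return [q for q in ((x+1, y), (x-1, y), (x, y+1), (x, y-1)) if q in S]

    nodes = {p for p in S if len(nbrs(p)) != 2}   # endpoints + junctions
    edges, seen = [], set()
    for start in nodes:
        for first in nbrs(start):
            if (start, first) in seen:
                continue                     # edge already walked from the far end
            path, prev, cur = [start, first], start, first
            seen.add((start, first))
            while cur not in nodes:          # follow the degree-2 chain
                nxt = next(q for q in nbrs(cur) if q != prev)
                prev, cur = cur, nxt
                path.append(cur)
            seen.add((cur, prev))            # block the reverse walk
            edges.append(path)
    return nodes, edges

# T-shaped centerline: one junction at (2, 0), three endpoints.
skeleton = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0), (2, 1), (2, 2), (2, 3)]
nodes, edges = centerline_to_graph(skeleton)
```

The returned paths keep the intermediate centerline points, so vessel-segment lengths and diameters can be accumulated per edge, as a surgery-planning tool would require.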

  17. Cell-oriented modeling of angiogenesis.

    PubMed

    Guidolin, Diego; Rebuffat, Piera; Albertin, Giovanna

    2011-01-01

    Due to its significant involvement in various physiological and pathological conditions, angiogenesis (the development of new blood vessels from an existing vasculature) represents an important area of current biological research and a field in which mathematical modeling has proved particularly useful in supporting experimental work. In this paper, we focus on a specific modeling strategy, known as the "cell-centered" approach. Models of this type work at a "mesoscopic scale," assuming the cell to be the natural level of abstraction for computational modeling of development. They treat cells phenomenologically, considering their essential behaviors to study how tissue structure and organization emerge from the collective dynamics of multiple cells. The main contributions of the cell-oriented approach to the study of the angiogenic process will be described. On the one hand, such models have generated basic science understanding about the process of capillary assembly during development, growth, and pathology. On the other hand, models have also been developed to support applied biomedical research aimed at identifying new therapeutic targets and clinically relevant approaches for either inhibiting or stimulating angiogenesis.
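    One of the simplest cell-centered ingredients is a population of tip-cell agents performing a biased random walk up a chemoattractant gradient. The lattice, bias, and cell counts below are invented for illustration; only the mechanism (collective behavior emerging from per-cell rules) reflects the approach described:

```python
import random

# Minimal cell-centered ingredient: tip cells as agents performing a biased
# random walk up a chemoattractant gradient (a stand-in for a VEGF-like
# field). All parameters are illustrative, not drawn from the paper.
random.seed(42)

WIDTH = 60                       # gradient increases with x

def chemoattractant(x):
    return x / (WIDTH - 1.0)     # linear field in [0, 1]

def step(cell):
    x, y = cell
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    # weight each candidate move by the attractant level at its target
    weights = [0.1 + chemoattractant(min(WIDTH - 1, max(0, x + dx)))
               for dx, _ in moves]
    dx, dy = random.choices(moves, weights=weights)[0]
    return (min(WIDTH - 1, max(0, x + dx)), y + dy)

cells = [(5, y) for y in range(50)]      # sprout front starts at x = 5
for _ in range(300):
    cells = [step(c) for c in cells]

mean_x = sum(x for x, _ in cells) / len(cells)   # front advances up-gradient
```

Full cell-centered models add cell-cell adhesion, proliferation, and matrix interactions on top of this kind of per-cell rule, which is where capillary-like network structure emerges.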

  18. Combining Machine Learning Systems and Multiple Docking Simulation Packages to Improve Docking Prediction Reliability for Network Pharmacology

    PubMed Central

    Hsin, Kun-Yi; Ghosh, Samik; Kitano, Hiroaki

    2013-01-01

    Increased availability of bioinformatics resources is creating opportunities for the application of network pharmacology to predict drug effects and toxicity resulting from multi-target interactions. Here we present a high-precision computational prediction approach that combines two elaborately built machine learning systems and multiple molecular docking tools to assess binding potentials of a test compound against proteins involved in a complex molecular network. One of the two machine learning systems is a re-scoring function to evaluate binding modes generated by docking tools. The second is a binding mode selection function to identify the most predictive binding mode. Results from a series of benchmark validations and a case study show that this approach surpasses the prediction reliability of other techniques and that it also identifies either primary or off-targets of kinase inhibitors. Integrating this approach with molecular network maps makes it possible to address drug safety issues by comprehensively investigating network-dependent effects of a drug or drug candidate. PMID:24391846

  19. Next generation capacity building for the GEOSS community - an European approach

    NASA Astrophysics Data System (ADS)

    Bye, B. L.

    2016-12-01

    The Group on Earth Observations (GEO) embarked on its next 10-year phase with an ambition to streamline and further develop its achievements in building the Global Earth Observing System of Systems (GEOSS). The NextGEOSS project evolves the European vision of GEOSS data exploitation for innovation and business, relying on the three main pillars of engaging communities, delivering technological developments, and advocating the use of GEOSS, in order to support the creation and deployment of Earth observation based innovative research activities and commercial services. In this presentation we describe a new integrated approach to capacity building that engages the various actors involved in the entire value chain, from data providers to decision-makers, and we illustrate the general approach with concrete pilot cases. We show how integrating new technological developments and societal change enables GEO and GEOSS to adapt to the current environment. The result is important for better decision-making and better use of our limited resources to manage our planet.

  20. Design jeans for recycling: a supply chain case study in The Netherlands.

    PubMed

    van Bommel, Harrie; Goorhuis, Maarten

    2014-11-01

    Because awareness is growing that waste prevention needs an integral product chain approach, a product chain project was awarded an International Solid Waste Association grant. The project decided to focus on jeans because of the large environmental impacts of cotton and the low recycling rates. The project used an open innovation approach, involving many actors from the different phases of the chain and including students and applied researchers. In a 'design jeans for recycling' students' workshop, prototypes of jeans that are easier to recycle were developed. Integrating the new generation from different disciplines in the project proved to be very successful. The results show that an open innovation process can lead to very creative ideas and that lessons learned from this project could be used to develop new chain projects for other products. An important condition is that key actors are willing to cooperate in an open innovation approach. © The Author(s) 2014.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noe, F; Diadone, Isabella; Lollmann, Marc

    There is a gap between kinetic experiment and simulation in their views of the dynamics of complex biomolecular systems. Whereas experiments typically reveal only a few readily discernible exponential relaxations, simulations often indicate complex multistate behavior. Here, a theoretical framework is presented that reconciles these two approaches. The central concept is dynamical fingerprints, which contain peaks at the time scales of the dynamical processes involved, with amplitudes determined by the experimental observable. Fingerprints can be generated from both experimental and simulation data, and their comparison by matching peaks permits assignment of structural changes present in the simulation to experimentally observed relaxation processes. The approach is applied here to a test case, interpreting single-molecule fluorescence correlation spectroscopy experiments on a set of fluorescent peptides with molecular dynamics simulations. The peptides exhibit complex kinetics shown to be consistent with the apparent simplicity of the experimental data. Moreover, the fingerprint approach can be used to design new experiments with site-specific labels that optimally probe specific dynamical processes in the molecule under investigation.
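    The fingerprint concept can be made concrete in its smallest instance, a two-state Markov model, where the single relaxation peak has an analytic timescale and an amplitude set by the observable; the numbers are illustrative, and the spectral form is verified against direct propagation of the transition matrix:

```python
from math import log

# Two-state Markov model: the simplest "dynamical fingerprint" has a single
# peak, at timescale t2 = -1 / ln(1 - p - q) steps, with an amplitude set by
# the observable a = (a1, a2). Illustrative numbers, not the paper's systems.
p, q = 0.10, 0.05          # transition probabilities 1->2 and 2->1 per step
a1, a2 = 0.0, 1.0          # observable (e.g., fluorescence) in each state

pi1, pi2 = q / (p + q), p / (p + q)      # stationary distribution
lam = 1.0 - p - q                        # second eigenvalue of T
timescale = -1.0 / log(lam)              # peak position (in steps)
amplitude = pi1 * pi2 * (a1 - a2) ** 2   # peak amplitude for observable a
mean_a = pi1 * a1 + pi2 * a2

def corr_spectral(t):
    """Equilibrium autocorrelation <a(0) a(t)> from the fingerprint."""
    return mean_a ** 2 + amplitude * lam ** t

def corr_direct(t):
    """Same quantity by propagating the 2x2 transition matrix t times."""
    T = [[1 - p, p], [q, 1 - q]]
    P = [[1.0, 0.0], [0.0, 1.0]]
    for _ in range(t):
        P = [[sum(P[i][k] * T[k][j] for k in range(2)) for j in range(2)]
             for i in range(2)]
    pi, a = (pi1, pi2), (a1, a2)
    return sum(pi[i] * a[i] * P[i][j] * a[j]
               for i in range(2) for j in range(2))
```

For an n-state model the fingerprint has up to n-1 such peaks, one per non-unit eigenvalue, which is what allows matching simulated processes to experimentally observed relaxations.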

  2. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.

  3. Implementing the Next Generation Science Standards: Impacts on Geoscience Education

    NASA Astrophysics Data System (ADS)

    Wysession, M. E.

    2014-12-01

    This is a critical time for the geoscience community. The Next Generation Science Standards (NGSS) have been released and are now being adopted by states (a dozen states and Washington, DC, at the time of writing), with dramatic implications for national K-12 science education. Curriculum developers and textbook companies are working hard to construct educational materials that match the new standards, which emphasize a hands-on practice-based approach that focuses on working directly with primary data and other forms of evidence. While the set of 8 science and engineering practices of the NGSS lend themselves well to the observation-oriented approach of much of the geosciences, there is currently not a sufficient number of geoscience educational modules and activities geared toward the K-12 levels, and geoscience research organizations need to be mobilizing their education & outreach programs to meet this need. It is a rare opportunity that will not come again in this generation. There are other significant issues surrounding the implementation of the NGSS. The NGSS involves a year of Earth and space science at the high school level, but there does not exist a sufficient workforce of geoscience teachers to meet this need. The form and content of the geoscience standards are also very different from past standards, moving away from a memorization and categorization approach and toward a complex Earth Systems Science approach. Combined with the shift toward practice-based teaching, this means that significant professional development will be required for the existing K-12 geoscience education workforce. How the NGSS are to be assessed is another significant question, with an NRC report providing some guidance but leaving many questions unanswered. There is also an uneasy relationship between the NGSS and the Common Core of math and English, and the recent push-back against the Common Core in many states may impact the implementation of the NGSS.

  4. Loudness perception and speech intensity control in Parkinson's disease.

    PubMed

    Clark, Jenna P; Adams, Scott G; Dykstra, Allyson D; Moodie, Shane; Jog, Mandar

    2014-01-01

    The aim of this study was to examine loudness perception in individuals with hypophonia and Parkinson's disease. The participants included 17 individuals with hypophonia related to Parkinson's disease (PD) and 25 age-equivalent controls. The three loudness perception tasks included a magnitude estimation procedure involving a sentence spoken at 60, 65, 70, 75 and 80 dB SPL, an imitation task involving a sentence spoken at 60, 65, 70, 75 and 80 dB SPL, and a magnitude production procedure involving the production of a sentence at five different loudness levels (habitual, two and four times louder and two and four times quieter). The participants with PD produced a significantly different pattern and used a more restricted range than the controls in their perception of speech loudness, imitation of speech intensity, and self-generated estimates of speech loudness. The results support a speech loudness perception deficit in PD involving an abnormal perception of externally generated and self-generated speech intensity. Readers will recognize that individuals with hypophonia related to Parkinson's disease may demonstrate a speech loudness perception deficit involving the abnormal perception of externally generated and self-generated speech intensity. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Operators up to dimension seven in standard model effective field theory extended with sterile neutrinos

    NASA Astrophysics Data System (ADS)

    Liao, Yi; Ma, Xiao-Dong

    2017-07-01

    We revisit the effective field theory of the standard model that is extended with sterile neutrinos, N . We examine the basis of complete and independent effective operators involving N up to mass dimension seven (dim-7). By employing equations of motion, integration by parts, and Fierz and group identities, we construct relations among operators that were considered independent in the previous literature, and we find 7 redundant operators at dim-6, as well as 16 redundant operators and two new operators at dim-7. The correct numbers of operators involving N are, without counting Hermitian conjugates, 16 (L ∩B )+1 (L ∩B )+2 (L ∩ B) at dim-6 and 47 (L ∩B )+5 (L ∩ B) at dim-7. Here L /B (L/B) stands for lepton/baryon number conservation (violation). We verify our counting by the Hilbert series approach for nf generations of the standard model fermions and sterile neutrinos. When operators involving different flavors of fermions are counted separately and their Hermitian conjugates are included, we find there are 29 (1614) and 80 (4206) operators involving sterile neutrinos at dim-6 and dim-7, respectively, for nf=1 (3).

  6. LMX1B Mutations Cause Hereditary FSGS without Extrarenal Involvement

    PubMed Central

    Boyer, Olivia; Woerner, Stéphanie; Yang, Fan; Oakeley, Edward J.; Linghu, Bolan; Gribouval, Olivier; Tête, Marie-Josèphe; Duca, José S.; Klickstein, Lloyd; Damask, Amy J.; Szustakowski, Joseph D.; Heibel, Françoise; Matignon, Marie; Baudouin, Véronique; Chantrel, François; Champigneulle, Jacqueline; Martin, Laurent; Nitschké, Patrick; Gubler, Marie-Claire; Johnson, Keith J.; Chibout, Salah-Dine

    2013-01-01

    LMX1B encodes a homeodomain-containing transcription factor that is essential during development. Mutations in LMX1B cause nail-patella syndrome, characterized by dysplasia of the patellae, nails, and elbows and FSGS with specific ultrastructural lesions of the glomerular basement membrane (GBM). By linkage analysis and exome sequencing, we unexpectedly identified an LMX1B mutation segregating with disease in a pedigree of five patients with autosomal dominant FSGS but without either extrarenal features or ultrastructural abnormalities of the GBM suggestive of nail-patella–like renal disease. Subsequently, we screened 73 additional unrelated families with FSGS and found mutations involving the same amino acid (R246) in 2 families. An LMX1B in silico homology model suggested that the mutated residue plays an important role in strengthening the interaction between the LMX1B homeodomain and DNA; both identified mutations would be expected to diminish such interactions. In summary, these results suggest that isolated FSGS could result from mutations in genes that are also involved in syndromic forms of FSGS. This highlights the need to include these genes in all diagnostic approaches to FSGS that involve next-generation sequencing. PMID:23687361

  7. Business Modeling to Implement an eHealth Portal for Infection Control: A Reflection on Co-Creation With Stakeholders

    PubMed Central

    Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette

    2015-01-01

    Background It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders to make technology reflect their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step while developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. Objective This paper aims to demonstrate how business modeling, with a focus on stakeholder involvement, is used to co-create an eHealth implementation; the potential of several stakeholder-oriented analysis methods and their practical application are demonstrated using Infectionmanager as an example case. Methods We divided business modeling into 4 main research steps. As part of stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analyses, we performed "basic stakeholder analysis," stakeholder salience, and ranking/analytic hierarchy process (Step 2). For value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Results Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. 
Conclusions The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic and multidisciplinary, co-creative approach for implementing eHealth. Business modeling becomes an active part in the entire development process of eHealth and starts an early focus on implementation, in which stakeholders help to co-create the basis necessary for a satisfying success and uptake of the eHealth technology. PMID:26272510
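    The ranking/analytic hierarchy process mentioned in Step 2 can be sketched: stakeholders supply pairwise importance ratios, and priority weights follow from the geometric-mean method. The criteria and comparison values below are hypothetical, not taken from the Infectionmanager case:

```python
from math import prod

# Sketch of the AHP ranking step (Step 2): pairwise comparison ratios from
# stakeholders are turned into priority weights with the geometric-mean
# method. The three criteria and their ratios are hypothetical.
criteria = ["clinical value", "ease of integration", "cost"]

# A[i][j] = how many times more important criterion i is than criterion j.
A = [[1.0, 3.0, 5.0],
     [1.0 / 3.0, 1.0, 2.0],
     [1.0 / 5.0, 1.0 / 2.0, 1.0]]

def ahp_weights(A):
    n = len(A)
    # geometric mean of each row, then normalize to sum to 1
    gm = [prod(row) ** (1.0 / n) for row in A]
    total = sum(gm)
    return [g / total for g in gm]

weights = ahp_weights(A)
ranking = sorted(zip(criteria, weights), key=lambda cw: -cw[1])
```

In practice each stakeholder group would fill in its own comparison matrix, and a consistency check would flag contradictory judgments before the weights feed into the business model canvas.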

  8. Principal-Generated YouTube Video as a Method of Improving Parental Involvement

    ERIC Educational Resources Information Center

    Richards, Joey

    2013-01-01

    The purpose of this study was to evaluate the involvement level of parents and reveal whether principal-generated YouTube videos for regular communication would enhance levels of parental involvement at one North Texas Christian Middle School (pseudonym). The following questions guided this study: 1. What is the beginning level of parental…

  9. Optical beam forming techniques for phased array antennas

    NASA Technical Reports Server (NTRS)

    Wu, Te-Kao; Chandler, C.

    1993-01-01

    Conventional phased array antennas using waveguide or coax for signal distribution are impractical for large-scale implementation on satellites or spacecraft because they exhibit prohibitively large system size, heavy weight, high attenuation loss, limited bandwidth, and sensitivity to electromagnetic interference (EMI), temperature drifts, and phase instability. Optical beam forming systems, by contrast, are smaller, lighter, and more flexible. Three optical beam forming techniques are identified as applicable to large spaceborne phased array antennas: (1) optical fiber replacement of conventional RF phased array distribution and control components, (2) spatial beam forming, and (3) optical beam splitting with integrated quasi-optical components. The optical fiber replacement and spatial beam forming approaches have been pursued by many organizations. Two new optical beam forming architectures are presented. Both architectures involve monolithic integration of the antenna radiating elements with quasi-optical grid detector arrays, whose advantages in the optical process are higher power-handling capability and dynamic range. The first architecture is a modified version of the original spatial beam forming approach; the basic difference is the spatial light modulator (SLM) device for controlling the aperture field distribution. The original liquid crystal light valve SLM is replaced by an optical shuffling SLM, which was demonstrated for 'smart pixel' technology. The advantages are the capability of generating the agile beams of a phased array antenna and of providing simultaneous transmit and receive functions. The second architecture is the optical beam splitting approach, which provides an alternative amplitude control for each antenna element with an optical beam power divider comprised of mirrors and beam splitters. 
    It also implements a quasi-optical grid phase shifter for phase control and a grid amplifier for RF power. The advantages are that no SLM is required and that the complete antenna system is capable of full monolithic integration.

  10. Microbial electricity generation in rice paddy fields: recent advances and perspectives in rhizosphere microbial fuel cells.

    PubMed

    Kouzuma, Atsushi; Kaku, Nobuo; Watanabe, Kazuya

    2014-12-01

    Microbial fuel cells (MFCs) are devices that use living microbes for the conversion of organic matter into electricity. MFC systems can be applied to the generation of electricity at water/sediment interfaces in the environment, such as bay areas, wetlands, and rice paddy fields. Using these systems, electricity generation in paddy fields as high as ∼80 mW m(-2) (based on the projected anode area) has been demonstrated, and evidence suggests that rhizosphere microbes preferentially utilize organic exudates from rice roots for generating electricity. Phylogenetic and metagenomic analyses have been conducted to identify the microbial species and catabolic pathways that are involved in the conversion of root exudates into electricity, suggesting the importance of syntrophic interactions. In parallel, pot cultures of rice and other aquatic plants have been used for rhizosphere MFC experiments under controlled laboratory conditions. The findings from these studies have demonstrated the potential of electricity generation for mitigating methane emission from the rhizosphere. Notably, however, the presence of large amounts of organics in the rhizosphere drastically reduces the effect of electricity generation on methane production. Further studies are necessary to evaluate the potential of these systems for mitigating methane emission from rice paddy fields. We suggest that paddy-field MFCs represent a promising approach for harvesting latent energy of the natural world.

  11. Generation Expansion Planning With Large Amounts of Wind Power via Decision-Dependent Stochastic Programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhan, Yiduo; Zheng, Qipeng P.; Wang, Jianhui

    Power generation expansion planning needs to deal with future uncertainties carefully, given that the invested generation assets will be in operation for a long time. Many stochastic programming models have been proposed to tackle this challenge. However, most previous works assume predetermined future uncertainties (i.e., fixed random outcomes with given probabilities). In several recent studies of generation assets' planning (e.g., thermal versus renewable), new findings show that the investment decisions could affect the future uncertainties as well. To this end, this paper proposes a multistage decision-dependent stochastic optimization model for long-term large-scale generation expansion planning in which large amounts of wind power are involved. In the decision-dependent model, the future uncertainties are not only affecting but also affected by the current decisions. In particular, the probability distribution function is determined by decision variables as well as input parameters. To deal with the nonlinear constraints in the model, a quasi-exact solution approach is introduced to reformulate the multistage stochastic investment model as a mixed-integer linear programming model. The wind penetration, investment decisions, and the optimality of the decision-dependent model are evaluated in a series of multistage case studies. The results show that the proposed decision-dependent model provides effective optimization solutions for long-term generation expansion planning.
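    A toy two-stage version of decision dependence (invented numbers, enumeration instead of the paper's MILP reformulation) shows the defining feature: the scenario probabilities are a function of the investment decision itself:

```python
# Toy decision-dependent stochastic expansion problem: building more wind
# capacity shifts the probability of the favourable wind scenario, so the
# decision affects the uncertainty, not just the other way around.
# All costs and probabilities are invented for illustration.
DEMAND = 100.0            # MW that must be served in every scenario
WIND_CAP_COST = 1.2       # investment cost per MW of wind
GAS_ENERGY_COST = 3.0     # recourse cost per MW served by gas

def expected_cost(wind_mw):
    # Decision-dependent probability: more wind investment raises the odds
    # of the "high output" scenario (e.g., via better siting/forecasting).
    p_high = min(0.9, 0.3 + 0.004 * wind_mw)
    scenarios = [
        (p_high, 0.8),          # high: wind delivers 80% of capacity
        (1 - p_high, 0.3),      # low:  wind delivers 30% of capacity
    ]
    cost = WIND_CAP_COST * wind_mw
    for prob, cap_factor in scenarios:
        shortfall = max(0.0, DEMAND - cap_factor * wind_mw)
        cost += prob * GAS_ENERGY_COST * shortfall
    return cost

# Enumerate candidate investment levels (a stand-in for the MILP solve).
best = min(range(0, 201, 5), key=expected_cost)
```

Because `p_high` multiplies scenario costs that themselves depend on `wind_mw`, the objective is nonlinear in the decision, which is exactly why the paper needs a reformulation to recover a mixed-integer linear program.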

  12. Increasing power generation in horizontal axis wind turbines using optimized flow control

    NASA Astrophysics Data System (ADS)

    Cooney, John A., Jr.

    In order to effectively realize future goals for wind energy, the efficiency of wind turbines must increase beyond existing technology. One direct method for achieving increased efficiency is by improving the individual power generation characteristics of horizontal axis wind turbines. The potential for additional improvement by traditional approaches is diminishing rapidly however. As a result, a research program was undertaken to assess the potential of using distributed flow control to increase power generation. The overall objective was the development of validated aerodynamic simulations and flow control approaches to improve wind turbine power generation characteristics. BEM analysis was conducted for a general set of wind turbine models encompassing last, current, and next generation designs. This analysis indicated that rotor lift control applied in Region II of the turbine power curve would produce a notable increase in annual power generated. This was achieved by optimizing induction factors along the rotor blade for maximum power generation. In order to demonstrate this approach and other advanced concepts, the University of Notre Dame established the Laboratory for Enhanced Wind Energy Design (eWiND). This initiative includes a fully instrumented meteorological tower and two pitch-controlled wind turbines. The wind turbines are representative in their design and operation to larger multi-megawatt turbines, but of a scale that allows rotors to be easily instrumented and replaced to explore new design concepts. Baseline data detailing typical site conditions and turbine operation is presented. To realize optimized performance, lift control systems were designed and evaluated in CFD simulations coupled with shape optimization tools. These were integrated into a systematic design methodology involving BEM simulations, CFD simulations and shape optimization, and selected experimental validation. 
To refine and illustrate the proposed design methodology, a complete design cycle was performed for the turbine model incorporated in the wind energy lab. Enhanced power generation was obtained through passive trailing edge shaping aimed at reaching lift and lift-to-drag goals predicted to optimize performance. These targets were determined by BEM analysis to improve power generation characteristics and annual energy production (AEP) for the wind turbine. A preliminary design was validated in wind tunnel experiments on a 2D rotor section in preparation for testing in the full atmospheric environment of the eWiND Laboratory. These tests were performed for the full-scale geometry and atmospheric conditions. Upon making additional improvements to the shape optimization tools, a series of trailing edge additions were designed to optimize power generation. The trailing edge additions were predicted to increase the AEP by up to 4.2% at the White Field site. The pieces were rapid-prototyped and installed on the wind turbine in March 2014. Field tests are ongoing.
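
    The Region II strategy described above rests on the classical actuator-disk/BEM result that rotor power is maximized at a particular axial induction factor. The following sketch (not the authors' code; all values are the textbook idealization, not the eWiND turbine) recovers that optimum numerically.

```python
# Illustrative sketch: the BEM/actuator-disk power coefficient
# Cp(a) = 4a(1-a)^2 is maximized at induction factor a = 1/3 (the Betz
# optimum), which is what Region II lift control tries to maintain
# along the blade span.

def power_coefficient(a):
    """Actuator-disk power coefficient Cp = 4a(1-a)^2."""
    return 4.0 * a * (1.0 - a) ** 2

# Grid-search the induction factor over its physical range 0 <= a <= 0.5.
best_a, best_cp = max(
    ((a / 1000.0, power_coefficient(a / 1000.0)) for a in range(501)),
    key=lambda pair: pair[1],
)

print(f"optimal a ~ {best_a:.3f}, Cp ~ {best_cp:.4f}")  # a ~ 0.333, Cp ~ 0.5926 (Betz limit 16/27)
```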

  13. Learning in Earth and space science: a review of conceptual change instructional approaches

    NASA Astrophysics Data System (ADS)

    Mills, Reece; Tomas, Louisa; Lewthwaite, Brian

    2016-03-01

    In response to calls for research into effective instruction in the Earth and space sciences, and to identify directions for future research, this systematic review of the literature explores research into instructional approaches designed to facilitate conceptual change. In total, 52 studies were identified and analyzed. Analysis focused on the general characteristics of the research, the conceptual change instructional approaches that were used, and the methods employed to evaluate the effectiveness of these approaches. The findings of this review support four assertions about the existing research: (1) astronomical phenomena have received greater attention than geological phenomena; (2) most studies have viewed conceptual change from a cognitive perspective only; (3) data about conceptual change were generated pre- and post-intervention only; and (4) the interventions reviewed presented limited opportunities to involve students in the construction and manipulation of multiple representations of the phenomenon being investigated. Based upon these assertions, the authors recommend that new research in the Earth and space science disciplines challenge traditional notions of conceptual change by exploring the role of affective variables in learning, focus on the learning of geological phenomena through the construction of multiple representations, and employ qualitative data collection throughout the implementation of an instructional approach.

  14. Iron-mediated redox modulation in neural plasticity

    PubMed Central

    Muñoz, Pablo

    2012-01-01

    Research on the role of iron in brain physiology has focused on the neuropathological effects of iron-induced oxidative stress. However, our recent work has established a physiological relationship between iron-mediated oxidative modification and normal neuronal function. Our results obtained from hippocampal neurons suggest that iron-generated reactive oxygen species (ROS) are involved in calcium signaling initiated by stimulation of NMDA receptors. This signal is amplified by ryanodine receptors (RyR), a redox-sensitive calcium channel, allowing the phosphorylation and nuclear translocation of ERK1/2. Furthermore, using electrophysiological approaches, we showed that iron is required for basal synaptic transmission and full expression of long-term potentiation, a type of synaptic plasticity. Combined, our data suggest that the oxidative effect of iron is critical for activating processes downstream of NMDAR activation. Finally, given the high reactivity of DNA with iron-generated ROS, we hypothesize an additional function of iron in gene regulation. PMID:22808323

  15. A Community-Based Participatory Approach to Personalized, Computer-Generated Nutrition Feedback Reports: The Healthy Environments Partnership

    PubMed Central

    Kannan, Srimathi; Schulz, Amy; Israel, Barbara; Ayra, Indira; Weir, Sheryl; Dvonch, Timothy J.; Rowe, Zachary; Miller, Patricia; Benjamin, Alison

    2008-01-01

    Background Computer tailoring and personalizing recommendations for dietary health-promoting behaviors are in accordance with community-based participatory research (CBPR) principles, which emphasize research that benefits the participants and community involved. Objective To describe the CBPR process utilized to computer-generate and disseminate personalized nutrition feedback reports (NFRs) for Detroit Healthy Environments Partnership (HEP) study participants. Methods The CBPR process included discussion and feedback from HEP partners on several draft personalized reports. The nutrition feedback process included defining the feedback objectives; prioritizing the nutrients; customizing the report design; reviewing and revising the NFR template and readability; producing and disseminating the report; and participant follow-up. Lessons Learned Application of CBPR principles in designing the NFR resulted in a reader-friendly product with useful recommendations to promote heart health. Conclusions A CBPR process can enhance computer tailoring of personalized NFRs to address racial and socioeconomic disparities in cardiovascular disease (CVD). PMID:19337572

  16. The application of connectionism to query planning/scheduling in intelligent user interfaces

    NASA Technical Reports Server (NTRS)

    Short, Nicholas, Jr.; Shastri, Lokendra

    1990-01-01

    In the mid-nineties, the Earth Observing System (EOS) will generate an estimated 10 terabytes of data per day. This enormous amount of data will require the use of sophisticated technologies from real-time distributed Artificial Intelligence (AI) and data management. Setting aside the broader problems of distributed AI, efficient models were developed for query planning and/or scheduling in intelligent user interfaces that reside in a network environment. Before intelligent query planning can be done, a model for real-time AI planning and/or scheduling must be developed. Because Connectionist Models (CMs) have shown promise in improving run times, a connectionist approach to AI planning and/or scheduling is proposed. The solution involves merging a CM rule-based system with a general spreading-activation model for the generation and selection of plans. The system was implemented in the Rochester Connectionist Simulator and runs on a Sun 3/260.
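
    The core mechanism described, plan selection by spreading activation over a connectionist network, can be sketched in a few lines. This is a hypothetical toy (node names, weights, and the decay schedule are invented for illustration); the original system was a rule-based network running in the Rochester Connectionist Simulator.

```python
# Toy spreading-activation sketch: activation injected at query nodes
# propagates over weighted links, and the plan node that accumulates the
# most activation is selected.

links = {  # directed, weighted connections: source -> [(target, weight)]
    "query:cloud_cover": [("plan:use_ir_band", 0.9), ("plan:use_visible", 0.4)],
    "query:night_pass":  [("plan:use_ir_band", 0.8)],
}

def spread(sources, steps=2, decay=0.5):
    """Propagate activation from source nodes for a fixed number of steps."""
    activation = {node: 1.0 for node in sources}
    for _ in range(steps):
        nxt = dict(activation)
        for node, level in activation.items():
            for target, weight in links.get(node, []):
                nxt[target] = nxt.get(target, 0.0) + decay * weight * level
        activation = nxt
    return activation

act = spread(["query:cloud_cover", "query:night_pass"])
best_plan = max((n for n in act if n.startswith("plan:")), key=act.get)
print(best_plan)  # the IR-band plan wins: it receives activation from both queries
```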

  17. A laser based frequency modulated NL-OSL phenomenon

    NASA Astrophysics Data System (ADS)

    Mishra, D. R.; Bishnoi, A. S.; Soni, Anuj; Rawat, N. S.; Bhatt, B. C.; Kulkarni, M. S.; Babu, D. A. R.

    2015-01-01

    A detailed theoretical and experimental treatment of a novel pulse frequency modulated stimulation (PFMS) method is described for the NL-OSL phenomenon. The method involves pulsed frequency modulation with respect to time, at fixed pulse width, of 532 nm continuous-wave (CW) laser light. Linearly modulated (LM) and non-linearly modulated (NL) stimulation profiles were generated using a fast electromagnetic optical shutter, and the PFMS parameters were determined for the present experimental setup. PFMS-based LM- and NL-OSL studies were carried out on dosimetry-grade single-crystal α-Al2O3:C. The photoionization cross section of α-Al2O3:C was found to be ∼9.97 × 10⁻¹⁹ cm² for 532 nm laser light using PFMS LM-OSL studies, under the assumption of first-order kinetics. PFMS is thus a potential alternative for generating different stimulation profiles using CW light sources.
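
    The first-order-kinetics assumption used to extract the cross section can be illustrated with the standard LM-OSL curve shape. The sketch below (all parameter values invented, not the α-Al2O3:C measurement) checks numerically that the simulated peak falls at the analytic position t_max = sqrt(T / (σφ_max)), the relation commonly used to back out σ from an LM-OSL peak.

```python
import math

# Illustrative first-order-kinetics LM-OSL model: stimulation intensity
# ramps linearly from 0 to phi_max over ramp time T, giving
#   I(t) = n0 * sigma * phi_max * (t/T) * exp(-sigma * phi_max * t^2 / (2T)),
# which peaks at t_max = sqrt(T / (sigma * phi_max)).

sigma = 1.0e-18    # photoionization cross section, cm^2 (assumed)
phi_max = 1.0e18   # peak photon flux, photons cm^-2 s^-1 (assumed)
T = 100.0          # ramp duration, s
n0 = 1.0           # initial trapped-charge population (normalized)

def lm_osl(t):
    rate = sigma * phi_max  # effective stimulation rate at full flux, s^-1
    return n0 * rate * (t / T) * math.exp(-rate * t * t / (2.0 * T))

# Locate the simulated peak and compare with the analytic t_max.
ts = [i * 0.01 for i in range(1, 20001)]
t_peak = max(ts, key=lm_osl)
t_analytic = math.sqrt(T / (sigma * phi_max))
print(t_peak, t_analytic)  # both ~ 10 s
```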

  18. A Cartesian grid approach with hierarchical refinement for compressible flows

    NASA Technical Reports Server (NTRS)

    Quirk, James J.

    1994-01-01

    Many numerical studies of flows that involve complex geometries are limited by the difficulties in generating suitable grids. We present a Cartesian boundary scheme for two-dimensional, compressible flows that is unfettered by the need to generate a computational grid and so it may be used, routinely, even for the most awkward of geometries. In essence, an arbitrary-shaped body is allowed to blank out some region of a background Cartesian mesh and the resultant cut-cells are singled out for special treatment. This is done within a finite-volume framework and so, in principle, any explicit flux-based integration scheme can take advantage of this method for enforcing solid boundary conditions. For best effect, the present Cartesian boundary scheme has been combined with a sophisticated, local mesh refinement scheme, and a number of examples are shown in order to demonstrate the efficacy of the combined algorithm for simulations of shock interaction phenomena.
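
    The blanking step described above, where a body carves cells out of a background Cartesian mesh and the straddling cells become cut-cells, can be sketched as a simple corner-classification pass. This is a minimal illustration (circular body, mesh extents, and the corner test are all invented choices), not the paper's algorithm.

```python
# Minimal cut-cell sketch: an arbitrary body "blanks out" part of a
# background Cartesian mesh; cells whose corners straddle the body
# boundary are flagged for special (cut-cell) flux treatment.

def classify_cells(nx, ny, h, inside):
    """Label each cell of an nx-by-ny Cartesian mesh of spacing h."""
    labels = {}
    for i in range(nx):
        for j in range(ny):
            corners = [(i * h, j * h), ((i + 1) * h, j * h),
                       (i * h, (j + 1) * h), ((i + 1) * h, (j + 1) * h)]
            flags = [inside(x, y) for x, y in corners]
            if all(flags):
                labels[(i, j)] = "solid"   # fully blanked out by the body
            elif not any(flags):
                labels[(i, j)] = "fluid"   # ordinary finite-volume cell
            else:
                labels[(i, j)] = "cut"     # needs special boundary treatment
    return labels

# Example: a circular body of radius 0.3 centered in a unit square.
in_circle = lambda x, y: (x - 0.5) ** 2 + (y - 0.5) ** 2 < 0.3 ** 2
labels = classify_cells(20, 20, 0.05, in_circle)
print(sum(1 for v in labels.values() if v == "cut"), "cut cells")
```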

  19. Thiolation mediated pegylation platform to generate functional universal red blood cells.

    PubMed

    Nacharaju, Parimala; Manjula, Belur N; Acharya, Seetharama A

    2007-01-01

    PEGylation that adds an extension arm on protein amino groups, conserving their positive charge, masks the A and D antigens of erythrocytes efficiently. In the present study, the efficiency of masking RBC antigens by PEGylation protocols that do not conserve the charge, with and without extension arms, is compared. The conjugation of PEG-5000 to RBCs through the addition of extension arms masked the D antigen more efficiently than the other protocol. A combination of PEG-5K and PEG-20K is needed to mask the A antigen, irrespective of the PEGylation approach. The oxygen affinity of the PEGylated RBCs increased with extension-arm-facilitated PEGylation, whereas the protocol conjugating PEG chains without an extension arm did not alter the oxygen affinity of RBCs. A combination of PEGylation protocols is an alternate strategy to generate universal red blood cells with good levels of oxygen affinity.

  20. The contact sport of rough surfaces

    NASA Astrophysics Data System (ADS)

    Carpick, Robert W.

    2018-01-01

    Describing the way two surfaces touch and make contact may seem simple, but it is not. Fully describing the elastic deformation of ideally smooth contacting bodies, under even low applied pressure, involves second-order partial differential equations and fourth-rank elastic constant tensors. For more realistic rough surfaces, the problem becomes a multiscale exercise in surface-height statistics, even before including complex phenomena such as adhesion, plasticity, and fracture. A recent research competition, the “Contact Mechanics Challenge” (1), was designed to test various approximate methods for solving this problem. A hypothetical rough surface was generated, and the community was invited to model contact with this surface with competing theories for the calculation of properties, including contact area and pressure. A supercomputer-generated numerical solution was kept secret until competition entries were received. The comparison of results (2) provides insights into the relative merits of competing models and even experimental approaches to the problem.
