Sample records for concept processes solutions

  1. A modular framework for biomedical concept recognition

    PubMed Central

    2013-01-01

    Background Concept recognition is an essential task in biomedical information extraction, presenting several complex and unsolved challenges. The development of such solutions is typically performed in an ad-hoc manner or using general information extraction frameworks, which are not optimized for the biomedical domain and normally require the integration of complex external libraries and/or the development of custom tools. Results This article presents Neji, an open source framework optimized for biomedical concept recognition built around four key characteristics: modularity, scalability, speed, and usability. It integrates modules for biomedical natural language processing, such as sentence splitting, tokenization, lemmatization, part-of-speech tagging, chunking and dependency parsing. Concept recognition is provided through dictionary matching and machine learning with normalization methods. Neji also integrates an innovative concept tree implementation, supporting overlapped concept names and respective disambiguation techniques. The most popular input and output formats, namely Pubmed XML, IeXML, CoNLL and A1, are also supported. On top of the built-in functionalities, developers and researchers can implement new processing modules or pipelines, or use the provided command-line interface tool to build their own solutions, applying the most appropriate techniques to identify heterogeneous biomedical concepts. Neji was evaluated against three gold standard corpora with heterogeneous biomedical concepts (CRAFT, AnEM and NCBI disease corpus), achieving high performance results on named entity recognition (F1-measure for overlap matching: species 95%, cell 92%, cellular components 83%, gene and proteins 76%, chemicals 65%, biological processes and molecular functions 63%, disorders 85%, and anatomical entities 82%) and on entity normalization (F1-measure for overlap name matching and correct identifier included in the returned list of identifiers: species 88%, cell 71%, cellular components 72%, gene and proteins 64%, chemicals 53%, and biological processes and molecular functions 40%). Neji provides fast and multi-threaded data processing, annotating up to 1200 sentences/second when using dictionary-based concept identification. Conclusions Considering the provided features and underlying characteristics, we believe that Neji is an important contribution to the biomedical community, streamlining the development of complex concept recognition solutions. Neji is freely available at http://bioinformatics.ua.pt/neji. PMID:24063607
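
    To make the dictionary-matching step described above concrete, the sketch below shows a minimal dictionary-based concept recognizer with normalization to identifiers. It is an illustration only, not Neji's implementation; the toy dictionary, identifiers, and function names are assumptions.

```python
# Minimal sketch of dictionary-based concept recognition with normalization.
# NOT Neji's implementation; the dictionary entries and identifiers are
# illustrative assumptions only.

import re

# Hypothetical dictionary mapping surface forms to concept identifiers.
DICTIONARY = {
    "breast cancer": "DOID:1612",
    "tp53": "NCBIGene:7157",
    "apoptosis": "GO:0006915",
}

def recognize(text):
    """Return (start, end, mention, identifier) tuples for dictionary hits."""
    hits = []
    lowered = text.lower()
    for term, concept_id in DICTIONARY.items():
        for match in re.finditer(r"\b" + re.escape(term) + r"\b", lowered):
            hits.append((match.start(), match.end(),
                         text[match.start():match.end()], concept_id))
    return sorted(hits)

print(recognize("TP53 mutations are common in breast cancer and affect apoptosis."))
```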

  2. Dynamics of a Definition: A Framework to Analyse Student Construction of the Concept of Solution to a Differential Equation

    ERIC Educational Resources Information Center

    Raychaudhuri, Debasree

    2008-01-01

    In this note we develop a framework that makes explicit the inherent dynamic structure of certain mathematical definitions by means of the four facets of context-entity-process-object. These facets and their interrelations are then used to capture and interpret specific aspects of student constructions of the concept of solution to first order…

  3. From the past to the future: Integrating work experience into the design process.

    PubMed

    Bittencourt, João Marcos; Duarte, Francisco; Béguin, Pascal

    2017-01-01

    Integrating work activity issues into the design process is a broadly discussed theme in ergonomics. Participation is presented as the main means for such integration. However, late participation can limit the development of both project solutions and future work activity. This article presents the concept of construction of experience, aiming at the articulated development of future activities and project solutions. It is a non-teleological approach in which the initial concepts are transformed by the experience built up throughout the design process. The method applied was a case study of ergonomic participation during the design of a new laboratory complex for biotechnology research. Data were obtained through analysis of records in a simulation process using a Lego scale model and interviews with project participants. The simulation process allowed for developing new ways of working and generating changes in the initial design solutions, which enabled workers to adopt their own developed strategies for conducting work more safely and efficiently in the future work system. Each project decision either opens or closes a window of opportunities for developing a future activity. Construction of experience in a non-teleological design process allows for understanding the consequences of project solutions for future work.

  4. Exergie /4th revised and enlarged edition/

    NASA Astrophysics Data System (ADS)

    Baloh, T.; Wittwer, E.

    The theoretical concept of exergy is explained and its practical applications are discussed. Equilibrium and thermal equilibrium are reviewed as background, and exergy is considered as a reference point for solid-liquid, liquid-liquid, and liquid-gas systems. Exergetic calculations and their graphic depictions are covered. The concepts of enthalpy and entropy are reviewed in detail, including their applications to gas mixtures, solutions, and isolated substances. The exergy of gas mixtures, solutions, and isolated substances is discussed, including moist air, liquid water in water vapor, dry air, and saturation-limited solutions. Mollier exergy-enthalpy-entropy diagrams are presented for two-component systems, and exergy losses for throttling, isobaric mixing, and heat transfer are addressed. The relationship of exergy to various processes is covered, including chemical processes, combustion, and nuclear reactions. The optimization of evaporation plants through exergy is discussed. Calculative examples are presented for energy production and heating, industrial chemical processes, separation of liquid air, nuclear reactors, and others.

  5. Practical solution concepts for planning and designing roadways in Kentucky.

    DOT National Transportation Integrated Search

    2008-10-01

    Kentucky's highway agency has embarked upon an initiative tagged "Practical Solutions" which sets its goal toward reducing costs throughout the project development process extended into operations and maintenance of all highway facilities. This study...

  6. Constructing conceptual knowledge and promoting "number sense" from computer-managed practice in rounding whole numbers

    NASA Astrophysics Data System (ADS)

    Hativa, Nira

    1993-12-01

    This study sought to identify how high achievers learn and understand new concepts in arithmetic from computer-based practice which provides full solutions to examples but without verbal explanations. Four high-achieving second graders were observed in their natural school settings throughout all their computer-based practice sessions which involved the concept of rounding whole numbers, a concept which was totally new to them. Immediate post-session interviews inquired into students' strategies for solutions, errors, and their understanding of the underlying mathematical rules. The article describes the process through which the students construct their knowledge of the rounding concepts and the errors and misconceptions encountered in this process. The article identifies the cognitive abilities that promote student self-learning of the rounding concepts, their number concepts and "number sense." Differences in the ability to generalise, "mathematical memory," mindfulness of work and use of cognitive strategies are shown to account for the differences in patterns of, and gains in, learning and in maintaining knowledge among the students involved. Implications for the teaching of estimation concepts and of promoting students' "number sense," as well as for classroom use of computer-based practice are discussed.

  7. A Hypermedia Environment To Explore and Negotiate Students' Conceptions: Animation of the Solution Process of Table Salt.

    ERIC Educational Resources Information Center

    Ebenezer, Jazlin V.

    2001-01-01

    Describes the characteristics and values of hypermedia for learning chemistry. Reports on how a hypermedia environment was used to explore a group of 11th grade chemistry students' conceptions of table salt dissolving in water. Indicates that a hypermedia environment can be used to explore, negotiate, and assess students' conceptions of…

  8. Microwave vision for robots

    NASA Technical Reports Server (NTRS)

    Lewandowski, Leon; Struckman, Keith

    1994-01-01

    Microwave Vision (MV), a concept originally developed in 1985, could play a significant role in the solution to robotic vision problems. Originally our Microwave Vision concept was based on a pattern matching approach employing computer based stored replica correlation processing. Artificial Neural Network (ANN) processor technology offers an attractive alternative to the correlation processing approach, namely the ability to learn and to adapt to changing environments. This paper describes the Microwave Vision concept, some initial ANN-MV experiments, and the design of an ANN-MV system that has led to a second patent disclosure in the robotic vision field.

  9. An Advection-Diffusion Concept for Solute Transport in Heterogeneous Unconsolidated Geological Deposits

    NASA Astrophysics Data System (ADS)

    Gillham, R. W.; Sudicky, E. A.; Cherry, J. A.; Frind, E. O.

    1984-03-01

    In layered permeable deposits with flow predominately parallel to the bedding, advection causes rapid solute transport in the more permeable layers. As the solute advances more rapidly in these layers, solute mass is continually transferred to the less permeable layers as a result of molecular diffusion due to the concentration gradient between the layers. The interlayer solute transfer causes the concentration to decline along the permeable layers at the expense of increasing the concentration in the less permeable layers, which produces strongly dispersed concentration profiles in the direction of flow. The key parameters affecting the dispersive capability of the layered system are the diffusion coefficients for the less permeable layers, the thicknesses of the layers, and the hydraulic conductivity contrasts between the layers. Because interlayer solute transfer by transverse molecular diffusion is a time-dependent process, the advection-diffusion concept predicts a rate of longitudinal spreading during the development of the dispersion process that is inconsistent with the classical Fickian dispersion model. A second consequence of the solute-storage effect offered by transverse diffusion into low-permeability layers is a rate of migration of the frontal portion of a contaminant in the permeable layers that is less than the groundwater velocity. Although various lines of evidence are presented in support of the advection-diffusion concept, more work is required to determine the range of geological materials for which it is applicable and to develop mathematical expressions that will make it useful as a predictive tool for application to field cases of contaminant migration.
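
    A compact way to state the concept is the standard two-domain idealization below (a textbook-style sketch, not necessarily the authors' exact formulation): advection at velocity v along a permeable layer of half-thickness b and porosity θ, coupled to transverse molecular diffusion (effective coefficient D′, porosity θ′) into the adjacent low-permeability layer, with the concentrations c and c′ matched at the interface z = b.

```latex
\frac{\partial c}{\partial t} + v\,\frac{\partial c}{\partial x}
  = \frac{\theta' D'}{\theta\, b}\,
    \left.\frac{\partial c'}{\partial z}\right|_{z=b},
\qquad
\frac{\partial c'}{\partial t} = D'\,\frac{\partial^2 c'}{\partial z^2}
\quad (z > b),
\qquad
c'(x, b, t) = c(x, t).
```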

  10. Application and Validation of Concept Maturity Assessment Framework

    DTIC Science & Technology

    2011-03-01

    process. The following chapter will discuss a proposed methodology for validation of the concept maturity framework and its Concept Evaluation and...of each contractor's conceptual solution and any gaps in information that may have been overlooked. The organization also commented that the... conceptual and does not have a specific system tied to it is often vulnerable to losing interest and potentially funding from decision makers. However

  11. Conceptual design of a device to measure hand swelling in a micro-gravity environment

    NASA Technical Reports Server (NTRS)

    Hysinger, Christopher L.

    1993-01-01

    In the design of pressurized suits for use by astronauts in space, proper fit is an important consideration. One particularly difficult aspect of the suit design is the design of the gloves. If the gloves of the suit do not fit properly, the grip strength of the astronaut can be decreased by as much as fifty percent. These gloves are designed using an iterative process and can cost over 1.5 million dollars. Glove design is further complicated by the way the body behaves in a micro-gravity environment. In a micro-gravity setting, fluid from the lower body tends to move into the upper body. Some of this fluid collects in the hands and causes the hands to swell. Therefore, a pair of gloves that fit well on earth may not fit well when they are used in space. The conceptual design process for a device which can measure the swelling that occurs in the hands in a micro-gravity environment is described. This process involves developing a specifications list and function structure for the device and generating solution variants for each of the sub functions. The solution variants are then filtered, with the variants that violate any of the specifications being discarded. After acceptable solution variants are obtained, they are combined to form design concepts. These design concepts are evaluated against a set of criteria and the design concepts are ranked in order of preference. Through this process, the two most plausible design concepts were an ultrasonic imaging technique and a laser mapping technique. Both of these methods create a three dimensional model of the hand, from which the amount of swelling can be determined. In order to determine which of the two solutions will actually work best, a further analysis will need to be performed.

  12. Development of a measure of work motivation for a meta-theory of motivation.

    PubMed

    Ryan, James C

    2011-06-01

    This study presents a measure of work motivation designed to assess the motivational concepts of the meta-theory of motivation. These concepts include intrinsic process motivation, goal internalization motivation, instrumental motivation, external self-concept motivation, and internal self-concept motivation. Following a process of statement development and identification, six statements for each concept were presented to a sample of working professionals (N = 330) via a paper-and-pencil questionnaire. Parallel analysis supported a 5-factor solution, with a varimax rotation identifying 5 factors accounting for 48.9% of total variance. All 5 scales had Cronbach alpha coefficients above .70. Limitations of the newly proposed questionnaire and suggestions for its further development and use are discussed.
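
    As a small illustration of the reliability figures quoted above (Cronbach alpha above .70 for each six-item scale), the sketch below computes Cronbach's alpha from a synthetic response matrix. The data and scale structure are assumptions; this is not the study's questionnaire or dataset.

```python
# Illustrative computation of Cronbach's alpha for a 6-item scale. The data
# below are synthetic and only mimic the N = 330 sample size reported above.

import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
true_score = rng.normal(size=(330, 1))                          # shared construct
responses = true_score + rng.normal(scale=1.0, size=(330, 6))   # six items
print(round(cronbach_alpha(responses), 2))                      # typically > .70 here
```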

  13. Spontaneous Group Learning in Ambient Learning Environments

    NASA Astrophysics Data System (ADS)

    Bick, Markus; Jughardt, Achim; Pawlowski, Jan M.; Veith, Patrick

    Spontaneous Group Learning is a concept to form and facilitate face-to-face, ad-hoc learning groups in collaborative settings. We show how to use Ambient Intelligence to identify, support, and initiate group processes. Learners' positions are determined by widely used technologies, e.g., Bluetooth and WLAN. As a second step, learners' positions, tasks, and interests are visualized. Finally, a group process is initiated supported by relevant documents and services. Our solution is a starting point to develop new didactical solutions for collaborative processes.

  14. Information management for commercial aviation - A research perspective

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.; Abbott, Kathy H.; Jonsson, Jon E.; Boucek, George; Rogers, William H.

    1991-01-01

    The problem of flight deck information management (IM), defined as processing, controlling, and directing information, for commercial flight decks, and a research effort underway to address this problem, are discussed. The premises provided are utilized to lay the groundwork required for such research by providing a framework to describe IM problems and an avenue to follow when investigating solution concepts. The research issues presented serve to identify specific questions necessary to achieve a better understanding of the IM problem, and to provide assessments of the relative merit of various solution concepts.

  15. An overview on tritium permeation barrier development for WCLL blanket concept

    NASA Astrophysics Data System (ADS)

    Aiello, A.; Ciampichetti, A.; Benamati, G.

    2004-08-01

    The reduction of tritium permeation through blanket structural materials and cooling tubes has to be carefully evaluated to minimise radiological hazards. A strong effort has been made in the past to select the best technological solution for the realisation of tritium permeation barriers (TPB) on complex structures not directly accessible after the completion of the manufacturing process. The best solution was identified in aluminium rich coatings, which form Al2O3 at their surface. Two technologies were selected as reference for the realisation of coating in the WCLL blanket concept: the chemical vapour deposition (CVD) process developed on laboratory scale by CEA, and the hot dipping (HD) process developed by FZK. The results obtained during three years of tests on CVD and HD coated specimens in gas and liquid metal phase are summarised and discussed.

  16. Introduction to Command, Control and Communications (C3) Through Comparative Case Analysis

    DTIC Science & Technology

    1990-03-01

    enhancing the process of learning from experience. Case study allows the student to apply concepts, theories, and techniques to an actual incident within...part of the thesis describes selected principles and concepts of C3 related to communication management, interoperability, command structure and...The solutions to the cases require applying the principles and concepts presented in the first part. The four cases are: (1) the Iran hostage rescue

  17. DESCRIPTION OF ATMOSPHERIC TRANSPORT PROCESSES IN EULERIAN AIR QUALITY MODELS

    EPA Science Inventory

    Key differences among many types of air quality models are the way atmospheric advection and turbulent diffusion processes are treated. Gaussian models use analytical solutions of the advection-diffusion equations. Lagrangian models use a hypothetical air parcel concept effecti...
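
    For reference, the analytical solution alluded to for Gaussian models is typically the steady-state Gaussian plume equation (a standard textbook form, not a statement about any specific EPA model), with emission rate Q, mean wind speed u, effective release height H, and dispersion parameters σy(x) and σz(x):

```latex
C(x,y,z) = \frac{Q}{2\pi u \,\sigma_y \sigma_z}
           \exp\!\left(-\frac{y^2}{2\sigma_y^2}\right)
           \left[\exp\!\left(-\frac{(z-H)^2}{2\sigma_z^2}\right)
               + \exp\!\left(-\frac{(z+H)^2}{2\sigma_z^2}\right)\right]
```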

  18. Addressing Student Misconceptions Concerning Electron Flow in Aqueous Solutions with Instruction Including Computer Animations and Conceptual Change Strategies.

    ERIC Educational Resources Information Center

    Sanger, Michael J.; Greenbowe, Thomas J.

    2000-01-01

    Investigates the effects of both computer animations of microscopic chemical processes occurring in a galvanic cell and conceptual-change instruction based on chemical demonstrations on students' conceptions of current flow in electrolyte solutions. Finds that conceptual change instruction was effective at dispelling student misconceptions but…

  19. Optimisation of maintenance concept choice using risk-decision factor - a case study

    NASA Astrophysics Data System (ADS)

    Popovic, Vladimir M.; Vasic, Branko M.; Rakicevic, Branislav B.; Vorotovic, Goran S.

    2012-10-01

    The design of a maintenance system and the corresponding logistic support is a very complex process, during which the aim is to find compromise solutions regarding the relations among different maintenance procedures and the ways of their implementation. As a result, various solutions can be adopted, since this is conditioned by a series of important factors and criteria, which can sometimes be contradictory. There are different perspectives on ways of solving practical maintenance problems, that is, dilemmas when it comes to the choice of maintenance concept. The principal dilemma is how and when to decide on carrying out maintenance procedures. Should the decision be based on theoretical grounds or on experience? How does one reconcile those two extremes, and who is to decide? In this article we offer an essentially new solution for maintenance concept choice, based on a significant modification of the widely used failure modes and effects analysis (FMEA) method. This solution is the risk-decision factor (RDF), a result of seven parameters (of different importance and weight) that have the key impact on the process of production and logistic support. The application of this factor is illustrated by the example of planning, organisation and functioning of the maintenance system applied in the Institute for Manufacturing Banknotes and Coins (ZIN) in Belgrade.
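
    The abstract does not list the seven parameters, so the sketch below only illustrates the general idea of aggregating several weighted ratings into a single risk-decision factor, in the spirit of a modified FMEA. The parameter names, scales and weights are hypothetical assumptions.

```python
# Hedged sketch of a weighted risk aggregation: seven parameters of different
# importance combined into one factor. The parameters, weights and 1-10 rating
# scale are assumed for illustration; the article's actual RDF may differ.

PARAMETERS = ["severity", "occurrence", "detection", "downtime_cost",
              "spare_part_availability", "safety_impact", "logistic_effort"]
WEIGHTS    = [0.25, 0.20, 0.15, 0.10, 0.10, 0.15, 0.05]   # assumed importance weights

def risk_decision_factor(scores):
    """scores: dict mapping each parameter to a rating on a 1-10 scale."""
    return sum(w * scores[p] for p, w in zip(PARAMETERS, WEIGHTS))

example = dict(zip(PARAMETERS, [8, 5, 6, 7, 4, 9, 3]))
print(round(risk_decision_factor(example), 2))
```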

  20. The solusphere-its inferences and study

    USGS Publications Warehouse

    Rainwater, F.H.; White, W.F.

    1958-01-01

    Water is a fundamental geologic agent active in rock decomposition, erosion, and synthesis. Solutes in water are of particular interest to geochemists as sources of raw material for synthesis or as products of decomposition. When geochemical studies move from the laboratory into the natural environment, many variables relating to solute hydrology must be considered. As a focal point, a graphical representation of solute hydrology, the solusphere, has been designed, which embodies the concepts of land-water occurrence and movement on which are superimposed geologic, biologic, physical, chemical, and cultural processes affecting solutes. The solusphere is demonstrated by passing an imaginary plane through the centre of the earth. This plane intercepts concentric zones designated as rock flowage, saturation, aeration, surface activity, and atmosphere. Transport processes carry solutes within and between zones without alteration or conversion. However, whether stationary or in motion, the water's solute character is constantly subject to (1) alteration processes that change concentration by addition or subtraction of solutes or solvent without loss of solute identities, and (2) conversion processes that change the chemical state and form of solutes. The geochemist is concerned with specific conversion processes, but he also must consider transport, alteration, and other conversion processes that are continually modifying the materials with which he is dealing in nature. The solusphere is an attempt to organize processes affecting the chemical quality of land waters into a unified field of science much like the field of marine chemistry. © 1958.

  1. Capacity planning for waste management systems: an interval fuzzy robust dynamic programming approach.

    PubMed

    Nie, Xianghui; Huang, Guo H; Li, Yongping

    2009-11-01

    This study integrates the concepts of interval numbers and fuzzy sets into optimization analysis by dynamic programming as a means of accounting for system uncertainty. The developed interval fuzzy robust dynamic programming (IFRDP) model improves upon previous interval dynamic programming methods. It allows highly uncertain information to be effectively communicated into the optimization process through introducing the concept of fuzzy boundary interval and providing an interval-parameter fuzzy robust programming method for an embedded linear programming problem. Consequently, robustness of the optimization process and solution can be enhanced. The modeling approach is applied to a hypothetical problem for the planning of waste-flow allocation and treatment/disposal facility expansion within a municipal solid waste (MSW) management system. Interval solutions for capacity expansion of waste management facilities and relevant waste-flow allocation are generated and interpreted to provide useful decision alternatives. The results indicate that robust and useful solutions can be obtained, and the proposed IFRDP approach is applicable to practical problems that are associated with highly complex and uncertain information.

  2. Meaningful Solutions for the Unemployed or Their Counsellors? The Role of Case Managers' Conceptions of Their Work

    ERIC Educational Resources Information Center

    Värk, Aare; Reino, Anne

    2018-01-01

    This article reports the outcomes of a phenomenographical study of case managers' conceptions of case management work and its influence on the process and performance of the work of counselling the unemployed. A heterogeneous sample of 11 Estonian case managers was selected for in-depth interviews. Analysis of the interviews revealed three…

  3. Discovery Reconceived: Product before Process

    ERIC Educational Resources Information Center

    Abrahamson, Dor

    2012-01-01

    Motivated by the question, "What exactly about a mathematical concept should students discover, when they study it via discovery learning?", I present and demonstrate an interpretation of discovery pedagogy that attempts to address its criticism. My approach hinges on decoupling the solution process from its resultant product. Whereas theories of…

  4. Place as a social space: fields of encounter relating to the local sustainability process.

    PubMed

    Dumreicher, Heidi; Kolb, Bettina

    2008-04-01

    The paper shows how sustainability questions relate to the local space. The local place is not a static entity, but a dynamic one, undergoing constant changes, and it is the rapid social and material processes within the given local situation that are a challenge for the Chinese villages and their integrity. The following article considers the cohesion between the dwellers' emotional co-ownership of their local space and the sustainability process as a driving force in social, economic and ecological development. We bring together the classification of the seven fields of encounter, which were developed out of the empirical data of the Chinese case study villages, and sustainability-oriented management considerations for all levels of this concept. We do not pretend to know the solutions, but describe a set of interrelated fields that can be anchor points for placing the solutions and show in which fields action and intervention are possible. In our concept of sustainability, every spatial field has its special meaning, needs special measures and policies and has different connotations to concepts like responsibility, family values or communication systems. We see the social sustainability process as a support for the empowerment of the local dwellers, and the SUCCESS research has encouraged the villages to find suitable sustainability-oriented solutions for their natural and societal situation. Before entering the discussion about the chances and potential of a sustainability approach for the Chinese villages, it is first necessary to accept the fact that rural villages play a primordial role in Chinese society and that their potential can strengthen future pathways for China.

  5. a New Initiative for Tiling, Stitching and Processing Geospatial Big Data in Distributed Computing Environments

    NASA Astrophysics Data System (ADS)

    Olasz, A.; Nguyen Thai, B.; Kristóf, D.

    2016-06-01

    In recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world still lacks well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These methodologies are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expandability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.
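
    A minimal sketch of the tile-and-process ("divide and conquer") idea described above is given below. It is not IQLib's API; the tile size, the per-tile statistic and the use of Python multiprocessing are illustrative assumptions.

```python
# Toy tiling of a raster into blocks that are processed in parallel.
# Stand-in illustration only; real systems handle georeferencing, stitching
# of results and heterogeneous data types.

import numpy as np
from multiprocessing import Pool

TILE = 256  # assumed tile edge length in pixels

def split_into_tiles(raster):
    """Yield (row, col, tile) blocks covering the raster."""
    rows, cols = raster.shape
    for r in range(0, rows, TILE):
        for c in range(0, cols, TILE):
            yield r, c, raster[r:r + TILE, c:c + TILE]

def process_tile(args):
    r, c, tile = args
    return r, c, tile.mean()          # stand-in for a real per-tile computation

if __name__ == "__main__":
    raster = np.random.rand(1024, 1024)
    with Pool() as pool:
        results = pool.map(process_tile, split_into_tiles(raster))
    print(len(results), "tiles processed")
```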

  6. A continuous quality improvement team approach to adverse drug reaction reporting.

    PubMed

    Flowers, P; Dzierba, S; Baker, O

    1992-07-01

    Crossfunctional teams can generate more new ideas, concepts, and possible solutions than does a department-based process alone. Working collaboratively can increase knowledge of teams using CQI approaches and appropriate tools. CQI produces growth and development at multiple levels resulting from involvement in the process of incremental improvement.

  7. Facilitating an Elementary Engineering Design Process Module

    ERIC Educational Resources Information Center

    Hill-Cunningham, P. Renee; Mott, Michael S.; Hunt, Anna-Blair

    2018-01-01

    STEM education in elementary school is guided by the understanding that engineering represents the application of science and math concepts to make life better for people. The Engineering Design Process (EDP) guides the application of creative solutions to problems. Helping teachers understand how to apply the EDP to create lessons develops a…

  8. Preliminary Feasibility Testing of the BRIC Brine Water Recovery Concept

    NASA Technical Reports Server (NTRS)

    Callahan, Michael R.; Pensinger, Stuart; Pickering, Karen D.

    2011-01-01

    The Brine Residual In-Containment (BRIC) concept was developed as a new technology to recover water from spacecraft wastewater brines. Such capability is considered critical to closing the water loop and achieving a sustained human presence in space. The intention of the BRIC concept is to increase the robustness and efficiency of the dewatering process by performing drying inside the container used for the final disposal of the residual brine solid. Recent efforts in the development of BRIC have focused on preliminary feasibility testing using a laboratory-assembled pre-prototype unit. Observations of the drying behavior of actual brine solutions processed under BRIC-like conditions have been of particular interest. To date, experiments conducted with three types of analogue spacecraft wastewater brines have confirmed the basic premise behind the proposed application of in-place drying for these solutions. Specifically, the dried residual mass from these solutions has tended to exhibit characteristics of adhesion and flow that are expected to continue to challenge process stream management in spacecraft brine dewatering system designs. Yet, these same characteristics may favor the development of capillary- and surface-tension-based approaches envisioned as part of an ultimate microgravity-compatible BRIC design. In addition, preliminary feasibility testing of the BRIC pre-prototype confirmed that high rates of water recovery, up to 98% of the available brine water, may be possible while still removing the majority of the brine contaminants from the influent brine stream. These and other observations from testing are reported.

  9. Electronic Handbooks Simplify Process Management

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Getting a multitude of people to work together to manage processes across many organizations (for example, flight projects, research, technologies, data centers, and others) is not an easy task. Just ask Dr. Barry E. Jacobs, a research computer scientist at Goddard Space Flight Center. He helped NASA develop a process management solution that provided documenting tools for process developers and participants to help them quickly learn, adapt, test, and teach their views. Some of these tools included editable files for subprocess descriptions, document descriptions, role guidelines, manager worksheets, and references. First utilized for NASA's Headquarters Directives Management process, the approach led to the invention of a concept called the Electronic Handbook (EHB). This EHB concept was successfully applied to NASA's Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs, among other NASA programs. Several Federal agencies showed interest in the concept, so Jacobs and his team visited these agencies to show them how their specific processes could be managed by the methodology, as well as to create mockup versions of the EHBs.

  10. Use of cloud computing in biomedicine.

    PubMed

    Sobeslav, Vladimir; Maresova, Petra; Krejcar, Ondrej; Franca, Tanos C C; Kuca, Kamil

    2016-12-01

    Nowadays, biomedicine is characterised by a growing need for processing of large amounts of data in real time. This leads to new requirements for information and communication technologies (ICT). Cloud computing offers a solution to these requirements and provides many advantages, such as cost savings, elasticity and scalability of using ICT. The aim of this paper is to explore the concept of cloud computing and the related use of this concept in the area of biomedicine. Authors offer a comprehensive analysis of the implementation of the cloud computing approach in biomedical research, decomposed into infrastructure, platform and service layer, and a recommendation for processing large amounts of data in biomedicine. Firstly, the paper describes the appropriate forms and technological solutions of cloud computing. Secondly, the high-end computing paradigm of cloud computing aspects is analysed. Finally, the potential and current use of applications in scientific research of this technology in biomedicine is discussed.

  11. Development of inquiry behavior in concept identification.

    PubMed

    Vassilopoulos, C A; Dickerson, D J

    1992-08-01

    We studied inquiry behavior in concept identification in first-, fifth-, eighth-grade, and college students with problems involving eight four-letter strings. The task was to identify the correct string by asking questions related to either one letter or four letters that were answered by yes or no. Processing demands were manipulated by comparing (a) a condition in which letter strings were removed from view as feedback eliminated them as possible solutions with a condition in which strings remained in view and (b) problems that were structured so that relevant letter categories were easy to identify with problems that were not. Problem solving generally improved with age. First graders tended to ask questions that eliminated solutions one by one, whereas the older groups asked more informative questions. At the three upper grade levels, strategies for selecting queries were adapted to situations, with less demanding strategies being used when processing demands were higher.
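
    The contrast between the two questioning strategies reported above can be made concrete with a little arithmetic (an illustration with generic candidates, not the study's letter strings): eliminating candidates one at a time needs up to 7 questions for 8 candidates, while questions that halve the remaining set need only log2(8) = 3.

```python
# Worst-case number of yes/no questions needed to identify one of n candidates,
# for the two strategies described above. Purely illustrative; the study's
# actual stimuli were eight four-letter strings.

import math

def questions_one_by_one(n, target):
    # Ask "is it candidate i?" in order; stop at a "yes", or when only one
    # candidate remains after n-1 "no" answers.
    return min(target + 1, n - 1)

def questions_halving(n):
    # Each informative question rules out half of the remaining candidates.
    return math.ceil(math.log2(n))

n = 8
print(max(questions_one_by_one(n, t) for t in range(n)))  # 7
print(questions_halving(n))                               # 3
```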

  12. RadWorks Storm Shelter Design for Solar Particle Event Shielding

    NASA Technical Reports Server (NTRS)

    Simon, Matthew A.; Cerro, Jeffrey; Clowdsley, Martha

    2013-01-01

    In order to enable long-duration human exploration beyond low-Earth orbit, the risks associated with exposure of astronaut crews to space radiation must be mitigated with practical and affordable solutions. The space radiation environment beyond the magnetosphere is primarily a combination of two types of radiation: galactic cosmic rays (GCR) and solar particle events (SPE). While mitigating GCR exposure remains an open issue, reducing astronaut exposure to SPEs is achievable through material shielding because they are made up primarily of medium-energy protons. In order to ensure astronaut safety for long durations beyond low-Earth orbit, SPE radiation exposure must be mitigated. However, the increasingly demanding spacecraft propulsive performance for these ambitious missions requires minimal mass and volume radiation shielding solutions which leverage available multi-functional habitat structures and logistics as much as possible. This paper describes the efforts of NASA's RadWorks Advanced Exploration Systems (AES) Project to design minimal mass SPE radiation shelter concepts leveraging available resources. Discussion items include a description of the shelter trade space, the prioritization process used to identify the four primary shelter concepts chosen for maturation, a summary of each concept's design features, a description of the radiation analysis process, and an assessment of the parasitic mass of each concept.

  13. A Review of Solution Chemistry Studies: Insights into Students' Conceptions

    ERIC Educational Resources Information Center

    Çalık, Muammer; Ayas, Alipaşa; Ebenezer, Jazlin V.

    2005-01-01

    This study has reviewed the last two decades of student conception research in solution chemistry pertaining to aims, methods of exploring students' conception, general knowledge claims, students' conceptions and difficulties, and conceptual change studies. The aims of solution chemistry studies have been to assess students' understanding level of…

  14. MHD processes in the outer heliosphere

    NASA Technical Reports Server (NTRS)

    Burlaga, L. F.

    1984-01-01

    The magnetic field measurements from Voyager and the magnetohydrodynamic (MHD) processes in the outer heliosphere are reviewed. A bibliography of the experimental and theoretical work concerning magnetic fields and plasmas observed in the outer heliosphere is given. Emphasis in this review is on basic concepts and dynamical processes involving the magnetic field. The theory that serves to explain and unify the interplanetary magnetic field and plasma observations is magnetohydrodynamics. Basic physical processes and observations that relate directly to solutions of the MHD equations are emphasized, but obtaining solutions of this complex system of equations involves various assumptions and approximations. The spatial and temporal complexity of the outer heliosphere and some approaches for dealing with this complexity are discussed.

  15. A heat transfer model for a hot helium airship

    NASA Astrophysics Data System (ADS)

    Rapert, R. M.

    1987-06-01

    Basic heat transfer empirical and analytic equations are applied to a double envelope airship concept which uses heated Helium in the inner envelope to augment and control gross lift. The convective and conductive terms lead to a linear system of five equations for the concept airship, with the nonlinear radiation terms included by an iterative solution process. The graphed results from FORTRAN program solutions are presented for the variables of interest. These indicate that a simple use of airship engine exhaust heat gives more than a 30 percent increase in gross airship lift. Possibly more than 100 percent increase can be achieved if a 'stream injection' heating system, with associated design problems, is used.
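
    The solution strategy described (a linear system for the convective and conductive terms, with the nonlinear radiation terms handled iteratively) can be sketched as below. The 3x3 matrix, source terms and radiation coefficients are arbitrary illustrative values, not the airship model's five equations.

```python
# Hedged numerical sketch: solve a small linear system whose right-hand side
# contains nonlinear (T^4) radiation terms, via fixed-point iteration.

import numpy as np

SIGMA = 5.67e-8                         # Stefan-Boltzmann constant, W m^-2 K^-4
A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  3.0, -1.0],
              [ 0.0, -1.0,  2.0]])      # assumed conduction/convection coefficients
b0 = np.array([600.0, 400.0, 300.0])    # assumed constant source terms
rad = np.array([0.002, 0.001, 0.001])   # assumed emissivity-area factors

T = np.full(3, 300.0)                   # initial temperature guess, K
for _ in range(100):
    b = b0 - rad * SIGMA * T**4         # update radiation terms with current T
    T_new = np.linalg.solve(A, b)       # re-solve the linear part
    if np.max(np.abs(T_new - T)) < 1e-6:
        break
    T = T_new
print(np.round(T, 2))
```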

  16. Solid-solution CrCoCuFeNi high-entropy alloy thin films synthesized by sputter deposition

    DOE PAGES

    An, Zhinan; Jia, Haoling; Wu, Yueying; ...

    2015-05-04

    The concept of high configurational entropy requires that the high-entropy alloys (HEAs) yield single-phase solid solutions. However, phase separations are quite common in bulk HEAs. A five-element alloy, CrCoCuFeNi, was deposited via radio frequency magnetron sputtering and confirmed to be a single-phase solid solution through the high-energy synchrotron X-ray diffraction, energy-dispersive spectroscopy, wavelength-dispersive spectroscopy, and transmission electron microscopy. The formation of the solid-solution phase is presumed to be due to the high cooling rate of the sputter-deposition process.

  17. Hydrated Cations in the General Chemistry Course.

    ERIC Educational Resources Information Center

    Kauffman, George B.; Baxter, John F., Jr.

    1981-01-01

    Presents selected information regarding the descriptive chemistry of the common metal ions and their compounds, including the concepts of process of solution, polar molecules, ionic size and charge, complex ions, coordination number, and the Bronsted-Lowry acid-base theory. (CS)

  18. A study on the indirect urea dosing method in the Selective Catalytic Reduction system

    NASA Astrophysics Data System (ADS)

    Brzeżański, M.; Sala, R.

    2016-09-01

    This article presents the results of studies on a concept solution for dosing urea in the gas phase in a selective catalytic reduction system. The idea of the concept was to heat up and evaporate the urea-water solution before introducing it into the exhaust gas stream. The aim was to enhance the processes of urea converting into ammonia, which is the target reductant for nitrogen oxides treatment. The study was conducted on a medium-duty Euro 5 diesel engine with an exhaust line consisting of a DOC catalyst, a DPF filter and an SCR system with a changeable setup allowing the urea to be dosed either in the liquid phase (regular solution) or in the gas phase (concept solution). The main criterion was to assess the effect of the physical state of the dosed urea on the NOx conversion ratio in the SCR catalyst. In order to compare both urea dosing methods, a special test procedure was developed which consisted of six test steps covering a wide temperature range of exhaust gas generated at steady-state engine operating conditions. Tests were conducted for different urea dosing quantities defined by the equivalence ratio. Based on the obtained results, a remarkable improvement in NOx reduction was found for the gaseous urea application in comparison to the standard liquid urea dosing. Measured results indicate a high potential to increase the efficiency of the SCR catalyst by using gas-phase urea and provide the basis for further scientific research on this type of concept.

  19. Mechanochemical Energy Conversion

    ERIC Educational Resources Information Center

    Pines, E.; And Others

    1973-01-01

    Summarizes the thermodynamics of macromolecular systems, including theories and experiments of cyclic energy conversion with rubber and collagen as working substances. Indicates that an early introduction into the concept of chemical potential and solution thermodynamics is made possible through the study of the cyclic processes. (CC)

  20. Sustainable solutions for solid waste management in Southeast Asian countries.

    PubMed

    Ngoc, Uyen Nguyen; Schnitzer, Hans

    2009-06-01

    Human activities generate waste and the amounts tend to increase as the demand for quality of life increases. Today's waste generation rate in the Association of Southeast Asian Nations (ASEAN) countries is alarming, posing a challenge to governments regarding environmental pollution in recent years. The expectation is that eventually waste treatment and waste prevention approaches will develop towards sustainable waste management solutions. This expectation is for instance reflected in the term 'zero emission systems'. The concept of zero emissions can be applied successfully with today's technical possibilities in the agro-based processing industry. First, the state-of-the-art of waste management in Southeast Asian countries will be outlined in this paper, followed by waste generation rates, sources, and composition, as well as future trends of waste. Further on, solutions for solid waste management will be reviewed in the discussions of sustainable waste management. The paper emphasizes the concept of waste prevention through utilization of all wastes as process inputs, leading to the possibility of creating an ecosystem in a loop of materials. Also, a case study focusing on the citrus processing industry is presented to illustrate the application of the aggregated material input-output model in a widespread processing industry in ASEAN. The model can be shown as a closed cluster, which permits the identification of opportunities for reducing environmental impacts at the process level in the food processing industry. Throughout the discussion in this paper, the utilization of renewable energy and economic aspects are considered in order to address environmental and economic issues and the aim of eco-efficiency. Additionally, the opportunities and constraints of waste management will be discussed.

  1. Morphological control of inter-penetrating polymer networks

    NASA Technical Reports Server (NTRS)

    Hansen, Marion

    1989-01-01

    Synthetic organic polymer chemistry has been successful in producing compositions of matter with thermal oxidation stability and progressively higher glass transition temperatures. In part, this was done by increasing the steric hindrance of moieties in the chain of a macromolecule. The resulting polymers are usually quite insoluble and produce molten polymers of very high viscosities. These types of polymers are not easily processed into graphite fiber prepregs by melt or solution impregnation methods. Hence, a technological need exists to produce new knowledge of how to produce polymer-fiber composites from this class of polymers. The concept of freeze drying amic-acid prepolymers with a reactive thermoplastic was proposed as a research topic for the ASEE/NASA Summer Faculty Program of 1989 as a means of producing polymer-fiber composites. This process scheme has the thermodynamic attribute that the magnitude of phase separation due to differences in solubility of two organic constituents in solution will be greatly reduced by removing the solvent not by evaporation but by sublimation. Progress to date on evaluating this polymer processing concept is briefly outlined.

  2. Learning basic programming using CLIS through gamification

    NASA Astrophysics Data System (ADS)

    Prabawa, H. W.; Sutarno, H.; Kusnendar, J.; Rahmah, F.

    2018-05-01

    The difficulty of understanding programming concepts is a major problem in basic programming lessons. Based on the results of preliminary studies, 60% of students found the learning process monotonous because of the limited number of media. The Children Learning in Science (CLIS) method was chosen as a solution because CLIS allows students' initial knowledge to be developed into conceptual knowledge. Technological involvement in CLIS (gamification) helped students to understand basic programming concepts. This research developed a learning medium using the CLIS method with gamification elements to increase the excitement of the learning process. The research found that the multimedia was considered good by students, especially regarding its mechanical aspects, its multimedia elements and its information structure. Multimedia gamification learning with the CLIS model showed an increase in students' concept understanding.

  3. A Fault-Tolerant Radiation-Robust Mass Storage Concept for Highly Scaled Flash Memory

    NASA Astrophysics Data System (ADS)

    Fuchs, Cristian M.; Trinitis, Carsten; Appel, Nicolas; Langer, Martin

    2015-09-01

    Future space missions will require vast amounts of data to be stored and processed aboard spacecraft. While satisfying operational mission requirements, storage systems must guarantee data integrity and recover damaged data throughout the mission. NAND-flash memories have become popular for space-borne high-performance mass memory scenarios, though future storage concepts will rely upon highly scaled flash or other memory technologies. With modern flash memory, single-bit erasure coding and RAID-based concepts are insufficient. Thus, a fully run-time configurable, high-performance, dependable storage concept is needed, one requiring only a minimal set of logic or software. The solution is based on composite erasure coding and can be adjusted for altered mission duration or changing environmental conditions.

  4. Conception of the first magnetic resonance imaging contrast agents: a brief history.

    PubMed

    de Haën, C

    2001-08-01

    About 20 years ago, a technological innovation process started that eventually led to the affirmation of magnetic resonance imaging (MRI) contrast agents as medical diagnostic tools, which are used today in about 25% of all MRI procedures. The process began with exploration of various technical possibilities and the conception in the years 1981 to 1982 of two types of agents (soluble paramagnetic chelates and protection colloid-stabilized colloidal particle solutions of magnetite) that eventually found embodiments in commercially available products. The pioneering products that eventually reached the market were gadopentetate dimeglumine (Magnevist, Schering AG) and the ferumoxides (Endorem, Guerbet SA; or Ferridex, Berlex Laboratories Inc.). The history of the conception phase of the technology is reconstructed here, focusing on the social dynamics rather than on technological aspects. In the period 1981 to 1982, a number of independent inventors from industry and academia conceived of water-soluble paramagnetic chelates and protection colloid-stabilized colloidal solutions of small particles of magnetite, both of acceptable tolerability, as contrast agents for MRI. Priorities on patents conditioned the further course of events. The analyzed history helps in understanding the typical roles of different institutions in technological innovation. The foundation of MRI contrast agent technology in basic science clearly was laid in academia. During the conception of practical products, industry assumed a dominant role. Beginning with the radiological evaluation of candidate products, the collaboration between industry and academia became essential.

  5. Requirements for the structured recording of surgical device data in the digital operating room.

    PubMed

    Rockstroh, Max; Franke, Stefan; Neumuth, Thomas

    2014-01-01

    Due to the increasing complexity of the surgical working environment, increasingly technical solutions must be found to help relieve the surgeon. This objective is supported by a structured storage concept for all relevant device data. In this work, we present a concept and prototype development of a storage system to address intraoperative medical data. The requirements of such a system are described, and solutions for data transfer, processing, and storage are presented. In a subsequent study, a prototype based on the presented concept is tested for correct and complete data transmission and storage and for the ability to record a complete neurosurgical intervention with low processing latencies. In the final section, several applications for the presented data recorder are shown. The developed system based on the presented concept is able to store the generated data correctly, completely, and quickly enough even if much more data than expected are sent during a surgical intervention. The Surgical Data Recorder supports automatic recognition of the interventional situation by providing a centralized data storage and access interface to the OR communication bus. In the future, further data acquisition technologies should be integrated. Therefore, additional interfaces must be developed. The data generated by these devices and technologies should also be stored in or referenced by the Surgical Data Recorder to support the analysis of the OR situation.

  6. Reaction paths and equilibrium end-points in solid-solution aqueous-solution systems

    USGS Publications Warehouse

    Glynn, P.D.; Reardon, E.J.; Plummer, Niel; Busenberg, E.

    1990-01-01

    Equations are presented describing equilibrium in binary solid-solution aqueous-solution (SSAS) systems after a dissolution, precipitation, or recrystallization process, as a function of the composition and relative proportion of the initial phases. Equilibrium phase diagrams incorporating the concept of stoichiometric saturation are used to interpret possible reaction paths and to demonstrate relations between stoichiometric saturation, primary saturation, and thermodynamic equilibrium states. The concept of stoichiometric saturation is found useful in interpreting and putting limits on dissolution pathways, but there currently is no basis for possible application of this concept to the prediction and/or understanding of precipitation processes. Previously published dissolution experiments for (Ba, Sr)SO4 and (Sr, Ca)CO3orth. solids are interpreted using equilibrium phase diagrams. These studies show that stoichiometric saturation can control, or at least influence, initial congruent dissolution pathways. The results for (Sr, Ca)CO3orth. solids reveal that stoichiometric saturation can also control the initial stages of incongruent dissolution, despite the intrinsic instability of some of the initial solids. In contrast, recrystallisation experiments in the highly soluble KCl-KBr-H2O system demonstrate equilibrium. The excess free energy of mixing calculated for K(Cl, Br) solids is closely modeled by the relation GE = χKBr χKCl RT[a0 + a1(2χKBr − 1)], where a0 is 1.40 ± 0.02, a1 is −0.08 ± 0.03 at 25°C, and χKBr and χKCl are the mole fractions of KBr and KCl in the solids. The phase diagram constructed using this fit reveals an alyotropic maximum located at χKBr = 0.676 and at a total solubility product, ΣΠ = [K+]([Cl−] + [Br−]) = 15.35. © 1990.
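
    Using the fitted coefficients quoted above, the excess free energy of mixing can be evaluated directly (a short illustration; the composition grid is arbitrary and the mole fraction χ is written as x in the code):

```python
# Evaluate G_E = x_KBr * x_KCl * R * T * [a0 + a1*(2*x_KBr - 1)] with the
# reported a0 = 1.40 and a1 = -0.08 at 25 C. Illustrative only.

import numpy as np

R, T = 8.314, 298.15          # J mol^-1 K^-1, K
a0, a1 = 1.40, -0.08

def excess_g(x_kbr):
    x_kcl = 1.0 - x_kbr
    return x_kbr * x_kcl * R * T * (a0 + a1 * (2.0 * x_kbr - 1.0))

for x in np.linspace(0.1, 0.9, 5):
    print(f"x_KBr = {x:.2f}  G_E = {excess_g(x):.0f} J/mol")
```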

  7. EGSIEM: Combination of GRACE monthly gravity models on normal equation level

    NASA Astrophysics Data System (ADS)

    Meyer, Ulrich; Jean, Yoomin; Jäggi, Adrian; Mayer-Gürr, Torsten; Neumayer, Hans; Lemoine, Jean-Michel

    2016-04-01

    One of the three geodetic services to be realized in the frame of the EGSIEM project is a scientific combination service. Each associated processing center (AC) will follow a set of common processing standards but will apply its own, independent analysis method. Therefore the quality, robustness and reliability of the combined monthly gravity fields are expected to improve significantly compared to the individual solutions. The monthly GRACE gravity fields of all ACs are combined on the normal equation level. The individual normal equations are weighted depending on pairwise comparisons of the individual gravity field solutions. To derive these weights and for quality control of the individual contributions, a combination of the monthly gravity fields on the solution level is performed first. The concept of weighting and of the combination on the normal equation level is introduced, and the formats used for normal equation exchange and gravity field solutions are described. First results of the combination on the normal equation level are presented and compared to the corresponding combinations on the solution level. EGSIEM has an open data policy and all processing centers of GRACE gravity fields are invited to participate in the combination.
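
    Combination on the normal equation level amounts to a weighted sum of the individual systems before solving, as in the small numpy sketch below. The toy observation equations and weights are assumptions for illustration; the actual EGSIEM combination involves full gravity-field normal equations and its own weighting scheme.

```python
# Weighted combination of least-squares normal equations from several
# simulated "analysis centers", then a single solve of the combined system.

import numpy as np

rng = np.random.default_rng(1)
x_true = np.array([2.0, -1.0, 0.5])

def normal_equations(n_obs, noise):
    """Build N = A^T A and b = A^T y for one simulated analysis center."""
    A = rng.normal(size=(n_obs, 3))
    y = A @ x_true + rng.normal(scale=noise, size=n_obs)
    return A.T @ A, A.T @ y

centers = [normal_equations(50, 0.1), normal_equations(80, 0.3), normal_equations(60, 0.2)]
weights = [1.0, 0.3, 0.6]    # assumed relative weights from solution comparisons

N_comb = sum(w * N for w, (N, b) in zip(weights, centers))
b_comb = sum(w * b for w, (N, b) in zip(weights, centers))
x_comb = np.linalg.solve(N_comb, b_comb)
print(np.round(x_comb, 3))   # close to x_true
```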

  8. The usability axiom of medical information systems.

    PubMed

    Pantazi, Stefan V; Kushniruk, Andre; Moehr, Jochen R

    2006-12-01

    In this article we begin by connecting the concept of simplicity of user interfaces of information systems with that of usability, and the concept of complexity of the problem-solving in information systems with the concept of usefulness. We continue by stating "the usability axiom" of medical information technology: information systems must be, at the same time, usable and useful. We then try to show why, given existing technology, the axiom is a paradox and we continue with analysing and reformulating it several times, from more fundamental information processing perspectives. We underline the importance of the concept of representation and demonstrate the need for context-dependent representations. By means of thought experiments and examples, we advocate the need for context-dependent information processing and argue for the relevance of algorithmic information theory and case-based reasoning in this context. Further, we introduce the notion of concept spaces and offer a pragmatic perspective on context-dependent representations. We conclude that the efficient management of concept spaces may help with the solution to the medical information technology paradox. Finally, we propose a view of informatics centred on the concepts of context-dependent information processing and management of concept spaces that aligns well with existing knowledge centric definitions of informatics in general and medical informatics in particular. In effect, our view extends M. Musen's proposal and proposes a definition of Medical Informatics as context-dependent medical information processing. The axiom that medical information systems must be, at the same time, useful and usable, is a paradox and its investigation by means of examples and thought experiments leads to the recognition of the crucial importance of context-dependent information processing. On the premise that context-dependent information processing equates to knowledge processing, this view defines Medical Informatics as a context-dependent medical information processing which aligns well with existing knowledge centric definitions of our field.

  9. Desired Precision in Multi-Objective Optimization: Epsilon Archiving or Rounding Objectives?

    NASA Astrophysics Data System (ADS)

    Asadzadeh, M.; Sahraei, S.

    2016-12-01

    Multi-objective optimization (MO) aids in supporting the decision-making process in water resources engineering and design problems. One of the main goals of solving a MO problem is to archive a set of solutions that is well distributed across a wide range of all the design objectives. Modern MO algorithms use the epsilon dominance concept to define a mesh with a pre-defined grid-cell size (often called epsilon) in the objective space and archive at most one solution in each grid cell. Epsilon can be set to the desired precision level of each objective function to make sure that the difference between each pair of archived solutions is meaningful. This epsilon archiving process is computationally expensive in problems that have quick-to-evaluate objective functions. This research explores the applicability of a similar but computationally more efficient approach to respect the desired precision level of all objectives in the solution archiving process. In this alternative approach, each objective function is rounded to the desired precision level before comparing any new solution to the set of archived solutions, which already have rounded objective function values. This alternative solution archiving approach is compared to the epsilon archiving approach in terms of efficiency and quality of archived solutions for solving mathematical test problems and hydrologic model calibration problems.
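
    A minimal sketch of the rounding-based alternative described above (the archive structure and dominance test are illustrative assumptions, not the authors' code): objectives are rounded to the desired precision before the usual non-dominated archiving, so that two solutions differing by less than the precision level collapse onto the same archived point.

    ```python
    def dominates(a, b):
        """True if objective vector a Pareto-dominates b (minimization)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def archive_with_rounding(candidates, precision):
        """Keep a non-dominated archive of objective vectors rounded to the
        desired precision of each objective (given as decimal places)."""
        archive = []
        for objs in candidates:
            r = tuple(round(o, p) for o, p in zip(objs, precision))
            if any(dominates(a, r) or a == r for a in archive):
                continue                      # dominated by (or identical to) an archived point
            archive = [a for a in archive if not dominates(r, a)]
            archive.append(r)
        return archive

    candidates = [(0.123, 4.56), (0.121, 4.58), (0.30, 3.90), (0.40, 3.80)]
    print(archive_with_rounding(candidates, precision=(2, 1)))
    ```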

  10. Conversations With the Community.

    ERIC Educational Resources Information Center

    Urschel, Jane W.

    1998-01-01

    Public deliberation is a little-used concept that gets people talking about education and working together to improve it. Study circles discuss each solution's pros and cons, explore people's deeper motivations, weigh others' views carefully, work through conflicting emotions, and identify common ground. Pueblo, Colorado's process is profiled.…

  11. Stability of Mixed-Strategy-Based Iterative Logit Quantal Response Dynamics in Game Theory

    PubMed Central

    Zhuang, Qian; Di, Zengru; Wu, Jinshan

    2014-01-01

    Using the logit quantal response form as the response function in each step, the original definition of the static quantal response equilibrium (QRE) is extended into an iterative evolution process. QREs remain the fixed points of the dynamic process. However, depending on whether such fixed points are the long-term solutions of the dynamic process, they can be classified into stable (SQREs) and unstable (USQREs) equilibria. This extension resembles the extension from static Nash equilibria (NEs) to evolutionarily stable solutions in the framework of evolutionary game theory. The relation between SQREs and other solution concepts of games, including NEs and QREs, is discussed. Using experimental data from other published papers, we perform a preliminary comparison between SQREs, NEs, QREs and the observed behavioral outcomes of those experiments. For certain games, we find that SQREs have better predictive power than QREs and NEs. PMID:25157502
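
    A sketch of the iterative logit dynamics described above for a two-player matrix game (the payoff matrices and the response precision lambda are illustrative assumptions): at each step every player responds stochastically to the opponent's current mixed strategy via the logit form; a fixed point of the map is a QRE, and whether the iteration converges to it indicates stability in the sense discussed.

    ```python
    import numpy as np

    def logit_response(payoff, opponent_mix, lam):
        """Logit quantal response to the opponent's mixed strategy."""
        u = payoff @ opponent_mix                 # expected payoff of each pure strategy
        w = np.exp(lam * (u - u.max()))           # subtract max for numerical stability
        return w / w.sum()

    def iterate_logit_qre(A, B, lam, steps=500, tol=1e-10):
        """Iterate the simultaneous logit response map for payoff matrices
        A (row player) and B (column player); report whether it settled."""
        p = np.full(A.shape[0], 1.0 / A.shape[0])   # row player's mix
        q = np.full(A.shape[1], 1.0 / A.shape[1])   # column player's mix
        for _ in range(steps):
            p_new = logit_response(A, q, lam)
            q_new = logit_response(B.T, p, lam)
            if max(np.abs(p_new - p).max(), np.abs(q_new - q).max()) < tol:
                return p_new, q_new, True           # converged: stable fixed point
            p, q = p_new, q_new
        return p, q, False                          # did not settle within `steps`

    # Matching pennies: the symmetric QRE is (0.5, 0.5) for every lambda.
    A = np.array([[1.0, -1.0], [-1.0, 1.0]])
    print(iterate_logit_qre(A, -A, lam=0.5))   # converges (stable in this sketch)
    print(iterate_logit_qre(A, -A, lam=2.0))   # fails to settle (unstable here)
    ```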

  12. Towards an ontological representation of morbidity and mortality in Description Logics.

    PubMed

    Santana, Filipe; Freitas, Fred; Fernandes, Roberta; Medeiros, Zulma; Schober, Daniel

    2012-09-21

    Despite the high coverage of biomedical ontologies, very few sound definitions of death can be found. Nevertheless, this concept has its relevance in epidemiology, such as for data integration within mortality notification systems. We here introduce an ontological representation of the complex biological qualities and processes that inhere in organisms transitioning from life to death. We further characterize them by causal processes and their temporal borders. Several representational difficulties were faced, mainly regarding kinds of processes with blurred or fiat borders that change their type in a continuous rather than discrete mode. Examples of such hard-to-grasp concepts are life, death and its relationships with injuries and diseases. We illustrate an iterative optimization of definitions within four versions of the ontology, so as to stress the typical problems encountered in representing complex biological processes. We point out possible solutions for representing concepts related to biological life cycles while preserving the identity of participating individuals, i.e. for a patient in transition from life to death. This solution, however, required the use of extended description logics not yet supported by tools. We also focus on the interdependencies between parts and the need to change further parts if one part is changed. The axiomatic definition of mortality we introduce allows the description of biological processes related to the transition from healthy to diseased or injured, and up to a final death state. Exploiting such definitions embedded into descriptions of pathogen transmissions by arthropod vectors, the complete sequence of infection and disease processes can be described, starting from the inoculation of a pathogen by a vector until the death of an individual, preserving the identity of the patient.

  13. Design of an airborne lidar for stratospheric aerosol measurements

    NASA Technical Reports Server (NTRS)

    Evans, W. E.

    1977-01-01

    A modular, multiple-telescope receiving concept is developed to gain a relatively large receiver collection aperture without requiring extensive modifications to the aircraft. This concept, together with the choice of a specific photodetector, signal processing, and data recording system capable of maintaining approximately 1% precision over the required large signal amplitude range, is found to be common to all of the options. It is recommended that development of the lidar begin by more detailed definition of solutions to these important common signal detection and recording problems.

  14. Picoliter Drop-On-Demand Dispensing for Multiplex Liquid Cell Transmission Electron Microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patterson, Joseph P.; Parent, Lucas R.; Cantlon, Joshua

    2016-05-03

    Liquid cell transmission electron microscopy (LCTEM) provides a unique insight into the dynamics of nanomaterials in solution. Controlling the addition of multiple solutions to the liquid cell remains a key hurdle in our ability to increase throughput and to study processes dependent on solution mixing, including chemical reactions. Here, we report that a piezo dispensing technique allows for mixing of multiple solutions directly within the viewing area. This technique permits deposition of 50 pL droplets of various aqueous solutions onto the liquid cell window, before assembly of the cell, in a fully controlled manner. This proof-of-concept study highlights the great potential of picoliter dispensing in combination with LCTEM for observing nanoparticle mixing in the solution phase and the creation of chemical gradients.

  15. Design Tool Using a New Optimization Method Based on a Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    Conventional optimization methods are based on a deterministic approach, since their purpose is to find an exact solution. However, such methods suffer from initial-condition dependence and the risk of falling into local solutions. In this paper, we propose a new optimization method based on the concept of path integrals used in quantum mechanics. The method obtains a solution as an expected value (stochastic average) using a stochastic process. The advantages of this method are that it is not affected by initial conditions and does not require techniques based on experience. We applied the new optimization method to a hang glider design. In this problem, both the hang glider design and its flight trajectory were optimized. The numerical calculation results show that the performance of the method is sufficient for practical use.
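
    The stochastic-average idea can be illustrated with a small sketch (a generic Boltzmann-weighted sampling scheme under assumed parameter names; the authors' path-integral formulation is more elaborate than this): candidate designs are generated by a random process, weighted by exp(-cost/T), and the estimate of the optimum is the weighted (stochastic) average rather than a single deterministic iterate.

    ```python
    import numpy as np

    def stochastic_average_optimize(f, x0, sigma=1.0, temperature=0.1,
                                    n_samples=20000, seed=0):
        """Estimate a minimiser of f as a Boltzmann-weighted (stochastic) average
        of randomly sampled candidates, instead of following a deterministic path."""
        rng = np.random.default_rng(seed)
        x0 = np.asarray(x0, dtype=float)
        samples = x0 + sigma * rng.normal(size=(n_samples, x0.size))
        costs = np.array([f(x) for x in samples])
        weights = np.exp(-(costs - costs.min()) / temperature)   # stabilised Boltzmann weights
        return (weights[:, None] * samples).sum(axis=0) / weights.sum()

    # Toy rugged objective whose global minimum lies near the origin; the
    # weighted average concentrates on the low-cost region even though the
    # sampling is centred on a different starting point.
    def cost(x):
        return np.dot(x, x) + 0.3 * np.sin(5 * x[0]) * np.sin(5 * x[1])

    print(stochastic_average_optimize(cost, x0=[1.0, 1.0]))
    ```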

  16. Tail mean and related robust solution concepts

    NASA Astrophysics Data System (ADS)

    Ogryczak, Włodzimierz

    2014-01-01

    Robust optimisation might be viewed as a multicriteria optimisation problem where the objectives correspond to the scenarios, although their probabilities are unknown or imprecise. The simplest robust solution concept represents a conservative approach focused on optimisation of the worst-case scenario results. A softer concept allows one to optimise the tail mean, thus combining performances under multiple worst scenarios. We show that when considering robust models that allow the probabilities to vary only within given intervals, the tail mean represents the robust solution only for upper-bounded probabilities. For arbitrary intervals of probabilities the corresponding robust solution may be expressed by the optimisation of appropriately combined mean and tail mean criteria, thus remaining easily implementable with auxiliary linear inequalities. Moreover, we use the tail mean concept to develop linear programming implementable robust solution concepts related to risk-averse optimisation criteria.
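
    The remark that the tail mean remains implementable with auxiliary linear inequalities can be illustrated with a small LP sketch (solver choice and variable names are assumptions, not the paper's formulation): for equiprobable scenario costs y_1..y_m, the mean of the worst beta-fraction of outcomes equals min over t of t + (1/(beta*m)) * sum_i max(y_i - t, 0), which linearises with nonnegative auxiliary variables d_i >= y_i - t.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def tail_mean(y, beta):
        """Mean of the worst (largest) beta-fraction of equiprobable costs y,
        via the standard LP linearisation with auxiliaries d_i >= y_i - t."""
        y = np.asarray(y, dtype=float)
        m = y.size
        # decision vector: [t, d_1, ..., d_m]
        c = np.concatenate(([1.0], np.full(m, 1.0 / (beta * m))))
        A_ub = np.hstack((-np.ones((m, 1)), -np.eye(m)))   # -t - d_i <= -y_i
        b_ub = -y
        bounds = [(None, None)] + [(0.0, None)] * m
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        return res.fun

    y = [3.0, 7.0, 5.0, 9.0, 4.0, 6.0, 8.0, 2.0]
    print(tail_mean(y, beta=0.25))      # worst 25% of 8 scenarios: mean of {9, 8} = 8.5
    print(np.mean(sorted(y)[-2:]))      # direct check by sorting
    ```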

  17. Silicon Cations Intermixed Indium Zinc Oxide Interface for High-Performance Thin-Film Transistors Using a Solution Process.

    PubMed

    Na, Jae Won; Rim, You Seung; Kim, Hee Jun; Lee, Jin Hyeok; Hong, Seonghwan; Kim, Hyun Jae

    2017-09-06

    Solution-processed amorphous metal-oxide thin-film transistors (TFTs) utilizing an intermixed interface between a metal-oxide semiconductor and a dielectric layer are proposed. In-depth physical characterizations are carried out to verify the existence of the intermixed interface that is inevitably formed by interdiffusion of cations originating from a thermal process. In particular, when an indium zinc oxide (IZO) semiconductor and a silicon dioxide (SiO2) dielectric layer are in contact and thermally processed, a Si4+ intermixed IZO (Si/IZO) interface is created. On the basis of this concept, a high-performance Si/IZO TFT having both a field-effect mobility exceeding 10 cm^2 V^-1 s^-1 and an on/off current ratio over 10^7 is successfully demonstrated.

  18. Design Environment for Novel Vertical Lift Vehicles: DELIVER

    NASA Technical Reports Server (NTRS)

    Theodore, Colin

    2016-01-01

    This is a 20-minute presentation discussing the DELIVER vision. DELIVER is part of the ARMD Transformative Aeronautics Concepts Program, particularly the Convergent Aeronautics Solutions Project. The presentation covers the DELIVER vision, transforming markets, the conceptual design process, challenges addressed, technical content, and FY2016 key activities.

  19. Sustainable solutions for solid waste management in Southeast Asian countries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uyen Nguyen Ngoc; Schnitzer, Hans

    2009-06-15

    Human activities generate waste, and the amounts tend to increase as the demand for quality of life increases. Today's waste generation rate in the Southeast Asian nations (ASEAN) is alarming, posing a challenge to governments regarding environmental pollution in recent years. The expectation is that eventually waste treatment and waste prevention approaches will develop towards sustainable waste management solutions. This expectation is, for instance, reflected in the term 'zero emission systems'. The concept of zero emissions can be applied successfully with today's technical possibilities in the agro-based processing industry. First, the state of the art of waste management in Southeast Asian countries will be outlined in this paper, followed by waste generation rates, sources, and composition, as well as future trends of waste. Further on, solutions for solid waste management will be reviewed in the discussion of sustainable waste management. The paper emphasizes the concept of waste prevention through utilization of all wastes as process inputs, leading to the possibility of creating an ecosystem in a loop of materials. Also, a case study focusing on the citrus processing industry is presented to illustrate the application of the aggregated material input-output model in a widespread processing industry in ASEAN. The model can be shown as a closed cluster, which permits an identification of opportunities for reducing environmental impacts at the process level in the food processing industry. Throughout the discussion in this paper, the utilization of renewable energy and economic aspects are considered to adapt to environmental and economic issues and the aim of eco-efficiency. Additionally, the opportunities and constraints of waste management will be discussed.

  20. Shared communication processes within healthcare teams for rare diseases and their influence on healthcare professionals' innovative behavior and patient satisfaction

    PubMed Central

    2011-01-01

    Background A rare disease is a pattern of symptoms that afflicts less than five in 10,000 patients. However, as about 6,000 different rare disease patterns exist, they still have significant epidemiological relevance. We focus on rare diseases that affect multiple organs and thus demand that multidisciplinary healthcare professionals (HCPs) work together. In this context, standardized healthcare processes and concepts are mainly lacking, and a deficit of knowledge induces uncertainty and ambiguity. As such, individualized solutions for each patient are needed. This necessitates an intensive level of innovative individual behavior and thus, adequate idea generation. The final implementation of new healthcare concepts requires the integration of the expertise of all healthcare team members, including that of the patients. Therefore, knowledge sharing between HCPs and shared decision making between HCPs and patients are important. The objective of this study is to assess the contribution of shared communication and decision-making processes in patient-centered healthcare teams to the generation of innovative concepts and consequently to improvements in patient satisfaction. Methods A theoretical framework covering interaction processes and explorative outcomes, and using patient satisfaction as a measure for operational performance, was developed based on healthcare management, innovation, and social science literature. This theoretical framework forms the basis for a three-phase, mixed-method study. Exploratory phase I will first involve collecting qualitative data to detect central interaction barriers within healthcare teams. The results are related back to theory, and testable hypotheses will be derived. Phase II then comprises the testing of hypotheses through a quantitative survey of patients and their HCPs in six different rare disease patterns. For each of the six diseases, the sample should comprise an average of 30 patients with six HCP per patient-centered healthcare team. Finally, in phase III, qualitative data will be generated via semi-structured telephone interviews with patients to gain a deeper understanding of the communication processes and initiatives that generate innovative solutions. Discussion The findings of this proposed study will help to elucidate the necessity of individualized innovative solutions for patients with rare diseases. Therefore, this study will pinpoint the primary interaction and communication processes in multidisciplinary teams, as well as the required interplay between exploratory outcomes and operational performance. Hence, this study will provide healthcare institutions and HCPs with results and information essential for elaborating and implementing individual care solutions through the establishment of appropriate interaction and communication structures and processes within patient-centered healthcare teams. PMID:21510848

  1. Shared communication processes within healthcare teams for rare diseases and their influence on healthcare professionals' innovative behavior and patient satisfaction.

    PubMed

    Hannemann-Weber, Henrike; Kessel, Maura; Budych, Karolina; Schultz, Carsten

    2011-04-21

    A rare disease is a pattern of symptoms that afflicts less than five in 10,000 patients. However, as about 6,000 different rare disease patterns exist, they still have significant epidemiological relevance. We focus on rare diseases that affect multiple organs and thus demand that multidisciplinary healthcare professionals (HCPs) work together. In this context, standardized healthcare processes and concepts are mainly lacking, and a deficit of knowledge induces uncertainty and ambiguity. As such, individualized solutions for each patient are needed. This necessitates an intensive level of innovative individual behavior and thus, adequate idea generation. The final implementation of new healthcare concepts requires the integration of the expertise of all healthcare team members, including that of the patients. Therefore, knowledge sharing between HCPs and shared decision making between HCPs and patients are important. The objective of this study is to assess the contribution of shared communication and decision-making processes in patient-centered healthcare teams to the generation of innovative concepts and consequently to improvements in patient satisfaction. A theoretical framework covering interaction processes and explorative outcomes, and using patient satisfaction as a measure for operational performance, was developed based on healthcare management, innovation, and social science literature. This theoretical framework forms the basis for a three-phase, mixed-method study. Exploratory phase I will first involve collecting qualitative data to detect central interaction barriers within healthcare teams. The results are related back to theory, and testable hypotheses will be derived. Phase II then comprises the testing of hypotheses through a quantitative survey of patients and their HCPs in six different rare disease patterns. For each of the six diseases, the sample should comprise an average of 30 patients with six HCP per patient-centered healthcare team. Finally, in phase III, qualitative data will be generated via semi-structured telephone interviews with patients to gain a deeper understanding of the communication processes and initiatives that generate innovative solutions. The findings of this proposed study will help to elucidate the necessity of individualized innovative solutions for patients with rare diseases. Therefore, this study will pinpoint the primary interaction and communication processes in multidisciplinary teams, as well as the required interplay between exploratory outcomes and operational performance. Hence, this study will provide healthcare institutions and HCPs with results and information essential for elaborating and implementing individual care solutions through the establishment of appropriate interaction and communication structures and processes within patient-centered healthcare teams.

  2. Space Network Control Conference on Resource Allocation Concepts and Approaches

    NASA Technical Reports Server (NTRS)

    Moe, Karen L. (Editor)

    1991-01-01

    The results are presented of the Space Network Control (SNC) Conference. In the late 1990s, when the Advanced Tracking and Data Relay Satellite System is operational, Space Network communication services will be supported and controlled by the SNC. The goals of the conference were to survey existing resource allocation concepts and approaches, to identify solutions applicable to the Space Network, and to identify avenues of study in support of the SNC development. The conference was divided into three sessions: (1) Concepts for Space Network Allocation; (2) SNC and User Payload Operations Control Center (POCC) Human-Computer Interface Concepts; and (3) Resource Allocation Tools, Technology, and Algorithms. Key recommendations addressed approaches to achieving higher levels of automation in the scheduling process.

  3. Security Systems Consideration: A Total Security Approach

    NASA Astrophysics Data System (ADS)

    Margariti, S. V.; Meletiou, G.; Stergiou, E.; Vasiliadis, D. C.; Rizos, G. E.

    2007-12-01

    The "safety" problem for protection systems is to determine, in a given situation, whether a subject can acquire a particular right to an object. Security and audit operations address the process of securing applications in computing and network environments; however, storage security has been somewhat overlooked in favor of other security solutions. This paper identifies issues for data security, threats and attacks, summarizes security concepts and relationships, and also describes storage security strategies. It concludes with a recommended storage security plan for a total security solution.

  4. Software Acquisition: Evolution, Total Quality Management, and Applications to the Army Tactical Missile System

    DTIC Science & Technology

    1992-06-01

    presents the concept of software Total Quality Management (TQM), which focuses on the entire process of software acquisition, as a partial solution to...software TQM can be applied to software acquisition. Software Development, Software Acquisition, Total Quality Management (TQM), Army Tactical Missile

  5. Not Just for Computation: Basic Calculators Can Advance the Process Standards

    ERIC Educational Resources Information Center

    Moss, Laura J.; Grover, Barbara W.

    2007-01-01

    Simple nongraphing calculators can be powerful tools to enhance students' conceptual understanding of mathematics concepts. Students have opportunities to develop (1) a broad repertoire of problem-solving strategies by observing multiple solution strategies; (2) respect for other students' abilities and ways of thinking about mathematics; (3) the…

  6. Managing Returns in a Catalog Distribution Center

    ERIC Educational Resources Information Center

    Gates, Joyce; Stuart, Julie Ann; Bonawi-tan, Winston; Loehr, Sarah

    2004-01-01

    The research team of the Purdue University in the United States developed an algorithm that considers several different factors, in addition to cost, to help catalog distribution centers process their returns more efficiently. A case study to teach the students important concepts involved in developing a solution to the returns disposition problem…

  7. A Template-Based Short Course Concept on Android Application Development

    ERIC Educational Resources Information Center

    Akopian, David; Melkonyan, Arsen; Golgani, Santosh C.; Yuen, Timothy T.; Saygin, Can

    2013-01-01

    Smartphones are a common accessory to provide rich user experience due to superior memory, advanced software-hardware support, fast processing, and multimedia capabilities. Responding to this trend, advanced engineering systems tend to integrate mobile devices with their solutions to facilitate usability. With many young students showing interest…

  8. Middle School Children's Problem-Solving Behavior: A Cognitive Analysis from a Reading Comprehension Perspective

    ERIC Educational Resources Information Center

    Pape, Stephen J.

    2004-01-01

    Many children read mathematics word problems and directly translate them to arithmetic operations. More sophisticated problem solvers transform word problems into object-based or mental models. Subsequent solutions are often qualitatively different because these models differentially support cognitive processing. Based on a conception of problem…

  9. Teaching the Concept of Gibbs Energy Minimization through Its Application to Phase-Equilibrium Calculation

    ERIC Educational Resources Information Center

    Privat, Romain; Jaubert, Jean-Noël; Berger, Etienne; Coniglio, Lucie; Lemaitre, Cécile; Meimaroglou, Dimitrios; Warth, Valérie

    2016-01-01

    Robust and fast methods for chemical or multiphase equilibrium calculation are routinely needed by chemical-process engineers working on sizing or simulation aspects. Yet, while industrial applications essentially require calculation tools capable of discriminating between stable and nonstable states and converging to nontrivial solutions,…

  10. Development of an Electrochemistry Teaching Sequence using a Phenomenographic Approach

    NASA Astrophysics Data System (ADS)

    Rodriguez-Velazquez, Sorangel

    Electrochemistry is the area of chemistry that studies electron transfer reactions across an interface. Chemistry education researchers have acknowledged that difficulties in electrochemistry instruction arise due to the level of abstraction of the topic, the lack of adequate explanations and representations found in textbooks, and a quantitative emphasis in the application of concepts. Studies have identified conceptions (also referred to as misconceptions, alternative conceptions, etc.) about the electrochemical process that transcend academic and preparation levels (e.g., students and instructors) as well as cultural and educational settings. Furthermore, conceptual understanding of the electrochemical process requires comprehension of concepts usually studied in physics, such as electric current, resistance and potential, and often neglected in introductory chemistry courses. The lack of understanding of physical concepts leads to students' conceptions with regard to the relation between the concepts of redox reactions and electric circuits. The need for instructional materials to promote conceptual understanding of the electrochemical process motivated the development of the electrochemistry teaching sequence presented in this dissertation. Teaching sequences are educational tools that aim to bridge the gap between student conceptions and the scientifically acceptable conceptions that instructors expect students to learn. This teaching sequence explicitly addresses known conceptions in electrochemistry and departs from traditional instruction in electrochemistry to reinforce students' previous knowledge in thermodynamics, providing the foundation for the explicit relation of redox reactions and electric circuits during electrochemistry instruction. The scientific foundations of the electrochemical process are explained based on the Gibbs free energy (G) involved rather than on the standard redox potential values (E°ox/red) of redox half-reactions. Representations of the core concepts from discipline-specific models and theories serve as visual tools to describe reversible redox half-reactions at equilibrium, predict the spontaneity of the electrochemical process and explain interfacial equilibrium between redox species and electrodes in solution. The integration of physics concepts into electrochemistry instruction facilitated describing the interactions between the chemical system (e.g., redox species) and the external circuit (e.g., voltmeter). The "Two worlds" theoretical framework was chosen to anchor a robust educational design where the world of objects and events is deliberately connected to the world of theories and models. The core concepts in Marcus theory and density of states (DOS) provided the scientific foundations to connect both worlds. The design of this teaching sequence involved three phases: the selection of the content to be taught, the determination of a coherent and explicit connection among concepts, and the development of educational activities to engage students in the learning process. The reduction-oxidation and electrochemistry chapters of three of the most popular general chemistry textbooks were reviewed in order to identify potential gaps during instruction, taking into consideration learning and teaching difficulties. The electrochemistry curriculum was decomposed into manageable sections contained in modules. Thirteen modules were developed, and each module addresses specific conceptions with regard to terminology, redox reactions in electrochemical cells, and the function of the external circuit in the electrochemical process. The electrochemistry teaching sequence was evaluated using a phenomenographic approach. This approach allows for describing the qualitative variation in instructors' consciousness about the teaching of electrochemistry. A phenomenographic analysis revealed that the most relevant aspect of variation came from the instructors' expertise. Participant A's expertise (as an electrochemist) promoted in-depth discussions of the fundamental theories and models that explain the electrochemical process, while participant B's expertise (in general chemistry instruction) emphasized a coherent and explicit presentation of such theories and models to students. Other categories of variation were identified as: recognizing students' conceptions, the use of teaching resources, and instructors' expectations for the teaching sequence. For example, while participant B depended heavily on representations and explanations found in textbooks, participant A recognized misleading representations and oversimplified statements in general chemistry textbooks. Participant A was also more inclined to question the significance of some conceptions, such as the correlation between the use of the term circuit and students' conceptions related to the movement of electrons in solution in an electrochemical cell. The electrochemistry teaching sequence in this dissertation fulfils each of the instructors' expectations with regard to content that incorporates discipline-specific theories and models, explicit connections and flow among concepts, and addressing students' conceptions via the educational activities developed.

  11. Concept mapping as an approach for expert-guided model building: The example of health literacy.

    PubMed

    Soellner, Renate; Lenartz, Norbert; Rudinger, Georg

    2017-02-01

    Concept mapping served as the starting point for the aim of capturing the comprehensive structure of the construct of 'health literacy.' Ideas about health literacy were generated by 99 experts and resulted in 105 statements that were subsequently organized by 27 experts in an unstructured card sorting. Multidimensional scaling was applied to the sorting data, and two- and three-dimensional solutions were computed. The three-dimensional solution was used in a subsequent cluster analysis and resulted in a concept map of nine "clusters": (1) self-regulation, (2) self-perception, (3) proactive approach to health, (4) basic literacy and numeracy skills, (5) information appraisal, (6) information search, (7) health care system knowledge and acting, (8) communication and cooperation, and (9) beneficial personality traits. Subsequently, this concept map served as a starting point for developing a "qualitative" structural model of health literacy and a questionnaire for the measurement of health literacy. On the basis of questionnaire data, a "quantitative" structural model was created by first applying exploratory factor analyses (EFA) and then cross-validating the model with confirmatory factor analyses (CFA). Concept mapping proved to be a highly valuable tool for the process of model building up to translational research in the "real world". Copyright © 2016 Elsevier Ltd. All rights reserved.
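
    The analysis pipeline described above (card sorting data, then multidimensional scaling, then cluster analysis) can be sketched as follows; the dissimilarity construction from the card sorts, the toy data and the cluster count are illustrative assumptions, not the authors' exact procedure.

    ```python
    import numpy as np
    from sklearn.manifold import MDS
    from sklearn.cluster import AgglomerativeClustering

    def cosort_dissimilarity(sorts, n_items):
        """Dissimilarity between statements = fraction of sorters who did NOT
        place the pair in the same pile. `sorts` holds one pile label per
        statement per sorter."""
        together = np.zeros((n_items, n_items))
        for labels in sorts:
            labels = np.asarray(labels)
            together += (labels[:, None] == labels[None, :]).astype(float)
        return 1.0 - together / len(sorts)

    # Toy data: 6 statements, 4 sorters (the study used 105 statements, 27 sorters)
    sorts = [[0, 0, 0, 1, 1, 2],
             [0, 0, 1, 1, 1, 2],
             [0, 0, 0, 2, 2, 1],
             [1, 1, 1, 0, 0, 2]]
    D = cosort_dissimilarity(sorts, n_items=6)

    # Three-dimensional MDS configuration, as in the reported analysis
    coords = MDS(n_components=3, dissimilarity="precomputed",
                 random_state=0).fit_transform(D)

    # Hierarchical clustering of the MDS coordinates into concept-map clusters
    labels = AgglomerativeClustering(n_clusters=3).fit_predict(coords)
    print(labels)
    ```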

  12. [From program to metaphor--managing the needs of children in a separation or divorce process involving their parents].

    PubMed

    Frey, E

    2000-02-01

    In German-speaking regions there are several independent intervention programs, derived from concepts originating in the USA, which are designed to assist children whose parents are separated or divorced. Two of these programs will be presented here. Consistent with the thematic implication that divorce entails trauma and stress, the children's need for advice and help is placed at the conceptual center of the intervention. This results in a behavioral-cognitive training program, which should enable children to overcome the stress of their situation. In contrast to these two programs, an understanding of divorce may be achieved through phenomenological analysis, which gives due consideration to the various aspects and meanings of the divorce process. The concept of intervention derived from this approach is to assist children in further developing their own search for solutions, focused on the parent-child relation. The primary emphasis of this approach is not the children's difficulty in dealing with their parents' separation and divorce, but rather their own attempt to deal with the problem, as is visible in the metaphor of their spontaneous descriptions and images of experiences and events. The central concept of the proposed course is the further development of this creative process in the form of a dynamic-communicative group happening. Furthermore, it is shown how children can be assisted in a practical way and encouraged to create their individual and personally adequate solution to the experience of divorce.

  13. Preliminary Feasibility Testing of the BRIC Brine Water Recovery Concept

    NASA Technical Reports Server (NTRS)

    Callahan, Michael R.; Pensinger, Stuart J.; Pickering, Karen D.

    2012-01-01

    The Brine Residual In-Containment (BRIC) concept is being developed as a new technology to recover water from spacecraft wastewater brines. Such a capability is considered critical to closing the water loop and achieving a sustained human presence in space. The intention of the BRIC concept is to increase the robustness and efficiency of the dewatering process by performing drying inside the container used for the final disposal of the residual brine solids. Recent efforts in the development of BRIC have focused on preliminary feasibility testing using a laboratory-assembled pre-prototype unit. Observations of the drying behavior of actual brine solutions processed under BRIC-like conditions have been of particular interest. To date, experiments conducted with three types of analogue spacecraft wastewater brines have confirmed the basic premise behind the proposed application of in-place drying. Specifically, the dried residual mass from these solutions has tended to exhibit characteristics of adhesion and flow that are expected to continue to challenge the process stream management designs typically used in spacecraft systems. Yet these same characteristics may favor the development of capillary- and surface-tension-based approaches currently envisioned as part of an ultimate microgravity-compatible BRIC design. In addition, preliminary feasibility testing of the BRIC pre-prototype confirmed that high rates of water recovery, up to 98% of the available brine water, may be possible while still removing the majority of the brine contaminants from the influent brine stream. These and other early observations from testing are reported.

  14. A service concept and tools to improve maternal and newborn health in Nigeria and Uganda.

    PubMed

    Salgado, Mariana; Wendland, Melanie; Rodriguez, Damaris; Bohren, Meghan A; Oladapo, Olufemi T; Ojelade, Olubunmi A; Mugerwa, Kidza; Fawole, Bukola

    2017-12-01

    The "Better Outcomes in Labor Difficulty" (BOLD) project used a service design process to design a set of tools to improve quality of care during childbirth by strengthening linkages between communities and health facilities in Nigeria and Uganda. This paper describes the Passport to Safer Birth concept and the tools developed as a result. Service design methods were used to identify facilitators and barriers to quality care, and to develop human-centered solutions. The service design process had three phases: Research for Design, Concept Design, and Detail Design, undertaken in eight hospitals and catchment communities. The service concept "Better Beginnings" comprises three tools. The "Pregnancy Purse" provides educational information to women throughout pregnancy. The "Birth Board" is a visual communication tool that presents the labor and childbirth process. The "Family Pass" is a set of wearable passes for the woman and her supporter to facilitate communication of care preferences. The Better Beginnings service concept and tools form the basis for the promotion of access to information and knowledge acquisition, and could improve communication between the healthcare provider, the woman, and her family during childbirth. © 2017 International Federation of Gynecology and Obstetrics. The World Health Organization retains copyright and all other rights in the manuscript of this article as submitted for publication.

  15. The Tool for Designing Engineering Systems Using a New Optimization Method Based on a Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    Conventional optimization methods are based on a deterministic approach, since their purpose is to find an exact solution. However, these methods suffer from initial-condition dependence and the risk of falling into local solutions. In this paper, we propose a new optimization method based on the concept of the path integral method used in quantum mechanics. The method obtains a solution as an expected value (stochastic average) using a stochastic process. The advantages of this method are that it is not affected by initial conditions and does not require techniques based on experience. We applied the new optimization method to the design of a hang glider. In this problem, not only the hang glider design but also its flight trajectory was optimized. The numerical calculation results showed that the method has sufficient performance.

  16. Orbit Determination Strategy and Simulation Performance for OSIRIS-REx Proximity Operations

    NASA Technical Reports Server (NTRS)

    Leonard, Jason M.; Antreasian, Peter G.; Jackman, Coralie D.; Page, Brian; Wibben, Daniel R.; Moreau, Michael C.

    2017-01-01

    The Origins Spectral Interpretation Resource Identification Security Regolith Explorer (OSIRIS-REx) is a NASA New Frontiers mission to the near-Earth asteroid Bennu that will rendezvous in 2018, create a comprehensive and detailed set of observations over several years, collect a regolith sample, and return the sample to Earth in 2023. The Orbit Determination (OD) team is a sub-section of the Flight Dynamics System responsible for generating precise reconstructions and predictions of the spacecraft trajectory. The OD team processes radiometric data, LIDAR, as well as center-finding and landmark-based Optical Navigation images throughout the proximity operations phase to estimate and predict the spacecraft location to within several meters. Stringent knowledge requirements stress the OD team's concept of operations and procedures to produce verified and consistent high-quality solutions for observation planning, maneuver planning, and onboard sequencing. This paper will provide insight into the OD concept of operations and summarize the OD performance expected during the approach and early proximity operation phases, based on our pre-encounter knowledge of Bennu. Strategies and methods used to compare and evaluate predicted and reconstructed solutions are detailed. The use of high-fidelity operational tests during early 2017 will stress the team's concept of operations and ability to produce precise OD solutions with minimal turn-around delay.

  17. An Investigation of Effectiveness of Conceptual Change Text-oriented Instruction on Students' Understanding of Solution Concepts

    NASA Astrophysics Data System (ADS)

    Pinarbaşi, Tacettin; Canpolat, Nurtaç; Bayrakçeken, Samih; Geban, Ömer

    2006-12-01

    This study investigated the effect of conceptual change text-oriented instruction, compared with traditional instruction, on students' understanding of solution concepts (e.g., dissolving, solubility, factors affecting solubility, concentrations of solutions, types of solutions, physical properties of solutions) and their attitudes towards chemistry. The sample of this study consisted of 87 undergraduate students from two classes enrolled in an introductory chemistry course. One of the classes was assigned randomly to the control group, and the other class was assigned randomly to the experimental group. While teaching the topic of solution concepts in the chemistry curriculum, conceptual change text-oriented instruction was applied in the experimental group whereas traditional instruction was followed in the control group. The results showed that the students in the experimental group performed better with respect to solution concepts. In addition, it was found that there was no significant difference between the attitudes of students in the experimental and control groups towards chemistry.

  18. Infant feeding: the interfaces between interaction design and cognitive ergonomics in user-centered design.

    PubMed

    Lima, Flavia; Araújo, Lilian Kely

    2012-01-01

    This text presents a discussion of the process of developing interactive products focused on infant behavior, the result of which was an interactive game for encouraging infant feeding. To that end, it describes the use of cognitive psychology concepts combined with an interaction design methodology. Through this project, this article shows how the cooperative use of these concepts provides solutions adherent to users' needs, whatever they are. Besides that, it verifies the closeness of those methodologies to boundary areas of knowledge, such as design focused on the user and ergonomics.

    A&R challenges for in-space operations. [Automation and Robotic technologies]

    NASA Technical Reports Server (NTRS)

    Underwood, James

    1990-01-01

    Automation and robotics (A&R) challenges for in-space operations are examined, with emphasis on the interaction between developing requirements, developing solutions, design concepts, and the nature of the applicability of automation and robotic technologies. Attention is first given to the use of A&R in establishing outposts on the Moon and Mars. Then emphasis is placed on the requirements for the assembly of transportation systems in low Earth orbit. Concepts of the Space Station which show how the assembly, processing, and checkout of systems in LEO might be accommodated are examined.

  20. CFD Code Validation of Wall Heat Fluxes for a GO2/GH2 Single Element Combustor

    NASA Technical Reports Server (NTRS)

    Lin, Jeff; West, Jeff S.; Williams, Robert W.; Tucker, P. Kevin

    2005-01-01

    This paper puts forth the case for the need for improved injector design tools to meet NASA's Vision for Space Exploration goals. Requirements for this improved tool are outlined and discussed. The potential for Computational Fluid Dynamics (CFD) to meet these requirements is noted along with its current shortcomings, especially relative to demonstrated solution accuracy. The concept of verification and validation is introduced as the primary process for building and quantifying the confidence necessary for CFD to be useful as an injector design tool. The verification and validation process is considered in the context of the Marshall Space Flight Center (MSFC) Combustion Devices CFD Simulation Capability Roadmap via the Simulation Readiness Level (SRL) concept. The portion of the validation process which demonstrates the ability of a CFD code to simulate heat fluxes to a rocket engine combustor wall is the focus of the current effort. The FDNS and Loci-CHEM codes are used to simulate a shear coaxial single element GO2/GH2 injector experiment. The experiment was conducted at a chamber pressure of 750 psia using hot propellants from preburners. A measured wall temperature profile is used as a boundary condition to facilitate the calculations. Converged solutions, obtained from both codes by using wall functions with the k-epsilon turbulence model and by integrating to the wall using Menter's baseline turbulence model, are compared to the experimental data. The initial solutions from both codes revealed significant issues with the wall function implementation associated with the recirculation zone between the shear coaxial jet and the chamber wall. The FDNS solution with a corrected implementation shows marked improvement in overall character and level of comparison to the data. With the FDNS code, integrating to the wall with Menter's baseline turbulence model actually produced a degraded solution when compared to the wall function solution with the k-epsilon model. The Loci-CHEM solution, produced by integrating to the wall with Menter's baseline turbulence model, matches both the heat flux rise rate in the near-injector region and the peak heat flux level very well. However, it moderately overpredicts the heat fluxes downstream of the reattachment point. The Loci-CHEM solution achieved by integrating to the wall with Menter's baseline turbulence model was clearly superior to the other solutions produced in this effort.

  1. Pseudomaster equation for the no-count process in a continuous photodetection

    NASA Technical Reports Server (NTRS)

    Lee, Ching-Tsung

    1994-01-01

    The detection of cavity radiation with the detector placed outside the cavity is studied. Each leaked photon has a certain probability of propagating away without being detected. It is viewed as a continuous quantum measurement in which the density matrix is continuously revised according to the readout of the detector. The concept of pseudomaster equation for the no-count process is introduced; its solution leads to the discovery of the superoperator for the same process. It has the potential to become the key equation for continuous measurement process.

  2. Bombs Away: Visual Thinking and Students' Engagement in Design Studios Contexts

    ERIC Educational Resources Information Center

    Chamorro-Koc, Marianella; Scott, Andrew; Coombs, Gretchen

    2015-01-01

    In design studio, sketching or visual thinking is part of processes that assist students to achieve final design solutions. At Queensland University of Technology's (QUT's) First and Third Year industrial design studio classes we engage in a variety of teaching pedagogies from which we identify "Concept Bombs" as instrumental in the…

  3. Cognitive Activity-based Design Methodology for Novice Visual Communication Designers

    ERIC Educational Resources Information Center

    Kim, Hyunjung; Lee, Hyunju

    2016-01-01

    The notion of design thinking is becoming more concrete nowadays, as design researchers and practitioners study the thinking processes involved in design and employ the concept of design thinking to foster better solutions to complex and ill-defined problems. The goal of the present research is to develop a cognitive activity-based design…

  4. Fracture mechanics and parapsychology

    NASA Astrophysics Data System (ADS)

    Cherepanov, G. P.

    2010-08-01

    The problem of postcritical deformation of materials beyond the ultimate strength is considered a division of fracture mechanics. A simple example is used to show the relationship between this problem and parapsychology, which studies phenomena and processes where the causality principle fails. It is shown that the concept of postcritical deformation leads to problems with no solution.

  5. Direct microscopic observation of forward osmosis membrane fouling.

    PubMed

    Wang, Yining; Wicaksana, Filicia; Tang, Chuyang Y; Fane, Anthony G

    2010-09-15

    This study describes the application of a noninvasive direct microscopic observation method for characterizing fouling of a forward osmosis (FO) membrane. The effect of the draw solution concentration, membrane orientation, and feed spacer on FO fouling was systematically investigated in a cross-flow setup using latex particles as model foulant in the feedwater. Higher draw solution (DS) concentrations (and thus increased flux levels) resulted in dramatic increase in the surface coverage by latex particles, suggesting that the critical flux concept might be applicable even for the osmotically driven FO process. Under identical draw solution concentrations, the active-layer-facing-the-feed-solution orientation (AL-FS) experienced significantly less fouling compared to the alternative orientation. This may be explained by the lower water flux in AL-FS, which is consistent with the critical flux concept. The use of a feed spacer not only dramatically enhanced the initial flux of the FO membrane, but also significantly improved the flux stability during FO fouling. Despite such beneficial effects of using the feed spacer, a significant amount of particle accumulation was found near the spacer filament, suggesting further opportunities for improved spacer design. To the best of the authors' knowledge, this is the first direct microscopic observation study on FO fouling.

  6. Glass transition of aqueous solutions involving annealing-induced ice recrystallization resolves liquid-liquid transition puzzle of water

    PubMed Central

    Zhao, Li-Shan; Cao, Ze-Xian; Wang, Qiang

    2015-01-01

    Liquid-liquid transition of water is an important concept in condensed-matter physics. Recently, it was claimed to have been confirmed in aqueous solutions based on an annealing-induced upshift of the glass-liquid transition temperature, Tg. Here we report a universal water-content dependence of Tg for aqueous solutions. Solutions with vitrify/devitrify at a constant temperature, , referring to freeze-concentrated phase with left behind ice crystallization. Those solutions with totally vitrify at under conventional cooling/heating process though, of the samples annealed at temperatures to effectively evoke ice recrystallization is stabilized at . Experiments on aqueous glycerol and 1,2,4-butanetriol solutions in the literature were repeated, and the same samples subjected to other annealing treatments equally reproduce the result. The upshift of Tg by annealing is attributable to the freeze-concentrated phase of the solutions instead of a 'liquid II phase of water'. Our work also provides a reliable method to determine hydration formulae and to scrutinize solute-solvent interactions in solution. PMID:26503911

  7. Towards a future robotic home environment: a survey.

    PubMed

    Güttler, Jörg; Georgoulas, Christos; Linner, Thomas; Bock, Thomas

    2015-01-01

    Demographic change has resulted in an increase of elderly people, while at the same time the number of active working people is falling. In the future, there will be less caretaking, which is necessary to support the aging population. In order to enable the aged population to live in dignity, they should be able to perform activities of daily living (ADLs) as independently as possible. The aim of this paper is to describe several solutions and concepts that can support elderly people in their ADLs in a way that allows them to stay self-sufficient for as long as possible. To reach this goal, the Building Realization and Robotics Lab is researching in the field of ambient assisted living. The idea is to implement robots and sensors in the home environment so as to efficiently support the inhabitants in their ADLs and eventually increase their independence. Through embedding vital sensors into furniture and using ICT technologies, the health status of elderly people can be remotely evaluated by a physician or family members. By investigating ergonomic aspects specific to elderly people (e.g. via an age-simulation suit), it is possible to develop and test new concepts and novel applications, which will offer innovative solutions. Via the introduction of mechatronics and robotics, the home environment can be made able to seamlessly interact with the inhabitant through gestures, vocal commands, and visual recognition algorithms. Meanwhile, several solutions have been developed that address how to build a smart home environment in order to create an ambient assisted environment. This article describes how these concepts were developed. The approach for each concept, proposed in this article, was performed as follows: (1) research of needs, (2) creating definitions of requirements, (3) identification of necessary technology and processes, (4) building initial concepts, (5) experiments in a real environment, and (6) development of the final concepts. To keep these concepts cost-effective, the suggested solutions are modular. Therefore, it will be possible to straightforwardly install the proposed devices in an existing home environment in a 'plug and play' manner once the terminals can be prefabricated off-site. This article shows a variety of concepts that have been developed to support elderly people in their ADLs. The prototypes of the proposed concepts in this paper have been tested with elderly people. The results of the tests show that robots embedded in furniture, walls, ceiling, etc. offer enhanced support, properly addressing elderly as well as disabled people to individually and independently manage their ADLs. In order to make the concepts realizable in terms of cost, it will be necessary to standardize and modularize these concepts for industrial fabrication. © 2014 S. Karger AG, Basel

  8. Preliminary study of fusion reactor: Solution of Grad-Shafranov equation

    NASA Astrophysics Data System (ADS)

    Setiawan, Y.; Fermi, N.; Su'ud, Z.

    2012-06-01

    Nuclear fusion is a prospective energy source for the future due to the abundance of fuel, and it can be categorized as a clean energy source. The problem is how to contain very hot plasma, with temperatures of a few hundred million degrees, safely and reliably. Tokamak-type fusion reactors are considered the most prospective concept. To analyze the plasma confinement process and its movement, the Grad-Shafranov equation must be solved. This paper discusses the solution of the Grad-Shafranov equation using Whittaker functions. The formulation is then applied to the ITER design as an example.
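
    As a complement to the analytic Whittaker-function treatment referenced above, a minimal numerical sketch of the Grad-Shafranov equation is shown below; this is a finite-difference relaxation with a simple Solov'ev-type right-hand side on an illustrative rectangular domain, not the paper's method, and the constants and grid are assumptions.

    ```python
    import numpy as np

    def solve_grad_shafranov(R0=1.0, a=0.3, b=0.4, c1=1.0, c2=0.5,
                             n=65, iters=10000):
        """Relaxation solution of the Grad-Shafranov equation
            d2psi/dR2 - (1/R) dpsi/dR + d2psi/dZ2 = -(c1*R**2 + c2)
        (a Solov'ev-type source) on the rectangle [R0-a, R0+a] x [-b, b]
        with psi = 0 on the boundary."""
        R = np.linspace(R0 - a, R0 + a, n)
        Z = np.linspace(-b, b, n)
        dR, dZ = R[1] - R[0], Z[1] - Z[0]
        psi = np.zeros((n, n))                    # psi[i, j] at (R[i], Z[j])
        src = -(c1 * R[:, None] ** 2 + c2) * np.ones((1, n))
        diag = 2.0 / dR**2 + 2.0 / dZ**2
        for _ in range(iters):
            E, W = psi[2:, 1:-1], psi[:-2, 1:-1]  # neighbours in R
            N, S = psi[1:-1, 2:], psi[1:-1, :-2]  # neighbours in Z
            rhs = ((E + W) / dR**2
                   - (E - W) / (2.0 * R[1:-1, None] * dR)
                   + (N + S) / dZ**2
                   - src[1:-1, 1:-1])
            psi[1:-1, 1:-1] = rhs / diag          # Jacobi update of interior points
        return R, Z, psi

    R, Z, psi = solve_grad_shafranov()
    print("max psi on grid:", psi.max())
    ```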

  9. Improved waste water vapor compression distillation technology. [for Spacelab

    NASA Technical Reports Server (NTRS)

    Johnson, K. L.; Nuccio, P. P.; Reveley, W. F.

    1977-01-01

    The vapor compression distillation process is a method of recovering potable water from crewman urine in a manned spacecraft or space station. A description is presented of the research and development approach to the solution of the various problems encountered with previous vapor compression distillation units. The design solutions considered are incorporated in the preliminary design of a vapor compression distillation subsystem. The new design concepts are available for integration in the next generation of support systems and, particularly, the regenerative life support evaluation intended for project Spacelab.

  10. Shape optimization of three-dimensional stamped and solid automotive components

    NASA Technical Reports Server (NTRS)

    Botkin, M. E.; Yang, R.-J.; Bennett, J. A.

    1987-01-01

    The shape optimization of realistic, 3-D automotive components is discussed. The integration of the major parts of the total process (modeling, mesh generation, finite element and sensitivity analysis, and optimization) is stressed. Stamped components and solid components are treated separately. For stamped parts a highly automated capability was developed. The problem description is based upon a parameterized boundary design element concept for the definition of the geometry. Automatic triangulation and adaptive mesh refinement are used to provide an automated analysis capability which requires only boundary data and takes into account the sensitivity of the solution accuracy to boundary shape. For solid components a general extension of the 2-D boundary design element concept has not been achieved. In this case, the parameterized surface shape is provided using a generic modeling concept based upon isoparametric mapping patches which also serves as the mesh generator. Emphasis is placed upon the coupling of optimization with a commercially available finite element program. To do this it is necessary to modularize the program architecture and obtain shape design sensitivities using the material derivative approach so that only boundary solution data is needed.

  11. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, F.; Kramer, S.; Koch, T.; Pfützner, B.

    2017-12-01

    Extreme weather conditions during the years 2009-2011, in combination with changes in regional water management, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive water logging, with problems especially in urban areas near rivers, where water logging caused huge problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget - groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations in consideration of sustainable and environmentally sound solutions, mainly on the basis of passive measures.

  12. Mathematics teachers' conceptions about modelling activities and its reflection on their beliefs about mathematics

    NASA Astrophysics Data System (ADS)

    Shahbari, Juhaina Awawdeh

    2018-07-01

    The current study examines whether the engagement of mathematics teachers in modelling activities and subsequent changes in their conceptions about these activities affect their beliefs about mathematics. The sample comprised 52 mathematics teachers working in small groups in four modelling activities. The data were collected from teachers' reports about the features of each activity, interviews and questionnaires on teachers' beliefs about mathematics. The findings indicated changes in teachers' conceptions about the modelling activities. Most teachers referred to the first activity as a mathematical problem but emphasized only the mathematical notions or the mathematical operations in the modelling process; changes in their conceptions were gradual. Most of the teachers referred to the fourth activity as a mathematical problem and emphasized features of the whole modelling process. The results of the interviews indicated that changes in the teachers' conceptions can be attributed to the structure of the activities, group discussions, solution paths and elicited models. These changes in conceptions about modelling activities were reflected in teachers' beliefs about mathematics. The quantitative findings indicated that the teachers developed more constructive beliefs about mathematics after engagement in the modelling activities and that the difference was significant; however, there was no significant difference regarding changes in their traditional beliefs.

  13. Solution-processed parallel tandem polymer solar cells using silver nanowires as intermediate electrode.

    PubMed

    Guo, Fei; Kubis, Peter; Li, Ning; Przybilla, Thomas; Matt, Gebhard; Stubhan, Tobias; Ameri, Tayebeh; Butz, Benjamin; Spiecker, Erdmann; Forberich, Karen; Brabec, Christoph J

    2014-12-23

    Tandem architecture is the most relevant concept to overcome the efficiency limit of single-junction photovoltaic solar cells. Series-connected tandem polymer solar cells (PSCs) have advanced rapidly during the past decade. In contrast, the development of parallel-connected tandem cells is lagging far behind due to the big challenge in establishing an efficient interlayer with high transparency and high in-plane conductivity. Here, we report all-solution fabrication of parallel tandem PSCs using silver nanowires as intermediate charge collecting electrode. Through a rational interface design, a robust interlayer is established, enabling the efficient extraction and transport of electrons from subcells. The resulting parallel tandem cells exhibit high fill factors of ∼60% and enhanced current densities which are identical to the sum of the current densities of the subcells. These results suggest that solution-processed parallel tandem configuration provides an alternative avenue toward high performance photovoltaic devices.

  14. Adaptive building skin structures

    NASA Astrophysics Data System (ADS)

    Del Grosso, A. E.; Basso, P.

    2010-12-01

    The concept of adaptive and morphing structures has gained considerable attention in recent years in many fields of engineering. In civil engineering, however, very few practical applications have been reported to date. Non-conventional structural concepts such as deployable, inflatable and morphing structures may indeed provide innovative solutions to some of the problems that the construction industry is being called to face; the search for low-energy-consumption or even energy-harvesting green buildings is one example. This paper first presents a review of the above problems and technologies, which shows how the solution to these problems requires a multidisciplinary approach involving the integration of architectural and engineering disciplines. The discussion continues with the presentation of a possible application of two adaptive and dynamically morphing structures which are proposed for the realization of an acoustic envelope. The core of the two applications is the use of a novel optimization process which guides the search for optimal solutions by means of an evolutionary technique, while the compatibility of the resulting configurations of the adaptive envelope is ensured by the virtual force density method.

  15. An Outlook on Biothermodynamics: Needs, Problems, and New Developments. I. Stability and Hydration of Proteins

    NASA Astrophysics Data System (ADS)

    Keller, Jürgen U.

    2008-12-01

    The application of concepts, principles, and methods of thermodynamics of equilibria and processes to bioengineering systems has led to a new and growing field: engineering biothermodynamics. This article, which is meant as the first in a series, gives an outline of basic aspects, changes, and actual examples in this field. After a few introductory remarks, the basic concepts and laws of thermodynamics extended to systems with internal variables, which serve as models for biofluids and other biosystems, are given. The method of thermodynamics is then applied to the problem of thermal stability of aqueous protein solutions, especially to that of myoglobin solutions. After this, the phenomenon of hydration of proteins by adsorption and intrusion of water molecules is considered. Several other phenomena like the adsorption of proteins on solid surfaces or cell membranes and their temperature and pressure-related behavior represented by an equation of state, or the thermodynamics of bacterial solutions including chemical reactions like wine fermentation, etc., will be presented in Parts II and III of this article.

  16. Ensembles of NLP Tools for Data Element Extraction from Clinical Notes

    PubMed Central

    Kuo, Tsung-Ting; Rao, Pallavi; Maehara, Cleo; Doan, Son; Chaparro, Juan D.; Day, Michele E.; Farcas, Claudiu; Ohno-Machado, Lucila; Hsu, Chun-Nan

    2016-01-01

    Natural Language Processing (NLP) is essential for concept extraction from narrative text in electronic health records (EHR). To extract numerous and diverse concepts, such as data elements (i.e., important concepts related to a certain medical condition), a plausible solution is to combine various NLP tools into an ensemble to improve extraction performance. However, it is unclear to what extent ensembles of popular NLP tools improve the extraction of numerous and diverse concepts. Therefore, we built an NLP ensemble pipeline to synergize the strengths of popular NLP tools using seven ensemble methods, and to quantify the improvement in performance achieved by ensembles in the extraction of data elements for three very different cohorts. Evaluation results show that the pipeline can improve the performance of NLP tools, but there is high variability depending on the cohort. PMID:28269947

  17. Ensembles of NLP Tools for Data Element Extraction from Clinical Notes.

    PubMed

    Kuo, Tsung-Ting; Rao, Pallavi; Maehara, Cleo; Doan, Son; Chaparro, Juan D; Day, Michele E; Farcas, Claudiu; Ohno-Machado, Lucila; Hsu, Chun-Nan

    2016-01-01

    Natural Language Processing (NLP) is essential for concept extraction from narrative text in electronic health records (EHR). To extract numerous and diverse concepts, such as data elements (i.e., important concepts related to a certain medical condition), a plausible solution is to combine various NLP tools into an ensemble to improve extraction performance. However, it is unclear to what extent ensembles of popular NLP tools improve the extraction of numerous and diverse concepts. Therefore, we built an NLP ensemble pipeline to synergize the strengths of popular NLP tools using seven ensemble methods, and to quantify the improvement in performance achieved by ensembles in the extraction of data elements for three very different cohorts. Evaluation results show that the pipeline can improve the performance of NLP tools, but there is high variability depending on the cohort.
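
    As a concrete illustration of the ensemble idea described above, the sketch below applies simple majority voting over the (span, label) pairs returned by several NLP tools; the tool outputs, labels and vote threshold are hypothetical, and the actual pipeline evaluates seven ensemble methods rather than this single one.

      from collections import Counter

      def majority_vote(tool_outputs, min_votes=2):
          # Keep an extracted (start, end, label) tuple if at least min_votes tools agree on it.
          votes = Counter()
          for output in tool_outputs:
              votes.update(output)
          return {item for item, count in votes.items() if count >= min_votes}

      # Hypothetical outputs of three NLP tools on one clinical note
      tool_a = {(10, 22, "Medication"), (40, 52, "Diagnosis")}
      tool_b = {(10, 22, "Medication"), (60, 70, "Procedure")}
      tool_c = {(40, 52, "Diagnosis"), (10, 22, "Medication")}

      print(majority_vote([tool_a, tool_b, tool_c]))
      # -> {(10, 22, 'Medication'), (40, 52, 'Diagnosis')}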

  18. Translating Vision into Design: A Method for Conceptual Design Development

    NASA Technical Reports Server (NTRS)

    Carpenter, Joyce E.

    2003-01-01

    One of the most challenging tasks for engineers is the definition of design solutions that will satisfy high-level strategic visions and objectives. Even more challenging is the need to demonstrate how a particular design solution supports the high-level vision. This paper describes a process and set of system engineering tools that have been used at the Johnson Space Center to analyze and decompose high-level objectives for future human missions into design requirements that can be used to develop alternative concepts for vehicles, habitats, and other systems. Analysis and design studies of alternative concepts and approaches are used to develop recommendations for strategic investments in research and technology that support the NASA Integrated Space Plan. In addition to a description of system engineering tools, this paper includes a discussion of collaborative design practices for human exploration mission architecture studies used at the Johnson Space Center.

  19. Clinical reasoning and its application to nursing: concepts and research studies.

    PubMed

    Banning, Maggi

    2008-05-01

    Clinical reasoning may be defined as "the process of applying knowledge and expertise to a clinical situation to develop a solution" [Carr, S., 2004. A framework for understanding clinical reasoning in community nursing. J. Clin. Nursing 13 (7), 850-857]. Several forms of reasoning exist, each with its own merits and uses. Reasoning involves the processes of cognition, or thinking, and metacognition. In nursing, clinical reasoning skills are an expected component of expert and competent practice. Nursing research studies have identified concepts, processes and thinking strategies that might underpin the clinical reasoning used by pre-registration nurses and experienced nurses. Much of the available research on reasoning is based on the think-aloud approach. Although this is a useful method, it depends on the ability to describe and verbalise the reasoning process. More nursing research is needed to explore the clinical reasoning process. Investment in teaching and learning methods is needed to enhance clinical reasoning skills in nurses.

  20. An enhanced reliability-oriented workforce planning model for process industry using combined fuzzy goal programming and differential evolution approach

    NASA Astrophysics Data System (ADS)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2018-03-01

    This paper draws on the "human reliability" concept as a structure for gaining insight into maintenance workforce assessment in a process industry. Human reliability hinges on developing the reliability of humans to a threshold that guides the maintenance workforce to execute accurate decisions within the limits of resources and time allocations. This concept offers a worthwhile point of departure for three adjustments to the literature model, in terms of maintenance time, workforce performance and return-on-workforce investments. The presented structure breaks new ground in maintenance workforce theory and practice from a number of perspectives. First, fuzzy goal programming (FGP) and differential evolution (DE) techniques are implemented for the solution of an optimisation problem in the maintenance of a process plant for the first time. The results obtained in this work showed better solution quality from the DE algorithm compared with those of a genetic algorithm and a particle swarm optimisation algorithm, thus demonstrating the superiority of the proposed procedure. Second, the analytical discourse, framed on stochastic theory and focusing on a specific application to a process plant in Nigeria, is a novelty. The work provides more insight into maintenance workforce planning during overhaul rework and overtime maintenance activities in manufacturing systems and generates substantially helpful information for practice.
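
    A minimal sketch of the differential evolution metaheuristic named in this record, in its textbook DE/rand/1/bin form, minimising a toy objective over box bounds; the actual fuzzy-goal workforce model and its constraints are not reproduced here, and the objective function is purely illustrative.

      import numpy as np

      def differential_evolution(objective, bounds, pop_size=20, F=0.8, CR=0.9, gens=200, seed=0):
          # Classic DE/rand/1/bin minimizer over box bounds (bounds: array of [low, high] rows).
          rng = np.random.default_rng(seed)
          lo, hi = bounds[:, 0], bounds[:, 1]
          pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
          fit = np.array([objective(x) for x in pop])
          for _ in range(gens):
              for i in range(pop_size):
                  a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
                  mutant = np.clip(a + F * (b - c), lo, hi)
                  cross = rng.random(len(lo)) < CR
                  cross[rng.integers(len(lo))] = True      # ensure at least one gene from the mutant
                  trial = np.where(cross, mutant, pop[i])
                  f_trial = objective(trial)
                  if f_trial <= fit[i]:
                      pop[i], fit[i] = trial, f_trial
          best = np.argmin(fit)
          return pop[best], fit[best]

      # Toy stand-in objective (e.g., a weighted deviation from fuzzy goal targets)
      best_x, best_f = differential_evolution(lambda x: float(np.sum((x - 3.0) ** 2)),
                                              bounds=np.array([[0.0, 10.0]] * 3))
      print(best_x, best_f)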

  1. Floaters and Sinkers: Solutions for Math and Science. Densities and Volumes. Book 5.

    ERIC Educational Resources Information Center

    Wiebe, Arthur, Ed.; And Others

    Developed to serve as a way to integrate mathematics skills and science processes, this booklet provides activities which demonstrate the concept of density for students of grades five through nine. Investigations are offered on the densities of water, salt, salt water, and woods. Opportunities are also provided in computing volumes of cylinders…

  2. Robotics Projects and Learning Concepts in Science, Technology and Problem Solving

    ERIC Educational Resources Information Center

    Barak, Moshe; Zadok, Yair

    2009-01-01

    This paper presents a study about learning and the problem solving process identified among junior high school pupils participating in robotics projects in the Lego Mindstorm environment. The research was guided by the following questions: (1) How do pupils come up with inventive solutions to problems in the context of robotics activities? (2)…

  3. Wooden Spaceships: Human-Centered Vehicle Design for Space

    NASA Technical Reports Server (NTRS)

    Twyford, Evan

    2009-01-01

    The presentation will focus on creative human-centered design solutions in relation to manned space vehicle design and development in the NASA culture. We will talk about the design process, iterative prototyping, mockup building, and user testing and evaluation. We will take an inside look at how new space vehicle concepts are developed and designed for real-life exploration scenarios.

  4. Collaborative Discourse and the Modeling of Solution Chemistry with Magnetic 3D Physical Models--Impact and Characterization

    ERIC Educational Resources Information Center

    Warfa, Abdi-Rizak M.; Roehrig, Gillian H.; Schneider, Jamie L.; Nyachwaya, James

    2014-01-01

    A significant body of the literature in science education examines students' conceptions of the dissolution of ionic solids in water, often showing that students lack proper understanding of the particulate nature of dissolving materials as well as holding numerous misconceptions about the dissolution process. Consequently, chemical educators have…

  5. Knowledge Utilization Strategies in the Design and Implementation of New Schools--Symbolic Functions.

    ERIC Educational Resources Information Center

    Sieber, Sam D.

    An examination of case studies suggests that rational processes were not entirely at work in the planning and conception of new, innovative schools. The rational model that serves as the foundation of our information systems assumes that a compelling professional need triggers a search for solutions; and, therefore, school personnel are eager to…

  6. Identification of Technologies for Provision of Future Aeronautical Communications

    NASA Technical Reports Server (NTRS)

    Gilbert, Tricia; Dyer, Glen; Henriksen, Steve; Berger, Jason; Jin, Jenny; Boci, Tony

    2006-01-01

    This report describes the process, findings, and recommendations of the second of three phases of the Future Communications Study (FCS) technology investigation conducted by NASA Glenn Research Center and ITT Advanced Engineering & Sciences Division for the Federal Aviation Administration (FAA). The FCS is a collaborative research effort between the FAA and Eurocontrol to address frequency congestion and spectrum depletion for safety-critical air/ground communications. The goal of the technology investigation is to identify technologies that can support the long-term aeronautical mobile communication operating concept. A derived set of evaluation criteria traceable to the operating concept document is presented. An adaptation of the analytical hierarchy process is described and recommended for selecting candidates for detailed evaluation. Evaluations of a subset of technologies brought forward from the prescreening process are provided. Five of those are identified as candidates with the highest potential for continental airspace solutions in L-band (P-34, W-CDMA, LDL, B-VHF, and E-TDMA). Additional technologies are identified as best performers in the unique environments of remote/oceanic airspace in the satellite bands (Inmarsat SBB and a custom satellite solution) and the airport flight domain in C-band (802.16e). Details of the evaluation criteria, channel models, and the technology evaluations are provided in appendixes.

  7. Poka Yoke system based on image analysis and object recognition

    NASA Astrophysics Data System (ADS)

    Belu, N.; Ionescu, L. M.; Misztal, A.; Mazăre, A.

    2015-11-01

    Poka Yoke is a method of quality management aimed at preventing faults from arising during production processes; it deals with "fail-safing" or "mistake-proofing". The Poka Yoke concept was generated and developed by Shigeo Shingo for the Toyota Production System. Poka Yoke is used in many fields, especially in monitoring production processes. In many cases, identifying faults in a production process involves a cost higher than the cost of disposal. Usually, Poka Yoke solutions are based on multiple sensors that identify nonconformities, which means the presence of additional equipment (mechanical, electronic) on the production line. As a consequence, and because the method itself is invasive and affects the production process, the cost of diagnostics increases, and the machines by which a Poka Yoke system can be implemented become bulkier and more sophisticated. In this paper we propose a Poka Yoke solution based on image analysis and fault identification. The solution consists of a module for image acquisition, mid-level processing, and an object recognition module using associative memory (Hopfield network type). All are integrated into an embedded system with an AD (analog-to-digital) converter and a Zynq 7000 device (22 nm technology).
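
    As an illustration of the associative-memory recognition module mentioned above, the sketch below implements a tiny Hopfield-type memory with Hebbian storage and synchronous recall of bipolar patterns; the pattern size and the stored "reference" patterns are invented for the example and bear no relation to the actual image pipeline.

      import numpy as np

      class HopfieldMemory:
          # Tiny Hopfield associative memory over bipolar (+1/-1) patterns.

          def __init__(self, patterns):
              patterns = np.asarray(patterns, dtype=float)
              n = patterns.shape[1]
              # Hebbian weight matrix with zero diagonal
              self.W = (patterns.T @ patterns) / n
              np.fill_diagonal(self.W, 0.0)

          def recall(self, probe, steps=10):
              s = np.asarray(probe, dtype=float)
              for _ in range(steps):
                  s = np.sign(self.W @ s)
                  s[s == 0] = 1.0
              return s

      # Two stored 8-pixel reference patterns (e.g., "conforming" vs "defective" templates)
      stored = [[1, 1, 1, 1, -1, -1, -1, -1],
                [1, -1, 1, -1, 1, -1, 1, -1]]
      mem = HopfieldMemory(stored)

      noisy = [1, 1, -1, 1, -1, -1, -1, -1]   # first pattern with one flipped pixel
      print(mem.recall(noisy))                # converges back to the stored pattern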

  8. Automating expert role to determine design concept in Kansei Engineering

    NASA Astrophysics Data System (ADS)

    Lokman, Anitawati Mohd; Haron, Mohammad Bakri Che; Abidin, Siti Zaleha Zainal; Khalid, Noor Elaiza Abd

    2016-02-01

    Affect has become imperative in product quality. In the affective design field, Kansei Engineering (KE) has been recognized as a technology that enables the discovery of consumers' emotions and the formulation of guidance for designing products that win consumers in the competitive market. Albeit a powerful technology, there is no rule of thumb in its analysis and interpretation process, and KE expertise is required to determine sets of related Kansei and the significant concept of emotion. Many research endeavors are hampered by the limited number of available and accessible KE experts. This work simulates the role of experts with the use of the Natphoric algorithm, thus providing a sound solution to the complexity and flexibility of KE. The algorithm is designed to learn the process from training datasets taken from previous KE research works. A framework for automated KE is then designed to realize the development of an automated KE system. A comparative analysis is performed to determine the feasibility of the developed prototype for automating the process. The results show that the significant Kansei determined by manual KE implementation and by the automated process are highly similar. KE research advocates will benefit from this system to automatically determine significant design concepts.

  9. Tuning coercive force by adjusting electric potential in solution processed Co/Pt(111) and the mechanism involved

    PubMed Central

    Chang, Cheng-Hsun-Tony; Kuo, Wei-Hsu; Chang, Yu-Chieh; Tsay, Jyh-Shen; Yau, Shueh-Lin

    2017-01-01

    A combination of a solution process and the control of the electric potential for magnetism represents a new approach to operating spintronic devices with highly controlled efficiency, lower power consumption and reduced production cost. As a paradigmatic example, we investigated Co/Pt(111) in the Bloch-wall regime. A depression in coercive force was detected upon applying a negative electric potential in an electrolytic solution. The reversible control of coercive force by varying the electric potential within a few hundred millivolts is demonstrated. By changing the electric potential in ferromagnetic layers with smaller thicknesses, the efficiency of controlling the tunable coercive force becomes higher. Assuming that the pinning domains are independent of the applied electric potential, an electric-potential-tuning magnetic anisotropy energy model was derived, providing insight into the relation between the electric-potential-tuned coercive force and the thickness of the ferromagnetic layer. Based on the fact that the coercive force can be tuned by changing the electric potential using a solution process, we developed a novel concept of electric-potential-tuned magnetic recording, resulting in a stable recording medium with a high degree of writing ability. PMID:28255160

  10. Two-color holography concept (T-CHI)

    NASA Technical Reports Server (NTRS)

    Vikram, C. S.; Caulfield, H. J.; Workman, G. L.; Trolinger, J. D.; Wood, C. P.; Clark, R. L.; Kathman, A. D.; Ruggiero, R. M.

    1990-01-01

    The Materials Processing in Space Program of NASA-MSFC was active in developing numerous optical techniques for the characterization of fluids in the vicinity of various materials during crystallization and/or solidification. Two-color holographic interferometry (T-CHI) demonstrates that the separation of temperature and concentration in transparent model systems is possible. The experiments were performed for particular (succinonitrile) systems. Several solutions are possible in Microgravity Sciences and Applications (MSA) experiments on future Shuttle missions. The theory of the T-CHI concept is evaluated. Although particular cases are used for explanations, the concepts developed are universal. A breadboard system design is also presented for ultimate fabrication and testing of theoretical findings. New developments in holography involving optical fibers and diode lasers are also incorporated.

  11. Compact Fuel-Cell System Would Consume Neat Methanol

    NASA Technical Reports Server (NTRS)

    Narayanan, Sekharipuram; Kindler, Andrew; Valdez, Thomas

    2007-01-01

    In a proposed direct methanol fuel-cell electric-power-generating system, the fuel cells would consume neat methanol, in contradistinction to the dilute aqueous methanol solutions consumed in prior direct methanol fuel-cell systems. The design concept of the proposed fuel-cell system takes advantage of (1) electro-osmotic drag and diffusion processes to manage the flows of hydrogen and water between the anode and the cathode and (2) evaporative cooling for regulating temperature. The design concept provides for supplying enough water to the anodes to enable the use of neat methanol while ensuring conservation of water for the whole fuel-cell system.

  12. Approach to an Affordable and Sustainable Space Transportation System

    NASA Technical Reports Server (NTRS)

    McCleskey, Carey M.; Rhodes, R. E.; Robinson, J. W.; Henderson, E. M.

    2012-01-01

    This paper describes an approach and a general procedure for creating space transportation architectural concepts that are at once affordable and sustainable. Previous papers by the authors and other members of the Space Propulsion Synergy Team (SPST) focused on a functional system breakdown structure for an architecture and definition of high-payoff design techniques with a technology integration strategy. This paper follows up by using a structured process that derives architectural solutions focused on achieving life cycle affordability and sustainability. Further, the paper includes an example concept that integrates key design techniques discussed in previous papers.

  13. 'Beautiful' unconventional synthesis and processing technologies of superconductors and some other materials.

    PubMed

    Badica, Petre; Crisan, Adrian; Aldica, Gheorghe; Endo, Kazuhiro; Borodianska, Hanna; Togano, Kazumasa; Awaji, Satoshi; Watanabe, Kazuo; Sakka, Yoshio; Vasylkiv, Oleg

    2011-02-01

    Superconducting materials have contributed significantly to the development of modern materials science and engineering. Specific technological solutions for their synthesis and processing helped in understanding the principles and approaches to the design, fabrication and application of many other materials. In this review, we explore the bidirectional relationship between the general and particular synthesis concepts. The analysis is mostly based on our studies where some unconventional technologies were applied to different superconductors and some other materials. These technologies include spray-frozen freeze-drying, fast pyrolysis, field-assisted sintering (or spark plasma sintering), nanoblasting, processing in high magnetic fields, methods of control of supersaturation and migration during film growth, and mechanical treatments of composite wires. The analysis provides future research directions and some key elements to define the concept of 'beautiful' technology in materials science. It also reconfirms the key position and importance of superconductors in the development of new materials and unconventional synthesis approaches.

  14. Study of water recovery and solid waste processing for aerospace and domestic applications. Volume 2: Final report

    NASA Technical Reports Server (NTRS)

    Guarneri, C. A.; Reed, A.; Renman, R. E.

    1972-01-01

    The manner in which current and advanced technology can be applied to develop practical solutions to existing and emerging water supply and waste disposal problems is evaluated. An overview of water resource factors as they affect new community planning, and requirements imposed on residential waste treatment systems are presented. The results of equipment surveys contain information describing: commercially available devices and appliances designed to conserve water; devices and techniques for monitoring water quality and controlling back contamination; and advanced water and waste processing equipment. System concepts are developed and compared on the basis of current and projected costs. Economic evaluations are based on community populations of from 2,000 to 250,000. The most promising system concept is defined in sufficient depth to initiate detailed design.

  15. Concept maps: A tool for knowledge management and synthesis in web-based conversational learning.

    PubMed

    Joshi, Ankur; Singh, Satendra; Jaswal, Shivani; Badyal, Dinesh Kumar; Singh, Tejinder

    2016-01-01

    Web-based conversational learning provides an opportunity for shared knowledge base creation through collaboration and collective wisdom extraction. Usually, the amount of information generated in such forums is very large and multidimensional (in alignment with the desirable preconditions for constructivist knowledge creation), and sometimes the nature of the expected new information may not be anticipated in advance. Thus, concept maps (crafted from the constructed data) as "process summary" tools may be a solution to improve critical thinking and learning by making connections between the facts or knowledge shared by the participants during online discussion. This exploratory paper begins with the description of this innovation, tried on a web-based interacting platform (email list management software), FAIMER-Listserv, and the qualitative evidence generated through peer feedback. This process description is further supported by a theoretical construct which shows how social constructivism (inclusive of autonomy and complexity) affects conversational learning. The paper rationalizes the use of the concept map as a mid-summary tool for extracting information and making further sense of this apparent intricacy.

  16. From Metacognition to Practice Cognition: The DNP e-Portfolio to Promote Integrated Learning.

    PubMed

    Anderson, Kelley M; DesLauriers, Patricia; Horvath, Catherine H; Slota, Margaret; Farley, Jean Nelson

    2017-08-01

    Educating Doctor of Nursing Practice (DNP) students for an increasingly complex health care environment requires novel applications of learning concepts and technology. A deliberate and thoughtful process is required to integrate concepts of the DNP program into practice paradigm changes to subsequently improve students' abilities to innovate solutions to complex practice problems. The authors constructed or participated in electronic portfolio development inspired by theories of metacognition and integrated learning. The objective was to develop DNP student's reflection, integration of concepts, and technological capabilities to foster the deliberative competencies related to the DNP Essentials and the foundations of the DNP program. The pedagogical process demonstrates how e-portfolios adapted into the doctoral-level curriculum for DNP students can address the Essentials and foster the development of metacognitive capabilities, which translates into practice changes. The authors suggest that this pedagogical approach has the potential to optimize reflective and deliberative competencies among DNP students. [J Nurs Educ. 2017;56(8):497-500.]. Copyright 2017, SLACK Incorporated.

  17. Aquifer-yield continuum as a guide and typology for science-based groundwater management

    NASA Astrophysics Data System (ADS)

    Pierce, Suzanne A.; Sharp, John M.; Guillaume, Joseph H. A.; Mace, Robert E.; Eaton, David J.

    2013-03-01

    Groundwater availability is at the core of hydrogeology as a discipline and, simultaneously, the concept is the source of ambiguity for management and policy. Aquifer yield has undergone multiple definitions resulting in a range of scientific methods to calculate and model availability reflecting the complexity of combined scientific, management, policy, and stakeholder processes. The concept of an aquifer-yield continuum provides an approach to classify groundwater yields along a spectrum, from non-use through permissive sustained, sustainable, maximum sustained, safe, permissive mining to maximum mining yields, that builds on existing literature. Additionally, the aquifer-yield continuum provides a systems view of groundwater availability to integrate physical and social aspects in assessing management options across aquifer settings. Operational yield describes the candidate solutions for operational or technical implementation of policy, often relating to a consensus yield that incorporates human dimensions through participatory or adaptive governance processes. The concepts of operational and consensus yield address both the social and the technical nature of science-based groundwater management and governance.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cresap, D.A.; Halverson, D.S.

    In the Fluorinel Dissolution Process (FDP) upgrade, excess hydrofluoric acid in the dissolver product must be complexed with aluminum nitrate (ANN) to eliminate corrosion concerns, adjusted with nitrate to facilitate extraction, and diluted with water to ensure solution stability. This is currently accomplished via batch processing in large vessels. However, to accommodate increases in projected throughput and reduce water production in a cost-effective manner, a semi-continuous system (In-line Complexing (ILC)) has been developed. The major conclusions drawn from tests demonstrating the feasibility of this concept are given in this report.

  19. Integrated Technology Assessment Center (ITAC) Update

    NASA Technical Reports Server (NTRS)

    Taylor, J. L.; Neely, M. A.; Curran, F. M.; Christensen, E. R.; Escher, D.; Lovell, N.; Morris, Charles (Technical Monitor)

    2002-01-01

    The Integrated Technology Assessment Center (ITAC) has developed a flexible systems analysis framework to identify long-term technology needs, quantify payoffs for technology investments, and assess the progress of ASTP-sponsored technology programs in the hypersonics area. For this, ITAC has assembled an experienced team representing a broad sector of the aerospace community and developed a systematic assessment process complete with supporting tools. Concepts for transportation systems are selected based on relevance to the ASTP, and integrated concept models (ICMs) of these concepts are developed. Key technologies of interest are identified and projections are made of their characteristics with respect to their impacts on key aspects of the specific concepts of interest. Both the models and the technology projections are then fed into ITAC's probabilistic systems analysis framework in ModelCenter. This framework permits rapid sensitivity analysis, single-point design assessment, and a full probabilistic assessment of each concept with respect to both embedded and enhancing technologies. Probabilistic outputs are weighed against metrics of interest to the ASTP using a multivariate decision-making process to provide inputs for technology prioritization within the ASTP. The ITAC program is currently finishing the assessment of a two-stage-to-orbit (TSTO) rocket-based combined cycle (RBCC) concept and a TSTO turbine-based combined cycle (TBCC) concept developed by the team with inputs from NASA. A baseline all-rocket TSTO concept is also being developed for comparison. Boeing has recently submitted a performance model for their Flexible Aerospace System Solution for Tomorrow (FASST) concept, and the ISAT program will provide inputs for a single-stage-to-orbit (SSTO) TBCC-based concept in the near term. Both of these latter concepts will be analyzed within the ITAC framework over the summer. This paper provides a status update of the ITAC program.

  20. [Global MED-NET].

    PubMed

    Slatina, E

    2000-01-01

    This poster shows a concept as well as some practical solutions for the database intended to be used by all members and institutions of the Emergency Medical Care international institutions network. It shows how data are processed automatically, rapidly and with good quality, while at the same time lives, time and money are saved.

  1. Group Solutions, Too! More Cooperative Logic Activities for Grades K-4. Teacher's Guide. LHS GEMS.

    ERIC Educational Resources Information Center

    Goodman, Jan M.; Kopp, Jaine

    There is evidence that structured cooperative logic is an effective way to introduce or reinforce mathematics concepts, explore thinking processes basic to both math and science, and develop the important social skills of cooperative problem-solving. This book contains a number of cooperative logic activities for grades K-4 in order to improve…

  2. Behavioral Reference Model for Pervasive Healthcare Systems.

    PubMed

    Tahmasbi, Arezoo; Adabi, Sahar; Rezaee, Ali

    2016-12-01

    The emergence of mobile healthcare systems is an important outcome of the application of pervasive computing concepts for medical care purposes. These systems provide the facilities and infrastructure required for automatic and ubiquitous sharing of medical information. Healthcare systems have a dynamic structure and configuration; therefore, having an architecture is essential for the future development of these systems. The need for increased response rates, the problem of limited storage, the demand for accelerated processing, and the tendency toward creating a new generation of healthcare system architectures highlight the need for further focus on cloud-based solutions for data transfer and data processing challenges. Integrity and reliability of healthcare systems are of critical importance, as even the slightest error may put patients' lives in danger; therefore, acquiring a behavioral model for these systems and developing the tools required to model their behaviors are of significant importance. High-level designs may contain flaws, so the system must be fully examined for different scenarios and conditions. This paper presents a software architecture for the development of healthcare systems based on pervasive computing concepts, and then models the behavior of the described system. A set of solutions is then proposed to improve the design's qualitative characteristics, including availability, interoperability and performance.

  3. A new approach to the concept of "relevance" in information retrieval (IR).

    PubMed

    Kagolovsky, Y; Möhr, J R

    2001-01-01

    The concept of "relevance" is the fundamental concept of information science in general and information retrieval, in particular. Although "relevance" is extensively used in evaluation of information retrieval, there are considerable problems associated with reaching an agreement on its definition, meaning, evaluation, and application in information retrieval. There are a number of different views on "relevance" and its use for evaluation. Based on a review of the literature the main problems associated with the concept of "relevance" in information retrieval are identified. The authors argue that the proposal for the solution of the problems can be based on the conceptual IR framework built using a systems analytic approach to IR. Using this framework different kinds of "relevance" relationships in the IR process are identified, and a methodology for evaluation of "relevance" based on methods of semantics capturing and comparison is proposed.

  4. Quantifying solute transport processes: are chemically "conservative" tracers electrically conservative?

    USGS Publications Warehouse

    Singha, Kamini; Li, Li; Day-Lewis, Frederick D.; Regberg, Aaron B.

    2012-01-01

    The concept of a nonreactive or conservative tracer, commonly invoked in investigations of solute transport, requires additional study in the context of electrical geophysical monitoring. Tracers that are commonly considered conservative may undergo reactive processes, such as ion exchange, thus changing the aqueous composition of the system. As a result, the measured electrical conductivity may reflect not only solute transport but also reactive processes. We have evaluated the impacts of ion exchange reactions, rate-limited mass transfer, and surface conduction on quantifying tracer mass, mean arrival time, and temporal variance in laboratory-scale column experiments. Numerical examples showed that (1) ion exchange can lead to resistivity-estimated tracer mass, velocity, and dispersivity that may be inaccurate; (2) mass transfer leads to an overestimate in the mobile tracer mass and an underestimate in velocity when using electrical methods; and (3) surface conductance does not notably affect estimated moments when high-concentration tracers are used, although this phenomenon may be important at low concentrations or in sediments with high and/or spatially variable cation-exchange capacity. In all cases, colocated groundwater concentration measurements are of high importance for interpreting geophysical data with respect to the controlling transport processes of interest.
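
    A minimal sketch of the temporal-moment quantities this record compares (tracer mass, mean arrival time and temporal variance) computed from a breakthrough curve by trapezoidal integration; the concentration series is synthetic and any column parameters implied in the comments are hypothetical.

      import numpy as np

      def trapezoid(y, x):
          # Simple trapezoidal rule, kept explicit to avoid NumPy version differences.
          return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

      def temporal_moments(t, c):
          # Zeroth moment (proportional to recovered mass), mean arrival time, temporal variance.
          m0 = trapezoid(c, t)
          t_mean = trapezoid(t * c, t) / m0
          t_var = trapezoid((t - t_mean) ** 2 * c, t) / m0
          return m0, t_mean, t_var

      # Synthetic breakthrough curve: Gaussian pulse centred at t = 50 minutes
      t = np.linspace(0.0, 150.0, 601)
      c = np.exp(-0.5 * ((t - 50.0) / 8.0) ** 2)

      m0, t_mean, t_var = temporal_moments(t, c)
      print(f"m0 = {m0:.2f}, mean arrival = {t_mean:.1f} min, variance = {t_var:.1f} min^2")
      # For a column of length L, an apparent velocity would be L / t_mean,
      # and the spreading (dispersion) scales with t_var.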

  5. Utilization of Aluminum Waste with Hydrogen and Heat Generation

    NASA Astrophysics Data System (ADS)

    Buryakovskaya, O. A.; Meshkov, E. A.; Vlaskin, M. S.; Shkolnokov, E. I.; Zhuk, A. Z.

    2017-10-01

    A concept of energy generation via hydrogen and heat production from aluminum-containing wastes is proposed. The hydrogen obtained by the oxidation reaction between aluminum waste and aqueous solutions can be supplied to fuel cells and/or infrared heaters for electricity or heat generation in the region of waste recycling. The heat released during the reaction can also be used effectively. The proposed method of aluminum waste recycling may represent a promising and cost-effective solution in cases where waste transportation to recycling plants involves significant financial losses (e.g. remote areas). Experiments with mechanically dispersed aluminum cans demonstrated that the reaction rate in alkaline solution is high enough for practical use of the oxidation process. In the experiments, aluminum oxidation proceeds without any additional aluminum activation.
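
    As a back-of-the-envelope check on the hydrogen yield implied by alkaline aluminum oxidation, the snippet below evaluates the standard overall reaction 2Al + 2NaOH + 2H2O -> 2NaAlO2 + 3H2; the numbers are textbook stoichiometry, not measurements from the cited experiments.

      # Hydrogen yield per kilogram of aluminum for 2Al + 2NaOH + 2H2O -> 2NaAlO2 + 3H2
      M_AL = 26.98            # g/mol
      M_H2 = 2.016            # g/mol
      MOLAR_VOLUME = 22.414   # L/mol at 0 degC and 1 atm

      mol_al = 1000.0 / M_AL          # moles of Al in 1 kg
      mol_h2 = 1.5 * mol_al           # 3 mol H2 per 2 mol Al
      mass_h2_kg = mol_h2 * M_H2 / 1000.0
      volume_h2_m3 = mol_h2 * MOLAR_VOLUME / 1000.0

      print(f"{mass_h2_kg:.3f} kg H2 (about {volume_h2_m3:.2f} Nm3) per kg Al")
      # Roughly 0.11 kg (about 1.25 normal cubic metres) of hydrogen per kilogram of aluminum,
      # before any practical losses.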

  6. The greenhouse gas and energy balance of different treatment concepts for bio-waste.

    PubMed

    Ortner, Maria E; Müller, Wolfgang; Bockreis, Anke

    2013-10-01

    The greenhouse gas (GHG) and energy performance of bio-waste treatment plants has been investigated for three characteristic bio-waste treatment concepts: composting; biological drying for the production of biomass fuel fractions; and anaerobic digestion. Compared with other studies about the environmental impacts of bio-waste management, this study focused on the direct comparison of the latest process concepts and state-of-the-art emission control measures. To enable a comparison, the mass balance and products were modelled for all process concepts assuming the same bio-waste amounts and properties. In addition, the value of compost as a soil improver was included in the evaluation, using straw as a reference system. This aspect has rarely been accounted for in other studies. The study is based on data from operational facilities combined with literature data. The results show that all three concepts contribute to a reduction of GHG emissions and show a positive balance for cumulated energy demand. However, in contrast to other studies, the advantage of anaerobic digestion compared with composting is smaller as a result of accounting for the soil-improving properties of compost. Still, anaerobic digestion is the environmentally superior solution. The results are intended to inform decision makers about the relevant aspects of bio-waste treatment regarding the environmental impacts of different bio-waste management strategies.

  7. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, F.

    2016-12-01

    Extreme weather conditions during the years 2009 - 2011 in combination with changes in the regional water management and possible impacts of climate change led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive water logging, especially in urban areas near rivers, where it produced huge problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget - groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with very intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations in consideration of sustainable and environmentally sound solutions, mainly on the basis of passive measures.

  8. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, Frido; Kramer, Stefanie; Koch, Thomas; Seifert, Sven; Monninkhoff, Bertram; Pfützner, Bernd

    2017-04-01

    Extreme weather conditions during the years 2009 - 2011 in combination with changes in the regional water management and possible impacts of climate change led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive water logging, especially in urban areas near rivers, where it produced huge problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget - groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with very intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations in consideration of sustainable and environmentally sound solutions, mainly on the basis of passive measures.

  9. [Locus of control and self-concept in interpersonal conflict resolution approaches].

    PubMed

    Hisli Sahin, Nesrin; Basim, H Nejat; Cetin, Fatih

    2009-01-01

    The purpose of this study was to investigate the relationship between self-concept and locus of control in interpersonal conflict resolution approaches and to determine the predictors of conflict resolution approach choices. The study included 345 students aged between 18 and 28 years that were studying at universities in Ankara. Data were collected using the Interpersonal Conflict Resolution Approaches Scale to measure conflict resolution approaches, the Social Comparison Scale to measure self-concept, and the Internal-External Locus of Control Scale to measure locus of control. It was observed that the confrontation approach to interpersonal conflict was predicted by self-concept (beta = 0.396, P < 0.001). Moreover, self-concept was related to the self-disclosure (beta = 0.180, P < 0.01) and emotional expression (beta = 0.196, P < 0.001) approaches. Locus of control played a role in the choice of all resolution approaches. In addition to these findings, it was observed that females used self-disclosure (beta = -0.163, P < 0.01) and emotional expression (beta = -0.219, P < 0.001), while males used the approach (beta = 0.395, P < 0.001) and public behavior (beta = 0.270, P < 0.001) approaches in the resolution process. Self-concept and locus of control were related to the behaviors adopted in the interpersonal conflict resolution process. Individuals with a positive self-concept and an internal locus of control adopted more effective and constructive approaches to interpersonal conflict resolution.

  10. [Patient-centered approaches to understanding, transformation and solution of team conflicts in the psychiatric clinic within the scope of the Balint group concept].

    PubMed

    Drees, A

    1987-08-01

    The working climate and therapeutic possibilities in a hospital are determined, among other factors, by emotional processes in everyday ward routine. Team conflicts and their solution are not infrequently reflections of the open-mindedness of a hospital towards the complexity of these processes. However, the complex interlocking of transference processes with role-specific and personality-conditioned behaviour patterns makes it more difficult to understand and make use of these emotional processes within the team. We present a specific attempt at working through emotional conflicts in a patient-centred approach by focusing on the team workers' self-ratings of mood, feeling tone and imagination. Specific internal Balint groups are the fulcrum. To distinguish this method from the theory of object-directed transference of emotions and constructions of relations, the theoretical basis of this group method is seen in the systemic paradigm, with which patient-focused solution functions are obtained with respect to process orientation and the instrumental part functions of the team workers. In this connection it was explored to what extent the following factors can be interpreted as patient-induced phenomena: therapeutic and role behaviour, hospital structures and administrative squabbles, and the internal and external walls of a mental hospital.

  11. Identifying Key Issues and Potential Solutions for Integrated Arrival, Departure, Surface Operations by Surveying Stakeholder Preferences

    NASA Technical Reports Server (NTRS)

    Aponso, Bimal; Coppenbarger, Richard A.; Jung, Yoon; Quon, Leighton; Lohr, Gary; O’Connor, Neil; Engelland, Shawn

    2015-01-01

    NASA's Aeronautics Research Mission Directorate (ARMD) collaborates with the FAA and industry to provide concepts and technologies that enhance the transition to the next-generation air-traffic management system (NextGen). To facilitate this collaboration, ARMD has a series of Airspace Technology Demonstration (ATD) sub-projects that develop, demonstrate, and transition NASA technologies and concepts for implementation in the National Airspace System (NAS). The second of these sub-projects, ATD-2, is focused on the potential benefits to NAS stakeholders of integrated arrival, departure, surface (IADS) operations. To determine the project objectives and assess the benefits of a potential solution, NASA surveyed NAS stakeholders to understand the existing issues in arrival, departure, and surface operations, and the perceived benefits of better integrating these operations. NASA surveyed a broad cross-section of stakeholders representing the airlines, airports, air-navigation service providers, and industry providers of NAS tools. The survey indicated that improving the predictability of flight times (schedules) could improve efficiency in arrival, departure, and surface operations. Stakeholders also mentioned the need for better strategic and tactical information on traffic constraints as well as better information sharing and a coupled collaborative planning process that allows stakeholders to coordinate IADS operations. To assess the impact of a potential solution, NASA sketched an initial departure scheduling concept and assessed its viability by surveying a select group of stakeholders a second time. The objective of the departure scheduler was to enable flights to move continuously from gate to cruise with minimal interruption in a busy metroplex airspace environment, using strategic and tactical scheduling enhanced by collaborative planning between airlines and service providers. The stakeholders agreed that this departure concept could improve schedule predictability and suggested several key attributes that were necessary to make the concept successful. The goals and objectives of the planned ATD-2 sub-project will incorporate the results of this stakeholder feedback.

  12. Facilitating the Concept of Universal Design Among Design Students - Changes in Teaching in the Last Decade.

    PubMed

    Vavik, Tom

    2016-01-01

    This short paper describes and reflects on how the teaching of the concept of Universal Design (UD) has developed over the last decade at the Institute of Design at the Oslo School of Architecture and Design (AHO). Four main changes are described. Firstly, the curriculum has evolved from teaching guidelines and principles to focusing on design processes. Secondly, an increased emphasis is put on cognitive accessibility. Thirdly, non-stigmatizing aesthetic expressions and solutions that communicate through different senses have become more important subjects. Fourthly, the teaching of UD has moved from the second-year to the first-year curriculum.

  13. Healthcare information system approaches based on middleware concepts.

    PubMed

    Holena, M; Blobel, B

    1997-01-01

    To meet the challenges for efficient and high-level quality, health care systems must implement the "Shared Care" paradigm of distributed co-operating systems. To this end, both the newly developed and legacy applications must be fully integrated into the care process. These requirements can be fulfilled by information systems based on middleware concepts. In the paper, the middleware approaches HL7, DHE, and CORBA are described. The relevance of those approaches to the healthcare domain is documented. The description presented here is complemented through two other papers in this volume, concentrating on the evaluation of the approaches, and on their security threats and solutions.

  14. Eco-innovative design approach: Integrating quality and environmental aspects in prioritizing and solving engineering problems

    NASA Astrophysics Data System (ADS)

    Chakroun, Mahmoud; Gogu, Grigore; Pacaud, Thomas; Thirion, François

    2014-09-01

    This study proposes an eco-innovative design process taking into consideration quality and environmental aspects in prioritizing and solving technical engineering problems. The approach provides a synergy between Life Cycle Assessment (LCA), the non-quality matrix, the Theory of Inventive Problem Solving (TRIZ), morphological analysis and the Analytical Hierarchy Process (AHP). In this sequence of tools, LCA assesses the environmental impacts generated by the system. Then, for a better consideration of environmental aspects, a new tool is developed, the non-quality matrix, which defines the problem to be solved first from an environmental point of view. The TRIZ method allows the generation of new concepts and the resolution of contradictions. The morphological analysis then offers the possibility of systematically extending the search space of solutions in a design problem. Finally, the AHP identifies the promising solution(s) by providing a clear logic for the choice made. The usefulness of these tools has been demonstrated through their application to a case study involving a centrifugal spreader with spinning discs.
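
    A minimal sketch of the AHP step named in this record, deriving priority weights from a pairwise comparison matrix via the common geometric-mean approximation; the three candidate concepts and the comparison judgments are hypothetical.

      import numpy as np

      def ahp_weights(pairwise):
          # Priority vector of an AHP pairwise comparison matrix (row geometric-mean method).
          A = np.asarray(pairwise, dtype=float)
          gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
          return gm / gm.sum()

      # Hypothetical comparison of three candidate spreader concepts on a single criterion
      # (Saaty 1-9 scale; A[i][j] expresses how strongly concept i is preferred over concept j)
      A = [[1.0,   3.0,   5.0],
           [1/3.0, 1.0,   2.0],
           [1/5.0, 1/2.0, 1.0]]

      print(ahp_weights(A))   # roughly [0.65, 0.23, 0.12]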

  15. Unifying hydrotropy under Gibbs phase rule.

    PubMed

    Shimizu, Seishi; Matubayasi, Nobuyuki

    2017-09-13

    The task of elucidating the mechanism of solubility enhancement using hydrotropes has been hampered by the wide variety of phase behaviour that hydrotropes can exhibit, encompassing near-ideal aqueous solution, self-association, micelle formation, and micro-emulsions. Instead of taking a field guide or encyclopedic approach to classify hydrotropes into different molecular classes, we take a rational approach aiming at constructing a unified theory of hydrotropy based upon the first principles of statistical thermodynamics. Achieving this aim can be facilitated by the two key concepts: (1) the Gibbs phase rule as the basis of classifying the hydrotropes in terms of the degrees of freedom and the number of variables to modulate the solvation free energy; (2) the Kirkwood-Buff integrals to quantify the interactions between the species and their relative contributions to the process of solubilization. We demonstrate that the application of the two key concepts can in principle be used to distinguish the different molecular scenarios at work under apparently similar solubility curves observed from experiments. In addition, a generalization of our previous approach to solutes beyond dilution reveals the unified mechanism of hydrotropy, driven by a strong solute-hydrotrope interaction which overcomes the apparent per-hydrotrope inefficiency due to hydrotrope self-clustering.
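
    For reference, the Kirkwood-Buff integral invoked above is, in standard notation,

      G_{ij} \;=\; \int_{0}^{\infty} \bigl[\, g_{ij}(r) - 1 \,\bigr]\, 4\pi r^{2}\, \mathrm{d}r ,

    and the solubilization condition used in this kind of KB treatment can be written, with u the dilute solute, h the hydrotrope, w water and n_h the hydrotrope number density, approximately as

      -\left( \frac{\partial \mu_{u}^{*}}{\partial \mu_{h}} \right)_{T,P} \;\approx\; n_{h}\,\bigl( G_{uh} - G_{uw} \bigr) ,

    so that solubility increases with hydrotrope activity whenever the solute-hydrotrope integral exceeds the solute-water one (G_{uh} > G_{uw}); this is quoted here only as a sketch of the framework, not as a restatement of the paper's derivation.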

  16. Electronic Health Records: An Enhanced Security Paradigm to Preserve Patient's Privacy

    NASA Astrophysics Data System (ADS)

    Slamanig, Daniel; Stingl, Christian

    In recent years, demographic change and increasing treatment costs demand the adoption of more cost efficient, highly qualitative and integrated health care processes. The rapid growth and availability of the Internet facilitate the development of eHealth services and especially of electronic health records (EHRs) which are promising solutions to meet the aforementioned requirements. Considering actual web-based EHR systems, patient-centric and patient moderated approaches are widely deployed. Besides, there is an emerging market of so called personal health record platforms, e.g. Google Health. Both concepts provide a central and web-based access to highly sensitive medical data. Additionally, the fact that these systems may be hosted by not fully trustworthy providers necessitates to thoroughly consider privacy issues. In this paper we define security and privacy objectives that play an important role in context of web-based EHRs. Furthermore, we discuss deployed solutions as well as concepts proposed in the literature with respect to this objectives and point out several weaknesses. Finally, we introduce a system which overcomes the drawbacks of existing solutions by considering an holistic approach to preserve patient's privacy and discuss the applied methods.

  17. The distributed agent-based approach in the e-manufacturing environment

    NASA Astrophysics Data System (ADS)

    Sękala, A.; Kost, G.; Dobrzańska-Danikiewicz, A.; Banaś, W.; Foit, K.

    2015-11-01

    The lack of a coherent flow of information from the production department causes unplanned downtime and failures of machines and their equipment, which in turn means that the production planning process is based on incorrect and out-of-date information. All of these factors create additional difficulties in decision-making. They concern, among others, the coordination of the components of a distributed system and the provision of access to required information, thereby generating unnecessary costs. The use of agent technology significantly speeds up the flow of information within a virtual enterprise. This paper proposes a multi-agent approach to the integration of processes within the virtual enterprise concept. The presented concept was elaborated to investigate possible ways of transmitting information in a production system, taking into account the self-organization of its constituent components. It therefore links the concept of a multi-agent system with a production-information management system based on the idea of e-manufacturing. The paper presents the resulting scheme, which is intended as the basis for an informatics model of the target virtual system; the computer system itself is to be developed next.

  18. Time-to-impact sensors in robot vision applications based on the near-sensor image processing concept

    NASA Astrophysics Data System (ADS)

    Åström, Anders; Forchheimer, Robert

    2012-03-01

    Based on the Near-Sensor Image Processing (NSIP) concept and recent results concerning optical flow and Time-to-Impact (TTI) computation with this architecture, we show how these results can be used and extended for robot vision applications. The first case involves estimating the tilt of an approaching planar surface. The second case concerns the use of two NSIP cameras to estimate absolute distance and speed, similar to a stereo-matching system but without the need for image correlations. Returning to a one-camera system, the third case deals with the problem of estimating the shape of the approaching surface. It is shown that the previously developed TTI method not only gives a very compact solution with respect to hardware complexity but also offers surprisingly high performance.

  19. Too Much Fun for Therapy: Therapeutic Recreation as an Intervention Tool with At-Risk Youth. A Series of Solutions and Strategies.

    ERIC Educational Resources Information Center

    Brooks, Katherine Walker

    This publication introduces the concept of therapeutic recreation (TR), illustrating its natural fit into the educational process and its use with at-risk students, and providing resources for further use. Section 1 examines what places a child at risk, focusing on educational goals, student behaviors, and home life. Section 2 defines TR as a…

  20. Mediation as a Leadership Strategy to Deal with Conflict in Schools

    ERIC Educational Resources Information Center

    Ntho-Ntho, Maitumeleng Albertina; Nieuwenhuis, Frederik Jan

    2016-01-01

    Mediation is a process frequently used in the labour field but under-developed in fields such as education. Mediation as a strategy for resolving conflict amicably has gained support in various other fields but does not seem to be regarded as a mainstream solution for resolving conflict in education. This article reports…

  1. Designing Interactive Electronic Module in Chemistry Lessons

    NASA Astrophysics Data System (ADS)

    Irwansyah, F. S.; Lubab, I.; Farida, I.; Ramdhani, M. A.

    2017-09-01

    This research aims to design an electronic module (e-module) oriented toward the development of students' chemical literacy on solution colligative properties material. The research proceeded through several stages, including concept analysis, discourse analysis, storyboard design, design development, product packaging, validation, and a feasibility test. Overall, it comprised three main stages: Define (a preliminary study), Design (designing the e-module), and Develop (validation and model trial). The concept presentation and visualization used in this e-module are oriented toward chemical literacy skills, and the presentation order covers the aspects of scientific context, process, content, and attitude. Chemistry and multimedia experts validated the initial quality of the product and gave feedback for its improvement. The feasibility test results showed that the content presentation and display are valid and feasible for use, with values of 85.77% and 87.94%, respectively. These values indicate that this e-module, oriented toward students' chemical literacy skills for solution colligative properties material, is feasible for use.

  2. Metaphor and analogy in everyday problem solving.

    PubMed

    Keefer, Lucas A; Landau, Mark J

    2016-11-01

    Early accounts of problem solving focused on the ways people represent information directly related to target problems and possible solutions. Subsequent theory and research point to the role of peripheral influences such as heuristics and bodily states. We discuss how metaphor and analogy similarly influence stages of everyday problem solving: Both processes mentally map features of a target problem onto the structure of a relatively more familiar concept. When individuals apply this structure, they use a well-known concept as a framework for reasoning about real world problems and candidate solutions. Early studies found that analogy use helped people gain insight into novel problems. More recent research on metaphor goes further to show that activating mappings has subtle, sometimes surprising effects on judgment and reasoning in everyday problem solving. These findings highlight situations in which mappings can help or hinder efforts to solve problems. WIREs Cogn Sci 2016, 7:394-405. doi: 10.1002/wcs.1407 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.

  3. Context-Aware Adaptive Hybrid Semantic Relatedness in Biomedical Science

    NASA Astrophysics Data System (ADS)

    Emadzadeh, Ehsan

    Text mining of biomedical literature and clinical notes is a very active field of research in biomedical science. Semantic analysis is one of the core modules of many Natural Language Processing (NLP) solutions. Methods for calculating the semantic relatedness of two concepts can be very useful in solutions to problems such as relationship extraction, ontology creation and question answering [1-6]. Several techniques exist for calculating the semantic relatedness of two concepts, utilizing different knowledge sources and corpora. So far, researchers have attempted to find the best hybrid method for each domain by combining semantic relatedness techniques and data sources manually. This work attempts to eliminate the need for manually combining semantic relatedness methods for each new context or resource by proposing an automated method that finds the combination of semantic relatedness techniques and resources yielding the best semantic relatedness score in every context. This may help the research community find the best hybrid method for each context given the available algorithms and resources.
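
    As a rough illustration of the kind of automated combination described (the thesis's actual algorithm is not given in the abstract), the sketch below weights several relatedness scorers and keeps the convex combination that best correlates with a small gold-standard sample for the target context; all scores and names are hypothetical.

      import itertools
      import numpy as np

      def best_combination(score_matrix, gold, step=0.25):
          """score_matrix: (n_pairs, n_methods) relatedness scores; gold: (n_pairs,) reference."""
          n_methods = score_matrix.shape[1]
          grid = np.arange(0.0, 1.0 + 1e-9, step)
          best_w, best_r = None, -np.inf
          for w in itertools.product(grid, repeat=n_methods):
              if not np.isclose(sum(w), 1.0):          # restrict to convex combinations
                  continue
              combined = score_matrix @ np.array(w)
              r = np.corrcoef(combined, gold)[0, 1]    # agreement with the gold standard
              if r > best_r:
                  best_w, best_r = w, r
          return best_w, best_r

      # Hypothetical scores from three methods (e.g. path-based, corpus-based, gloss-based).
      scores = np.array([[0.9, 0.7, 0.8], [0.2, 0.4, 0.1], [0.6, 0.5, 0.7], [0.1, 0.2, 0.3]])
      gold = np.array([0.95, 0.20, 0.60, 0.10])
      print(best_combination(scores, gold))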

  4. Towards efficient next generation light sources: combined solution processed and evaporated layers for OLEDs

    NASA Astrophysics Data System (ADS)

    Hartmann, D.; Sarfert, W.; Meier, S.; Bolink, H.; García Santamaría, S.; Wecker, J.

    2010-05-01

    Highly efficient OLED device structures are typically based on a multitude of stacked thin organic layers prepared by thermal evaporation. For lighting applications these efficient device stacks have to be scaled up to large areas, which is clearly challenging in terms of high-throughput processing at low cost. One promising approach for meeting cost efficiency, high throughput and high light output is the combination of solution and evaporation processing. The objective is to substitute as many thermally evaporated layers as possible by solution processing without sacrificing device performance. Hence, starting from the anode side, evaporated layers of an efficient white-light-emitting OLED stack are stepwise replaced by solution-processable polymer and small-molecule layers. In doing so, different solution-processable hole injection layers (polymer HILs) are integrated into small-molecule devices and evaluated with regard to their electro-optical performance as well as their planarizing properties, i.e. the ability to cover ITO spikes, defects and dust particles. Two approaches are followed: in the "single HIL" approach only one polymer HIL is coated, whereas in the "combined HIL" concept the coated polymer HIL is combined with a thin evaporated HIL. These HIL architectures are studied in unipolar as well as bipolar devices. The combined HIL approach provides better control over the hole current, improved device stability, and improved current and power efficiency compared with a single HIL and with purely small-molecule-based OLED stacks. Furthermore, emitting layers based on guest/host small molecules are fabricated from solution and integrated into a white hybrid stack (WHS). Up to three evaporated layers were successfully replaced by solution processing, showing white-light emission spectra comparable to an evaporated small-molecule reference stack and lifetime values of several hundred hours.

  5. Visualization of multi-INT fusion data using Java Viewer (JVIEW)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Aved, Alex; Nagy, James; Scott, Stephen

    2014-05-01

    Visualization is important for multi-intelligence fusion, and we demonstrate issues in presenting physics-derived (i.e., hard) and human-derived (i.e., soft) fusion results. Physics-derived solutions (e.g., imagery) typically involve sensor measurements that are objective, while human-derived solutions (e.g., text) typically involve language processing. Both kinds of results can be displayed geographically for user-machine fusion. Attributes of an effective and efficient display are not well understood, so we demonstrate issues and results for filtering, correlation, and association of data for users, be they operators or analysts. Operators require near-real-time solutions, while analysts have the opportunity to use non-real-time solutions for forensic analysis. In a use case, we demonstrate examples using the JVIEW concept, which has been applied to piloting, space situation awareness, and cyber analysis. Using the open-source JVIEW software, we showcase a big data solution for a multi-intelligence fusion application for context-enhanced information fusion.

  6. The effectiveness of sodium hydroxide (NaOH) and sodium carbonate (Na2CO3) on the impurities removal of saturated salt solution

    NASA Astrophysics Data System (ADS)

    Pujiastuti, C.; Ngatilah, Y.; Sumada, K.; Muljani, S.

    2018-01-01

    The quality of salt can be improved through various methods, such as washing (hydro-extraction), re-crystallization, ion exchange and others. In the re-crystallization method, the salt product is dissolved in water to form a saturated solution and re-crystallized by heating; the quality of the salt produced is influenced by the quality of the dissolved salt and the crystallization mechanism applied. This research proposes a concept in which, before the saturated salt solution is recrystallized, a chemical reagent is added to remove the impurities contained in the saturated salt solution, such as magnesium (Mg), calcium (Ca), potassium (K) and sulfate (SO4) ions. The chemical reagents used are sodium hydroxide (NaOH) 2 N and sodium carbonate (Na2CO3) 2 N. This research aims to study the effectiveness of sodium hydroxide and sodium carbonate in removing magnesium (Mg), calcium (Ca), potassium (K) and sulfate (SO4) impurities. The results showed that the addition of sodium hydroxide solution decreased magnesium ions (Mg) by 95.2% and calcium ions (Ca) by 45%, while the addition of sodium carbonate solution decreased magnesium ions (Mg) by 66.67% and calcium ions (Ca) by 77.5%; however, neither reagent removed sulfate ions (SO4). The sodium hydroxide solution is more effective at decreasing magnesium ions than the sodium carbonate solution, and the sodium carbonate solution is more effective at decreasing calcium ions than the sodium hydroxide solution.

  7. A classical view on nonclassical nucleation.

    PubMed

    Smeets, Paul J M; Finney, Aaron R; Habraken, Wouter J E M; Nudelman, Fabio; Friedrich, Heiner; Laven, Jozua; De Yoreo, James J; Rodger, P Mark; Sommerdijk, Nico A J M

    2017-09-19

    Understanding and controlling nucleation is important for many crystallization applications. Calcium carbonate (CaCO3) is often used as a model system to investigate nucleation mechanisms. Despite its great importance in geology, biology, and many industrial applications, CaCO3 nucleation is still a topic of intense discussion, with new pathways for its growth from ions in solution proposed in recent years. These new pathways include the so-called nonclassical nucleation mechanism via the assembly of thermodynamically stable prenucleation clusters, as well as the formation of a dense liquid precursor phase via liquid-liquid phase separation. Here, we present results from a combined experimental and computational investigation on the precipitation of CaCO3 in dilute aqueous solutions. We propose that a dense liquid phase (containing 4-7 H2O per CaCO3 unit) forms in supersaturated solutions through the association of ions and ion pairs without significant participation of larger ion clusters. This liquid acts as the precursor for the formation of solid CaCO3 in the form of vaterite, which grows via a net transfer of ions from solution according to z Ca2+ + z CO32- → z CaCO3. The results show that all steps in this process can be explained according to classical concepts of crystal nucleation and growth, and that long-standing physical concepts of nucleation can describe multistep, multiphase growth mechanisms.

  8. America's credentialing crisis: a review of its origins and attempts at solutions.

    PubMed

    Lewis-Jenkins, G

    2000-01-01

    Physicians and their office staffs need no introduction to the concept of practitioner credentialing. They know all too well the inefficiencies, redundancies, and frustrations that typically accompany the process. They may not be as well informed, however, about the origins of the current credentialing crisis and its far-reaching effects on health care. This article supplies a context for understanding the origin and evolution of the crisis and examines the differing approaches health care organizations and state governments are taking to address it. It also provides a look at how one state and its health care community are attempting to create a homegrown, collaborative solution.

  9. Mental models: a basic concept for human factors design in infection prevention.

    PubMed

    Sax, H; Clack, L

    2015-04-01

    Much of the effort devoted to promoting better hand hygiene is based on the belief that poor hand hygiene reflects poor motivation. We argue, however, that automatic unconscious behaviour driven by 'mental models' is an important contributor to what actually happens. Mental models are concepts of reality--imaginary, often blurred, and sometimes unstable. Human beings use them to reduce mental load and free up capacity in the conscious mind to focus on deliberate activities. They are pragmatic solutions to the complexity of life. Knowledge of such mental processes helps healthcare designers and clinicians overcome barriers to behavioural change. This article reviews the concept of mental models and considers how it can be used to improve hand hygiene and patient safety. Copyright © 2015 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  10. Storage system architectures and their characteristics

    NASA Technical Reports Server (NTRS)

    Sarandrea, Bryan M.

    1993-01-01

    Not all users' storage requirements call for 20 MB/s data transfer rates, multi-tier file or data migration schemes, or even automated retrieval of data. The number of available storage solutions reflects the broad range of user requirements. It is foolish to think that any one solution can address the complete range of requirements. For users with simple off-line storage requirements, the cost and complexity of high-end solutions would provide no advantage over a simpler solution. The correct answer is to match the requirements of a particular storage need to the attributes of the available solutions. The goal of this paper is to introduce basic concepts of archiving and storage management in combination with the most common architectures and to provide some insight into how these concepts and architectures address various storage problems. The intent is to give potential consumers of storage technology a framework within which to begin the hunt for a solution that meets their particular needs. This paper is not an exhaustive study and does not address all possible solutions or new technologies; it is intended as a practical treatment of today's storage system alternatives. Since most commercial storage systems today are built on Open Systems concepts, the majority of these solutions are hosted on the UNIX operating system. For this reason, some of the architectural issues discussed focus on specific UNIX architectural concepts. However, most of the architectures are operating-system independent and the conclusions are applicable to such architectures on any operating system.

  11. An atomistic simulation scheme for modeling crystal formation from solution.

    PubMed

    Kawska, Agnieszka; Brickmann, Jürgen; Kniep, Rüdiger; Hochrein, Oliver; Zahn, Dirk

    2006-01-14

    We present an atomistic simulation scheme for investigating crystal growth from solution. Molecular-dynamics simulation studies of such processes typically suffer from considerable limitations concerning both system size and simulation times. In our method this time-length-scale problem is circumvented by an iterative scheme that combines a Monte Carlo-type approach for the identification of ion adsorption sites with, after each growth step, structural optimization of the ion cluster and the solvent by means of molecular-dynamics simulation runs. An important approximation of our method is the assumption of full structural relaxation of the aggregates between growth steps. This concept only holds for compounds of low solubility. To illustrate our method we studied CaF2 aggregate growth from aqueous solution, which may be taken as a prototype for compounds of very low solubility. The limitations of our simulation scheme are illustrated by the example of NaCl aggregation from aqueous solution, which corresponds to a solute/solvent combination of very high salt solubility.
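
    The iterative growth-plus-relaxation idea can be caricatured in a few dozen lines. The toy below is a sketch under strong assumptions (a single Lennard-Jones species stands in for the ions, the solvent is ignored, and crude steepest descent replaces the MD relaxation used in the paper): it alternates a Monte Carlo search for a favourable adsorption site with full relaxation of the aggregate before the next growth step.

      import numpy as np

      rng = np.random.default_rng(0)

      def energy(pos):
          """Total Lennard-Jones energy of the aggregate (reduced units)."""
          diff = pos[:, None, :] - pos[None, :, :]
          r = np.sqrt((diff ** 2).sum(-1))[np.triu_indices(len(pos), k=1)]
          return float(np.sum(4.0 * (r ** -12 - r ** -6)))

      def relax(pos, steps=300, lr=2e-3, h=1e-4):
          """Crude steepest descent on numerical gradients (stand-in for MD relaxation)."""
          pos = pos.copy()
          for _ in range(steps):
              g = np.zeros(pos.size)
              flat = pos.ravel()
              for i in range(flat.size):
                  plus, minus = flat.copy(), flat.copy()
                  plus[i] += h
                  minus[i] -= h
                  g[i] = (energy(plus.reshape(pos.shape)) - energy(minus.reshape(pos.shape))) / (2 * h)
              pos -= lr * g.reshape(pos.shape)
          return pos

      cluster = np.array([[0.0, 0.0, 0.0]])                   # seed "ion"
      for step in range(5):
          centre = cluster.mean(axis=0)
          shell = np.max(np.linalg.norm(cluster - centre, axis=1)) + 1.2
          dirs = rng.normal(size=(64, 3))
          dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
          candidates = centre + shell * dirs                  # Monte Carlo trial adsorption sites
          trial_E = [energy(np.vstack([cluster, c])) for c in candidates]
          best = candidates[int(np.argmin(trial_E))]          # keep the most favourable site
          cluster = relax(np.vstack([cluster, best]))         # relax fully before the next step
          print(len(cluster), "particles, E =", round(energy(cluster), 3))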

  12. Human face recognition using eigenface in cloud computing environment

    NASA Astrophysics Data System (ADS)

    Siregar, S. T. M.; Syahputra, M. F.; Rahmat, R. F.

    2018-02-01

    Recognizing a single face does not take long to process, but attendance or security systems in companies with many faces to recognize can take a long time. Cloud computing is a computing service performed not on a local device but on internet-connected data center infrastructure; it also provides scalability, since resources can be increased when larger amounts of data must be processed. In this research, face recognition is performed using the eigenface method, and the collection of training data uses the REST concept to provide resources, so that the server can process the data according to the defined stages. After the research and development of this application, it can be concluded that face recognition can be implemented by applying eigenface, with the REST concept used as the endpoint for sending and receiving the information that serves as a resource for building the model used in face recognition.
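
    A minimal eigenface pipeline, stripped of the cloud and REST layers described in the paper, can be sketched as follows (illustrative only; image sizes, component counts and the toy data are assumptions):

      import numpy as np

      def train_eigenfaces(faces, n_components=4):
          """faces: (n_samples, n_pixels) matrix of flattened training images."""
          mean = faces.mean(axis=0)
          centred = faces - mean
          _, _, vt = np.linalg.svd(centred, full_matrices=False)
          eigenfaces = vt[:n_components]               # principal directions ("eigenfaces")
          weights = centred @ eigenfaces.T             # projections of the training faces
          return mean, eigenfaces, weights

      def recognise(query, mean, eigenfaces, weights, labels):
          w = (query - mean) @ eigenfaces.T             # project the query into eigenspace
          return labels[int(np.argmin(np.linalg.norm(weights - w, axis=1)))]

      # Hypothetical tiny example: six random "faces" (64 pixels each) of three people.
      rng = np.random.default_rng(1)
      faces = rng.random((6, 64))
      labels = ["alice", "alice", "bob", "bob", "carol", "carol"]
      mean, ef, w = train_eigenfaces(faces)
      print(recognise(faces[2] + 0.01 * rng.random(64), mean, ef, w, labels))

    In a deployed attendance or security system the training and projection steps would run server-side, with clients exchanging images and labels over the REST endpoints the authors describe.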

  13. Mimicking electrodeposition in the gas phase: a programmable concept for selected-area fabrication of multimaterial nanostructures.

    PubMed

    Cole, Jesse J; Lin, En-Chiang; Barry, Chad R; Jacobs, Heiko O

    2010-05-21

    An in situ gas-phase process that produces charged streams of Au, Si, TiO(2), ZnO, and Ge nanoparticles/clusters is reported together with a programmable concept for selected-area assembly/printing of more than one material type. The gas-phase process mimics solution electrodeposition whereby ions in the liquid phase are replaced with charged clusters in the gas phase. The pressure range in which the analogy applies is discussed and it is demonstrated that particles can be plated into pores vertically (minimum resolution 60 nm) or laterally to form low-resistivity (48 microOmega cm) interconnects. The process is applied to the formation of multimaterial nanoparticle films and sensors. The system works at atmospheric pressure and deposits material at room temperature onto electrically biased substrate regions. The combination of pumpless operation and parallel nozzle-free deposition provides a scalable tool for printable flexible electronics and the capability to mix and match materials.

  14. Creating Dynamic Learning Environment to Enhance Students’ Engagement in Learning Geometry

    NASA Astrophysics Data System (ADS)

    Sariyasa

    2017-04-01

    Learning geometry gives many benefits to students. It strengthens the development of deductive thinking and reasoning; it also provides an opportunity to improve visualisation and spatial ability. Some studies, however, have pointed out the difficulties that students encounter when learning geometry. A preliminary study by the author in Bali revealed that one of the main problems was teachers' difficulty in delivering geometry instruction, partly due to the lack of appropriate instructional media. Coupled with dynamic geometry software, a dynamic learning environment is a promising solution to this problem. Employing GeoGebra software supported by a well-designed instructional process may result in more meaningful learning and, consequently, motivate students to engage in the learning process more deeply and actively. In this paper, we provide some examples of GeoGebra-aided learning activities that allow students to interactively explore and investigate geometry concepts and the properties of geometric objects. It is expected that such a learning environment will enhance students' internalisation of geometry concepts.

  15. Using decision-tree classifier systems to extract knowledge from databases

    NASA Technical Reports Server (NTRS)

    St.clair, D. C.; Sabharwal, C. L.; Hacke, Keith; Bond, W. E.

    1990-01-01

    One difficulty in applying artificial intelligence techniques to the solution of real world problems is that the development and maintenance of many AI systems, such as those used in diagnostics, require large amounts of human resources. At the same time, databases frequently exist which contain information about the process(es) of interest. Recently, efforts to reduce development and maintenance costs of AI systems have focused on using machine learning techniques to extract knowledge from existing databases. Research is described in the area of knowledge extraction using a class of machine learning techniques called decision-tree classifier systems. Results of this research suggest ways of performing knowledge extraction which may be applied in numerous situations. In addition, a measurement called the concept strength metric (CSM) is described which can be used to determine how well the resulting decision tree can differentiate between the concepts it has learned. The CSM can be used to determine whether or not additional knowledge needs to be extracted from the database. An experiment involving real world data is presented to illustrate the concepts described.
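
    The abstract does not give the formula for the concept strength metric, so the sketch below only illustrates the surrounding workflow: a decision tree is induced from a database-like table, and a simple leaf-purity score stands in (as an assumption) for a CSM-style check of how well the tree separates the learned concepts.

      import numpy as np
      from sklearn.datasets import load_iris
      from sklearn.tree import DecisionTreeClassifier

      X, y = load_iris(return_X_y=True)                  # stand-in for a domain database
      tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

      # Proxy "concept strength": average purity of the leaves that each class reaches.
      leaf_ids = tree.apply(X)
      strength = {}
      for cls in np.unique(y):
          purities = [np.mean(y[leaf_ids == leaf] == cls)
                      for leaf in np.unique(leaf_ids[y == cls])]
          strength[int(cls)] = float(np.mean(purities))
      print(strength)   # low values would suggest more knowledge should be extracted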

  16. Multiverse data-flow control.

    PubMed

    Schindler, Benjamin; Waser, Jürgen; Ribičić, Hrvoje; Fuchs, Raphael; Peikert, Ronald

    2013-06-01

    In this paper, we present a data-flow system which supports comparative analysis of time-dependent data and interactive simulation steering. The system creates data on-the-fly to allow for the exploration of different parameters and the investigation of multiple scenarios. Existing data-flow architectures provide no generic approach to handle modules that perform complex temporal processing such as particle tracing or statistical analysis over time. Moreover, there is no solution to create and manage module data, which is associated with alternative scenarios. Our solution is based on generic data-flow algorithms to automate this process, enabling elaborate data-flow procedures, such as simulation, temporal integration or data aggregation over many time steps in many worlds. To hide the complexity from the user, we extend the World Lines interaction techniques to control the novel data-flow architecture. The concept of multiple, special-purpose cursors is introduced to let users intuitively navigate through time and alternative scenarios. Users specify only what they want to see, the decision which data are required is handled automatically. The concepts are explained by taking the example of the simulation and analysis of material transport in levee-breach scenarios. To strengthen the general applicability, we demonstrate the investigation of vortices in an offline-simulated dam-break data set.

  17. Studying ion exchange in solution and at biological membranes by FCS.

    PubMed

    Widengren, Jerker

    2013-01-01

    FCS can be used to study a wide range of processes, covering time scales from subnanoseconds to seconds. In principle, any process at equilibrium that manifests itself as a change in the detected fluorescence intensity can be measured. This review describes how FCS and variants thereof can be used to monitor ion exchange, in solution and along biological membranes. Analyzing fluorescence fluctuations of ion-sensitive fluorophores by FCS offers selective advantages over other techniques for measuring local ion concentrations and, in particular, for studying exchange kinetics of ions on a very local scale. This opens up several areas of application. The FCS approach was used to investigate fundamental aspects of proton exchange at and along biological membranes. The protonation relaxation rate, as measured by FCS for a pH-sensitive dye, can also provide information about the local accessibility/interaction of a particular labeling site and the conformational states of biomolecules, in a similar fashion to a fluorescence quenching experiment. The same FCS concept can also be applied to ion exchange studies using other ion-sensitive fluorophores, and by use of dyes sensitive to other ambient conditions the concept can be extended beyond ion exchange studies. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. Integrated Force Method Solution to Indeterminate Structural Mechanics Problems

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Hopkins, Dale A.; Halford, Gary R.

    2004-01-01

    Strength-of-materials problems have been classified into determinate and indeterminate problems. Determinate analysis, based primarily on the equilibrium concept, is well understood. Solutions of indeterminate problems require additional compatibility conditions, and their comprehension has not been as complete. A solution to an indeterminate problem has traditionally been generated by manipulating the equilibrium concept, either by rewriting it in the displacement variables or through the cutting and gap-closing technique of the redundant force method. Such improvisation of compatibility has made analysis cumbersome. The authors have researched and clarified the compatibility theory, so that solutions can be generated with equal emphasis on the equilibrium and compatibility concepts. This technique is called the Integrated Force Method (IFM). Forces are the primary unknowns of IFM; displacements are back-calculated from forces. The IFM equations can be manipulated to obtain the Dual Integrated Force Method (IFMD), in which displacement is the primary variable and force is back-calculated. The subject is introduced through the response variables (force, deformation, displacement) and the underlying concepts (equilibrium equations, force-deformation relations, deformation-displacement relations, and compatibility conditions). Mechanical load, temperature variation, and support settlement are equally emphasized. The basic theory is discussed, and a set of examples illustrates the new concepts. IFM- and IFMD-based finite element methods are introduced for simple problems.

  19. Random element method for numerical modeling of diffusional processes

    NASA Technical Reports Server (NTRS)

    Ghoniem, A. F.; Oppenheim, A. K.

    1982-01-01

    The random element method is a generalization of the random vortex method that was developed for the numerical modeling of momentum transport processes as expressed by the Navier-Stokes equations. The method is based on the concept that random walk, as exemplified by Brownian motion, is the stochastic manifestation of diffusional processes. The algorithm based on this method is grid-free and does not require the diffusion equation to be discretized over a mesh; it is thus devoid of the numerical diffusion associated with finite difference methods. Moreover, the algorithm is self-adaptive in space and explicit in time, resulting in improved numerical resolution of gradients as well as a simple and efficient computational procedure. The method is applied here to an assortment of problems of diffusion of momentum and energy in one dimension, as well as heat conduction in two dimensions, in order to assess its validity and accuracy. The numerical solutions obtained are found to be in good agreement with exact solutions, except for a statistical error introduced by using a finite number of elements; this error can be reduced by increasing the number of elements or by ensemble averaging over a number of solutions.
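
    In one dimension the underlying idea reduces to a few lines: diffusing "elements" take Brownian steps of variance 2*D*dt, and their empirical density converges to the exact Gaussian solution up to a statistical error that shrinks as the number of elements grows. The toy below illustrates that principle only; it is not the authors' algorithm.

      import numpy as np

      D, dt, steps, n_elements = 0.5, 0.01, 200, 20000
      rng = np.random.default_rng(2)

      x = np.zeros(n_elements)                           # delta-function initial condition
      for _ in range(steps):
          x += rng.normal(0.0, np.sqrt(2.0 * D * dt), size=n_elements)   # random walk step

      t = steps * dt
      grid = np.linspace(-4.0, 4.0, 9)
      exact = np.exp(-grid ** 2 / (4 * D * t)) / np.sqrt(4 * np.pi * D * t)
      hist, edges = np.histogram(x, bins=40, range=(-4.0, 4.0), density=True)
      centres = 0.5 * (edges[:-1] + edges[1:])
      err = np.abs(np.interp(grid, centres, hist) - exact).max()
      print("max deviation from the exact solution:", round(float(err), 3))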

  20. Make-to-order manufacturing - new approach to management of manufacturing processes

    NASA Astrophysics Data System (ADS)

    Saniuk, A.; Waszkowski, R.

    2016-08-01

    Strategic management must now be closely linked to management at the operational level, because only then can a company be flexible, respond quickly to emerging opportunities, and pursue ever-changing strategic objectives. Under these conditions, industrial enterprises constantly seek new methods, tools and solutions that help them achieve competitive advantage, and they are beginning to pay more attention to cost management, economic effectiveness and the performance of business processes. Based on a literature analysis, the article identifies the characteristics of make-to-order (MTO) systems and the needs associated with managing them. The main aim of this article is to present the results of research related to the development of a new solution dedicated to small and medium enterprises that manufacture products solely on the basis of production orders (make-to-order systems). A set of indicators is proposed to enable continuous monitoring and control of the key strategic areas of this type of company. The presented solution incorporates the main assumptions of Performance Management (PM) and the Balanced Scorecard (BSC), and combines strategic management with the implementation of operational management. The main benefit of the proposed solution is increased effectiveness in the management of MTO manufacturing companies.

  1. CFDP Evolutions and File Based Operations

    NASA Astrophysics Data System (ADS)

    Valverde, Alberto; Taylor, Chris; Magistrati, Giorgio; Maiorano, Elena; Colombo, Cyril; Haddow, Colin

    2015-09-01

    The complexity of scientific ESA missions in terms of data handling requirements has been steadily increasing in recent years. The availability of high-speed telemetry links to ground, the increase in data storage capacity, and the processing performance of spacecraft avionics have enabled this process. Nowadays, it is common to find missions with hundreds of gigabytes of daily on-board generated data, terabytes of on-board mass memory, and downlinks of several hundred megabits per second. These technological trends push for an upgrade of the spacecraft data handling and operations concept: smarter solutions are needed to sustain such high data rates and volumes while improving on-board autonomy and easing operations. This paper describes the different activities carried out to adapt to the new data handling scenario. It contains an analysis of the proposed operations concept for file-based spacecraft, including the updates on the PUS and CFDP standards.

  2. A case-based reasoning tool for breast cancer knowledge management with data mining concepts and techniques

    NASA Astrophysics Data System (ADS)

    Demigha, Souâd.

    2016-03-01

    The paper presents a case-based reasoning tool for breast cancer knowledge management intended to improve breast cancer screening. To develop this tool, we combine concepts and techniques of Case-Based Reasoning (CBR) and Data Mining (DM). Physicians and radiologists ground their diagnoses in their expertise (past experience) with clinical cases. Case-based reasoning is the process of solving new problems based on the solutions of similar past problems structured as cases, and it is well suited to medical use. On the other hand, existing traditional hospital information systems (HIS), radiological information systems (RIS) and picture archiving and communication systems (PACS) do not allow medical information to be managed efficiently because of its complexity and heterogeneity. Data mining is the process of extracting information from a data set and transforming it into an understandable structure for further use. Combining CBR with data mining techniques will facilitate the diagnosis and decision-making of medical experts.
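
    The retrieval step of case-based reasoning is simple to sketch. The example below is illustrative only (the attributes, weights and cases are hypothetical and much coarser than the paper's tool): it ranks stored screening cases by weighted similarity to a new case and returns the outcomes of the closest matches for the expert to adapt.

      def similarity(case_a, case_b, weights):
          score = 0.0
          for attr, w in weights.items():
              a, b = case_a[attr], case_b[attr]
              if isinstance(a, (int, float)):
                  score += w * (1.0 - min(abs(a - b) / 100.0, 1.0))   # crude numeric match
              else:
                  score += w * (1.0 if a == b else 0.0)               # categorical match
          return score

      def retrieve(query, case_base, weights, k=2):
          return sorted(case_base, key=lambda c: similarity(query, c, weights), reverse=True)[:k]

      # Hypothetical screening cases with illustrative attributes only.
      cases = [
          {"age": 62, "density": "high", "birads": 4, "outcome": "biopsy"},
          {"age": 45, "density": "low",  "birads": 2, "outcome": "routine follow-up"},
          {"age": 58, "density": "high", "birads": 3, "outcome": "short-term follow-up"},
      ]
      weights = {"age": 0.3, "density": 0.3, "birads": 0.4}
      query = {"age": 60, "density": "high", "birads": 4}
      for match in retrieve(query, cases, weights):
          print(match["outcome"])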

  3. Utilisation d'analyse de concepts formels pour la gestion de variabilite d'un logiciel configure dynamiquement

    NASA Astrophysics Data System (ADS)

    Menguy, Theotime

    Because of its critical nature, the avionics industry is bound by numerous constraints, such as security standards and certifications, while having to fulfil clients' desires for personalization. In this context, variability management is a very important issue for re-engineering projects of avionics software. In this thesis, we propose a new approach, based on formal concept analysis and the semantic web, to support variability management. The first goal of this research is to identify characteristic behaviours and interactions of configuration variables in a dynamically configured system. To identify such elements, we used formal concept analysis at different levels of abstraction in the system and defined new metrics. We then built a classification of the configuration variables and their relations to enable quick identification of a variable's behaviour in the system. This classification could help define a systematic approach to processing variables during a re-engineering operation, depending on their category. To gain a better understanding of the system, we also studied the code controls shared between configuration variables. A second objective of this research is to build a knowledge platform to gather the results of all the analyses performed and to store any additional element relevant to the variability management context, for instance new results helping to define the re-engineering process for each of the categories. To address this goal, we built a solution based on the semantic web, defining a new, very extensive ontology that enables inferences related to the evolution processes. The approach presented here is, to the best of our knowledge, the first classification of the configuration variables of a dynamically configured software system and an original use of documentation and variability management techniques using the semantic web in the aeronautics field. The analyses performed and the final results show that formal concept analysis is a way to identify specific properties and behaviours and that the semantic web is a good solution for storing and exploring the results. However, the use of formal concept analysis with new boolean relations, such as the link between configuration variables and files, and the definition of new inferences may lead to better conclusions. Applying the same methodology to other systems would enable the approach to be validated in other contexts.
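
    To make the formal concept analysis step concrete, the sketch below enumerates all formal concepts of a tiny hypothetical context (configuration variables as objects, the files they control as attributes); it exploits the fact that every concept intent is an intersection of object intents, which is only practical for small contexts.

      from itertools import combinations

      # Hypothetical context: which files each configuration variable controls.
      context = {
          "VAR_A": {"file1", "file2"},
          "VAR_B": {"file2", "file3"},
          "VAR_C": {"file1", "file2", "file3"},
          "VAR_D": {"file3"},
      }
      all_attrs = set().union(*context.values())

      intents = {frozenset(all_attrs)}          # intersection over the empty set of objects
      objects = list(context)
      for r in range(1, len(objects) + 1):
          for group in combinations(objects, r):
              intents.add(frozenset.intersection(*(frozenset(context[o]) for o in group)))

      for intent in sorted(intents, key=len):
          extent = sorted(o for o in objects if context[o] >= intent)
          print(sorted(intent), "<->", extent)   # each (intent, extent) pair is a formal concept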

  4. Capturing the Diversity of Transition from a Multidisciplinary Perspective

    ERIC Educational Resources Information Center

    Burns, Edgar

    2010-01-01

    The broad utility of the concept of transition in many disciplines provides career educators and career advisory personnel with expanded opportunities to explore fresh solutions to problems they meet in the course of their work. Further practical solutions become available by continuing to seek applications of the concept. Career transition…

  5. Multiple Solutions Approach (MSA): Conceptions and Practices of Primary School Teachers in Ghana

    ERIC Educational Resources Information Center

    Nabie, Michael Johnson; Raheem, Kolawole; Agbemaka, John Bijou; Sabtiwu, Rufai

    2016-01-01

    The study explored the curriculum guidelines and primary school teachers' conceptions and practices of the Multiple Solutions Approach (MSA) in teaching mathematics using basic qualitative research design. Informal conversation interviews (ICIs), observations, video and document analyses were used to collect data. Participants included a purposive…

  6. Investigating adaptive reasoning and strategic competence: Difference male and female

    NASA Astrophysics Data System (ADS)

    Syukriani, Andi; Juniati, Dwi; Siswono, Tatag Yuli Eko

    2017-08-01

    Adaptive reasoning and strategic competence are among the five components of mathematical proficiency that describe students' success in learning mathematics, and gender contributes to the problem-solving process. This qualitative study investigated the adaptive reasoning and strategic competence of a male student and a female student as they solved a mathematical problem. Both were in the eleventh grade of a high school in Makassar, had similar mathematics ability, and were in the highest ability category. The researcher, as the main instrument, used secondary instruments to select appropriate subjects and to investigate aspects of adaptive reasoning and strategic competence. A test of mathematical ability was used to locate subjects with similar mathematical ability, and an unstructured interview guideline was used to investigate aspects of adaptive reasoning and strategic competence as the subjects completed the mathematical problem task. The correct solution of the task involves several concepts, such as the circle concept, the triangle concept, the trigonometry concept, and the Pythagoras concept. The results showed that the male and female subjects differed in the strategies they applied to understand, formulate and represent the problem situation, as well as in how they explained the strategy used and the relationship between the concepts and the problem situation.

  7. Machine learning-based coreference resolution of concepts in clinical documents

    PubMed Central

    Ware, Henry; Mullett, Charles J; El-Rawas, Oussama

    2012-01-01

    Objective: Coreference resolution of concepts, although a very active area in the natural language processing community, has not yet been widely applied to clinical documents. Accordingly, the 2011 i2b2 competition focusing on this area is a timely and useful challenge. The objective of this research was to collate coreferent chains of concepts from a corpus of clinical documents. These concepts are in the categories of person, problems, treatments, and tests. Design: A machine learning approach based on graphical models was employed to cluster coreferent concepts. Features selected were divided into domain independent and domain specific sets. Training was done with the i2b2 provided training set of 489 documents with 6949 chains. Testing was done on 322 documents. Results: The learning engine, using the un-weighted average of three different measurement schemes, resulted in an F measure of 0.8423 where no domain specific features were included and 0.8483 where the feature set included both domain independent and domain specific features. Conclusion: Our machine learning approach is a promising solution for recognizing coreferent concepts, which in turn is useful for practical applications such as the assembly of problem and medication lists from clinical documents. PMID:22582205

  8. A Comparative Study : Microprogrammed Vs Risc Architectures For Symbolic Processing

    NASA Astrophysics Data System (ADS)

    Heudin, J. C.; Metivier, C.; Demigny, D.; Maurin, T.; Zavidovique, B.; Devos, F.

    1987-05-01

    It is often claimed that conventional computers are not well suited to human-like tasks: vision (image processing), intelligence (symbolic processing), and so on. In the particular case of Artificial Intelligence, dynamic type-checking is one example of a basic task that must be improved. The solution implemented in most Lisp workstations consists of a microprogrammed architecture with a tagged memory. Another way to gain efficiency is to design an instruction set well suited to symbolic processing, which reduces the semantic gap between the high-level language and the machine code. In this framework, the RISC concept provides a convenient approach for studying new architectures for symbolic processing. This paper compares both approaches and describes our project of designing a compact symbolic processor for Artificial Intelligence applications.

  9. Generalized master equations for non-Poisson dynamics on networks.

    PubMed

    Hoffmann, Till; Porter, Mason A; Lambiotte, Renaud

    2012-10-01

    The traditional way of studying temporal networks is to aggregate the dynamics of the edges to create a static weighted network. This implicitly assumes that the edges are governed by Poisson processes, which is not typically the case in empirical temporal networks. Accordingly, we examine the effects of non-Poisson inter-event statistics on the dynamics of edges, and we apply the concept of a generalized master equation to the study of continuous-time random walks on networks. We show that this equation reduces to the standard rate equations when the underlying process is Poissonian and that its stationary solution is determined by an effective transition matrix whose leading eigenvector is easy to calculate. We conduct numerical simulations and also derive analytical results for the stationary solution under the assumption that all edges have the same waiting-time distribution. We discuss the implications of our work for dynamical processes on temporal networks and for the construction of network diagnostics that take into account their nontrivial stochastic nature.

  10. Generalized master equations for non-Poisson dynamics on networks

    NASA Astrophysics Data System (ADS)

    Hoffmann, Till; Porter, Mason A.; Lambiotte, Renaud

    2012-10-01

    The traditional way of studying temporal networks is to aggregate the dynamics of the edges to create a static weighted network. This implicitly assumes that the edges are governed by Poisson processes, which is not typically the case in empirical temporal networks. Accordingly, we examine the effects of non-Poisson inter-event statistics on the dynamics of edges, and we apply the concept of a generalized master equation to the study of continuous-time random walks on networks. We show that this equation reduces to the standard rate equations when the underlying process is Poissonian and that its stationary solution is determined by an effective transition matrix whose leading eigenvector is easy to calculate. We conduct numerical simulations and also derive analytical results for the stationary solution under the assumption that all edges have the same waiting-time distribution. We discuss the implications of our work for dynamical processes on temporal networks and for the construction of network diagnostics that take into account their nontrivial stochastic nature.
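
    The stationary-solution step described above is easy to illustrate once an effective transition matrix is in hand. The example below uses a hypothetical 4-node column-stochastic matrix (not the matrix derived in the paper, which depends on the waiting-time distributions) and extracts its leading eigenvector by power iteration.

      import numpy as np

      T = np.array([[0.0, 0.5, 0.2, 0.3],
                    [0.3, 0.0, 0.5, 0.1],
                    [0.4, 0.3, 0.0, 0.4],
                    [0.3, 0.2, 0.3, 0.2]])      # columns sum to one (column-stochastic)

      p = np.full(4, 0.25)                       # start from the uniform distribution
      for _ in range(500):                       # power iteration
          p = T @ p
          p /= p.sum()
      print("stationary solution:", np.round(p, 4))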

  11. Synthesis of Diopside by Solution Combustion Process Using Glycine Fuel

    NASA Astrophysics Data System (ADS)

    Sherikar, Baburao N.; Umarji, A. M.

    Nanoceramic diopside (CaMgSi2O6) powders are synthesized by the Solution Combustion Process (SCS) using calcium nitrate and magnesium nitrate as oxidizers, glycine as fuel, and fumed silica as the silica source. Ammonium nitrate (AN) is used as an extra oxidizer, and its effect on diopside phase formation is investigated. The adiabatic flame temperatures are calculated theoretically for varying amounts of AN from thermodynamic principles and correlated with the observed flame temperatures. A multi-channel thermocouple setup connected to a computer-interfaced Keithley 2700 multimeter is used to monitor the thermal events during the process. An interpretation based on the maximum combustion temperature and the amount of gas produced during the reaction for various AN compositions is proposed for the nature of the combustion and its correlation with the characteristics of the as-synthesized powder. The powders are characterized by XRD and SEM, showing that they are composed of polycrystalline oxides with crystallite sizes of 58 nm to 74 nm.
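
    The theoretical adiabatic flame temperature referred to above follows from the standard enthalpy balance for an adiabatic reaction (textbook thermodynamics; the authors' specific species data are not reproduced here): the heat released by the reaction at the initial temperature T_0 goes entirely into heating the products to T_ad,

      \Delta H_{\mathrm{reaction}}(T_{0}) + \sum_{i \in \mathrm{products}} n_{i} \int_{T_{0}}^{T_{\mathrm{ad}}} C_{p,i}(T)\, dT = 0 ,

    and the temperatures computed this way for different AN contents are what the authors compare with the values observed using the thermocouple setup.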

  12. Numerical study on injection parameters optimization of thin wall and biodegradable polymers parts

    NASA Astrophysics Data System (ADS)

    Santos, C.; Mendes, A.; Carreira, P.; Mateus, A.; Malça, C.

    2017-07-01

    Nowadays, the molds industry is searching for new markets with diversified and added-value products. The concept associated with the production of thin-walled and biodegradable parts, mostly manufactured by injection molding, has assumed relevant importance due to environmental and economic factors. The growth of global consciousness about the harmful effects of conventional polymers on our quality of life, together with the legislation imposed, has become a key factor in the consumer's choice of a particular product. The target of this work is to provide an integrated solution for the injection of thin-walled parts manufactured from biodegradable materials. This integrated solution includes the design and manufacture of the mold as well as finding optimum values for the injection parameters in order to make the process effective and competitive. For this, the Moldflow software was used. It was demonstrated that this computational tool provides effective responsiveness and can constitute an important tool for supporting the injection molding of thin-walled and biodegradable parts.

  13. Multiple Choice Knapsack Problem: example of planning choice in transportation.

    PubMed

    Zhong, Tao; Young, Rhonda

    2010-05-01

    Transportation programming, a process of selecting projects for funding given budget and other constraints, is becoming more complex as a result of new federal laws, local planning regulations, and increased public involvement. This article describes the use of an integer programming tool, Multiple Choice Knapsack Problem (MCKP), to provide optimal solutions to transportation programming problems in cases where alternative versions of projects are under consideration. In this paper, optimization methods for use in the transportation programming process are compared and then the process of building and solving the optimization problems is discussed. The concepts about the use of MCKP are presented and a real-world transportation programming example at various budget levels is provided. This article illustrates how the use of MCKP addresses the modern complexities and provides timely solutions in transportation programming practice. While the article uses transportation programming as a case study, MCKP can be useful in other fields where a similar decision among a subset of the alternatives is required. Copyright 2009 Elsevier Ltd. All rights reserved.
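
    The alternative-versions structure maps directly onto an MCKP dynamic program. The sketch below (integer budget units; a hypothetical example, not the authors' code) treats each project as a class of mutually exclusive versions, with a (0, 0) "do nothing" version making a project optional, and returns the best attainable benefit within the budget.

      def mckp_best_value(projects, budget):
          """projects: list of version lists [(cost, benefit), ...]; returns the max total benefit."""
          NEG = float("-inf")
          dp = [0.0] + [NEG] * budget                  # dp[b]: best benefit at exact total cost b
          for versions in projects:                    # exactly one version chosen per project
              new = [NEG] * (budget + 1)
              for b, value in enumerate(dp):
                  if value == NEG:
                      continue
                  for cost, benefit in versions:
                      if b + cost <= budget and value + benefit > new[b + cost]:
                          new[b + cost] = value + benefit
              dp = new
          return max(dp)

      # Hypothetical programme: three projects with alternative scopes, budget of 10 units.
      projects = [
          [(0, 0), (4, 7), (6, 9)],                    # project A: skip, basic, full
          [(0, 0), (3, 4)],                            # project B: skip, build
          [(0, 0), (5, 8), (7, 10)],                   # project C: skip, partial, full
      ]
      print(mckp_best_value(projects, 10))             # best total benefit within the budget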

  14. Enzymatic production and in situ separation of natural β-ionone from β-carotene.

    PubMed

    Nacke, Christoph; Hüttmann, Sonja; Etschmann, Maria M W; Schrader, Jens

    2012-12-01

    A biotechnological process concept for generation and in situ separation of natural β-ionone from β-carotene is presented. The process employs carotenoid cleavage dioxygenases (CCDs), a plant-derived iron-containing nonheme enzyme family requiring only dissolved oxygen as cosubstrate and no additional cofactors. Organophilic pervaporation was found to be very well suited for continuous in situ separation of β-ionone. Its application led to a highly pure product despite the complexity of the reaction solution containing cell homogenates. Among three different pervaporation membrane types tested, a polyoctylmethylsiloxane active layer on a porous polyetherimide support led to the best results. A laboratory-scale demonstration plant was set up, and a highly pure aqueous-ethanolic solution of β-ionone was produced from β-carotene. The described process permits generation of high-value flavor and fragrance compounds bearing the desired label "natural" according to US and European food and safety regulations and demonstrates the potential of CCD enzymes for selective oxidative cleavage of carotenoids.

  15. Contingency theoretic methodology for agent-based web-oriented manufacturing systems

    NASA Astrophysics Data System (ADS)

    Durrett, John R.; Burnell, Lisa J.; Priest, John W.

    2000-12-01

    The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change in the way they view the software design process from a view toward the solution of a problem to one of the dynamic creation of teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational- information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software will be discussed.

  16. Emerging concepts for management of river ecosystems and challenges to applied integration of physical and biological sciences in the Pacific Northwest, USA

    USGS Publications Warehouse

    Rieman, Bruce; Dunham, Jason B.; Clayton, James

    2006-01-01

    Integration of biological and physical concepts is necessary to understand and conserve the ecological integrity of river systems. Past attempts at integration have often focused at relatively small scales and on mechanistic models that may not capture the complexity of natural systems leaving substantial uncertainty about ecological responses to management actions. Two solutions have been proposed to guide management in the face of that uncertainty: the use of “natural variability” in key environmental patterns, processes, or disturbance as a reference; and the retention of some areas as essentially unmanaged reserves to conserve and represent as much biological diversity as possible. Both concepts are scale dependent because dominant processes or patterns that might be referenced will change with scale. Context and linkages across scales may be as important in structuring biological systems as conditions within habitats used by individual organisms. Both ideas view the physical environment as a template for expression, maintenance, and evolution of ecological diversity. To conserve or restore a diverse physical template it will be important to recognize the ecologically important differences in physical characteristics and processes among streams or watersheds that we might attempt to mimic in management or represent in conservation or restoration reserves.

  17. The use of solution adaptive grids in solving partial differential equations

    NASA Technical Reports Server (NTRS)

    Anderson, D. A.; Rai, M. M.

    1982-01-01

    The grid point distribution used in solving a partial differential equation using a numerical method has a substantial influence on the quality of the solution. An adaptive grid which adjusts as the solution changes provides the best results when the number of grid points available for use during the calculation is fixed. Basic concepts used in generating and applying adaptive grids are reviewed in this paper, and examples illustrating applications of these concepts are presented.
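
    One widely used way to realise such an adaptive grid, consistent with the concepts reviewed (though not necessarily the authors' exact scheme), is equidistribution of a monitor function based on the solution gradient, as sketched below for a fixed number of points in one dimension.

      import numpy as np

      def equidistribute(x, u, alpha=5.0):
          """Redistribute the grid x so a gradient-based weight is equal in every cell."""
          w = 1.0 + alpha * np.abs(np.gradient(u, x))          # monitor (weight) function
          cum = np.concatenate([[0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))])
          targets = np.linspace(0.0, cum[-1], len(x))          # equal weight per cell
          return np.interp(targets, cum, x)

      x = np.linspace(0.0, 1.0, 41)
      u = np.tanh(20.0 * (x - 0.5))                            # steep internal layer at x = 0.5
      x_new = equidistribute(x, u)
      print("smallest cell:", round(float(np.diff(x_new).min()), 4),
            "largest cell:", round(float(np.diff(x_new).max()), 4))

    In an actual solver the grid and the solution are updated together: the equation is re-solved (or the solution interpolated) on the new grid, and the redistribution is repeated as the solution evolves.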

  18. Sensitivity method for integrated structure/active control law design

    NASA Technical Reports Server (NTRS)

    Gilbert, Michael G.

    1987-01-01

    The development is described of an integrated structure/active control law design methodology for aeroelastic aircraft applications. A short motivating introduction to aeroservoelasticity is given along with the need for integrated structures/controls design algorithms. Three alternative approaches to development of an integrated design method are briefly discussed with regards to complexity, coordination and tradeoff strategies, and the nature of the resulting solutions. This leads to the formulation of the proposed approach which is based on the concepts of sensitivity of optimum solutions and multi-level decompositions. The concept of sensitivity of optimum is explained in more detail and compared with traditional sensitivity concepts of classical control theory. The analytical sensitivity expressions for the solution of the linear, quadratic cost, Gaussian (LQG) control problem are summarized in terms of the linear regulator solution and the Kalman Filter solution. Numerical results for a state space aeroelastic model of the DAST ARW-II vehicle are given, showing the changes in aircraft responses to variations of a structural parameter, in this case first wing bending natural frequency.
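
    For context, the regulator and filter solutions whose sensitivities are analysed are the standard continuous-time LQG pair (textbook relations; the paper's own sensitivity expressions are not reproduced here). With A, B, C the state-space matrices, Q, R the state and control weights, and W, V the process and measurement noise intensities, the regulator gain K = R^{-1} B^{\mathsf{T}} P comes from the control Riccati equation

      A^{\mathsf{T}}P + PA - PBR^{-1}B^{\mathsf{T}}P + Q = 0 ,

    and the Kalman filter gain L = S C^{\mathsf{T}} V^{-1} from the dual filter Riccati equation

      AS + SA^{\mathsf{T}} - SC^{\mathsf{T}}V^{-1}CS + W = 0 ,

    so the sensitivity of the optimum with respect to a structural parameter follows from differentiating these equations with respect to that parameter.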

  19. A new method to measure effective soil solution concentration predicts copper availability to plants.

    PubMed

    Zhang, H; Zhao, F J; Sun, B; Davison, W; McGrath, S P

    2001-06-15

    Risk assessments of metal-contaminated soils need to address metal bioavailability. To predict the bioavailability of metals to plants, it is necessary to understand both solution and solid-phase supply processes in soils. In striving to find surrogate chemical measurements, scientists have focused either on soil solution chemistry, including free ion activities, or on operationally defined fractions of metals. Here we introduce the new concept of effective concentration, CE, which includes both the soil solution concentration and an additional term, expressed as a concentration, that represents metal supplied from the solid phase. CE was measured using the technique of diffusive gradients in thin films (DGT), which, like a plant, locally lowers soil solution concentrations, inducing metal supply from the solid phase, as shown by a dynamic model of the DGT-soil system. Measurements of Cu as CE, as soil solution concentration, by EDTA extraction, and as free Cu2+ activity in soil solution were made on 29 different soils covering a large range of copper concentrations. They were compared with Cu concentrations in the plant material of Lepidium heterophyllum grown on the same soils. Plant concentrations were linearly related to and highly correlated with CE, but were more scattered and nonlinear with respect to free Cu2+ activity, EDTA extraction, or soil solution concentrations. These results demonstrate that the dominant supply processes in these soils are diffusion and labile metal release, which the DGT-soil system mimics. The quantity CE is shown to have promise as a quantitative measure of bioavailable metal in soils.

  20. Advanced protein formulations

    PubMed Central

    Wang, Wei

    2015-01-01

    It is well recognized that protein product development is far more challenging than that for small-molecule drugs. The major challenges include inherent sensitivity to different types of stresses during the drug product manufacturing process, a high rate of physical and chemical degradation during long-term storage, and enhanced aggregation and/or viscosity at high protein concentrations. In the past decade, many novel formulation concepts and technologies have been or are being developed to address these product development challenges for proteins. These concepts and technologies include the use of uncommon stabilizers or combinations of formulation stabilizers, conjugation or fusion with potential stabilizers, site-specific mutagenesis, and preparation of nontraditional types of dosage forms—semiaqueous solutions, nonfreeze-dried solid formulations, suspensions, and other emerging concepts. No one technology appears to be mature, ideal, and/or adequate to address all the challenges. These gaps will likely remain in the foreseeable future and need significant efforts for ultimate resolution. PMID:25858529

  1. Space Station communications and tracking system

    NASA Technical Reports Server (NTRS)

    Dietz, Reinhold H.

    1987-01-01

    A comprehensive description of the existing Space Station communications and tracking system requirements, architecture, and design concepts is provided. Areas which will require innovative solutions to provide cost-effective flight systems are emphasized. Among these are the space-to-space links, the differential global positioning system for determining relative position with free-flying vehicles, multitarget radar, packet/isochronous signal processing, and laser docking systems. In addition, the importance of advanced development, tests, and analyses is summarized.

  2. Does Disease Matter? Incorporating Solution-Focused Brief Therapy in Alcoholism Treatment.

    ERIC Educational Resources Information Center

    Osborn, Cynthia J.

    1997-01-01

    Surveyed alcoholism counselors (N=284) to determine whether the disease concept of alcoholism precludes acceptance and use of Solution-Focused Brief Therapy (SFBT) in alcoholism treatment. Results suggest that SFBT may be feasible for alcoholism treatment and that endorsement of the disease concept is compatible with the principles of SFBT. (EMK)

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Disselkamp, Robert S.; Chajkowski, Sarah M.; Boyles, Kelly R.

    Here we discuss results obtained as part of a three-year investigation at Pacific Northwest National Laboratory of ultrasound processing to effect selectivity and activity in the hydrogenation of water-soluble olefins on transition metal catalysts. We have shown previously that of the two regimes for ultrasound processing, high-power cavitating and high-power non-cavitating, only the former can effect product selectivity dramatically (> 1000%) whereas the selectivity of the latter was comparable with those obtained in stirred/silent control experiments [R.S. Disselkamp, Y.-H. Chin, C.H.F. Peden, J. Catal., 227, 552 (2005)]. As a means of ensuring the benefits of cavitating ultrasound processing, we introduced the concept of employing inert dopants into the reacting solution. These inert dopants do not partake in solution chemistry but enable a more facile transition from high-power non-cavitating to cavitating conditions during sonication treatment. With cavitation processing conditions ensured, we discuss here results of isotopic H/D substitution for a variety of substrates and illustrate how such isotope dependent chemistries during substrate hydrogenation elucidate detailed mechanistic information about these reaction systems.

  4. Regression analysis as a design optimization tool

    NASA Technical Reports Server (NTRS)

    Perley, R.

    1984-01-01

    The optimization concepts are described in relation to an overall design process as opposed to a detailed, part-design process where the requirements are firmly stated, the optimization criteria are well established, and a design is known to be feasible. The overall design process starts with the stated requirements. Some of the design criteria are derived directly from the requirements, but others are affected by the design concept. It is these design criteria that define the performance index, or objective function, that is to be minimized within some constraints. In general, there will be multiple objectives, some mutually exclusive, with no clear statement of their relative importance. The optimization loop that is given adjusts the design variables and analyzes the resulting design, in an iterative fashion, until the objective function is minimized within the constraints. This provides a solution, but it is only the beginning. In effect, the problem definition evolves as information is derived from the results. It becomes a learning process as we determine what the physics of the system can deliver in relation to the desirable system characteristics. As with any learning process, an interactive capability is a real attribute for investigating the many alternatives that will be suggested as learning progresses.
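
    The adjust-analyze loop described above maps naturally onto a constrained minimizer. The following sketch is purely illustrative (the objective, weights, and constraint are hypothetical, not taken from the report) and shows the shape of such a loop using SciPy's SLSQP solver.

```python
import numpy as np
from scipy.optimize import minimize

# hypothetical two-variable design problem: minimize a weighted
# performance index subject to a simple feasibility constraint
def objective(x):
    weight = x[0] ** 2 + 1.0
    drag = (x[1] - 2.0) ** 2 + 0.5
    return 0.7 * weight + 0.3 * drag           # combined performance index

def constraint(x):
    return 4.0 - (x[0] + x[1])                  # must remain >= 0

result = minimize(
    objective,
    x0=np.array([1.0, 1.0]),                    # initial design variables
    constraints=[{"type": "ineq", "fun": constraint}],
    method="SLSQP",
)
print(result.x, result.fun)                     # optimized design and index
```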

  5. Text processing through Web services: calling Whatizit.

    PubMed

    Rebholz-Schuhmann, Dietrich; Arregui, Miguel; Gaudan, Sylvain; Kirsch, Harald; Jimeno, Antonio

    2008-01-15

    Text-mining (TM) solutions are developing into efficient services to researchers in the biomedical research community. Such solutions have to scale with the growing number and size of resources (e.g. available controlled vocabularies), with the amount of literature to be processed (e.g. about 17 million documents in PubMed) and with the demands of the user community (e.g. different methods for fact extraction). These demands motivated the development of a server-based solution for literature analysis. Whatizit is a suite of modules that analyse text for contained information, e.g. any scientific publication or Medline abstracts. Special modules identify terms and then link them to the corresponding entries in bioinformatics databases such as UniProtKb/Swiss-Prot data entries and gene ontology concepts. Other modules identify a set of selected annotation types like the set produced by the EBIMed analysis pipeline for proteins. In the case of Medline abstracts, Whatizit offers access to EBI's in-house installation via PMID or term query. For large quantities of the user's own text, the server can be operated in a streaming mode (http://www.ebi.ac.uk/webservices/whatizit).

  6. Environmental issues and process risks for operation of carbon capture plant

    NASA Astrophysics Data System (ADS)

    Lajnert, Radosław; Nowak, Martyna; Telenga-Kopyczyńska, Jolanta

    2018-01-01

    This publication presents environmental issues and process risks connected with operating an installation for carbon capture from waste gas. General technological assumptions, typical for a demonstration plant for carbon capture from waste gas (DCCP) with application of two different solutions - a 30% water solution of monoethanolamine (MEA) and a water solution with 30% AMP (2-amino-2-methyl-1-propanol) and 10% piperazine - are described. The DCCP installation concept was developed for the Łaziska Power Plant in Łaziska Górne, owned by TAURON Wytwarzanie S.A. The main hazardous substances typical for such an installation, which can be dangerous to human life and health or to the environment, are presented. Pollution emission to the air, noise emission, waste water and solid waste management are described, the environmental impact of the released substances is stated, and reference is made to the emission standards specified in regulations for the substances considered. Principles of risk analysis are presented and the main hazards in the carbon dioxide absorption and regeneration nodes are evaluated.

  7. Fuel development for gas-cooled fast reactors

    NASA Astrophysics Data System (ADS)

    Meyer, M. K.; Fielding, R.; Gan, J.

    2007-09-01

    The Generation IV Gas-cooled Fast Reactor (GFR) concept is proposed to combine the advantages of high-temperature gas-cooled reactors (such as efficient direct conversion with a gas turbine and the potential for application of high-temperature process heat), with the sustainability advantages that are possible with a fast-spectrum reactor. The latter include the ability to fission all transuranics and the potential for breeding. The GFR is part of a consistent set of gas-cooled reactors that includes a medium-term Pebble Bed Modular Reactor (PBMR)-like concept, or concepts based on the Gas Turbine Modular Helium Reactor (GT-MHR), and specialized concepts such as the Very High-Temperature Reactor (VHTR), as well as actinide burning concepts [A Technology Roadmap for Generation IV Nuclear Energy Systems, US DOE Nuclear Energy Research Advisory Committee and the Generation IV International Forum, December 2002]. To achieve the necessary high power density and the ability to retain fission gas at high temperature, the primary fuel concept proposed for testing in the United States is dispersion coated fuel particles in a ceramic matrix. Alternative fuel concepts considered in the US and internationally include coated particle beds, ceramic clad fuel pins, and novel ceramic 'honeycomb' structures. Both mixed carbide and mixed nitride-based solid solutions are considered as fuel phases.

  8. Linking climate change mitigation and coastal eutrophication management through biogas technology: Evidence from a new Danish bioenergy concept.

    PubMed

    Kaspersen, Bjarke Stoltze; Christensen, Thomas Budde; Fredenslund, Anders Michael; Møller, Henrik Bjarne; Butts, Michael Brian; Jensen, Niels H; Kjaer, Tyge

    2016-01-15

    The interest in sustainable bioenergy solutions has gained great importance in Europe due to the need to reduce GHG emissions and to meet environmental policy targets, not least for the protection of groundwater and surface water quality. In the Municipality of Solrød in Denmark, a novel bioenergy concept for anaerobic co-digestion of food industry residues, manure and beach-cast seaweed has been developed and tested in order to quantify the potential for synergies between climate change mitigation and coastal eutrophication management in the Køge Bay catchment. The biogas plant, currently under construction, was designed to handle an annual input of up to 200,000 t of biomass based on four main fractions: pectin wastes, carrageenan wastes, manure and beach-cast seaweed. This paper describes how this bioenergy concept can contribute to strengthening the linkages between climate change mitigation strategies and Water Framework Directive (WFD) action planning. Our assessments of the projected biogas plant indicate an annual reduction of GHG emissions of approx. 40,000 t CO2 equivalents, corresponding to approx. 1/3 of current total GHG emissions in the Municipality of Solrød. In addition, nitrogen and phosphorus loads to Køge Bay are estimated to be reduced by approx. 63 t yr(-1) and 9 t yr(-1), respectively, contributing to the achievement of more than 70% of the nutrient reduction target set for Køge Bay in the first WFD river basin management plan. This study shows that anaerobic co-digestion of the specific food industry residues, pig manure and beach-cast seaweed is feasible and that there is a very significant, cost-effective GHG and nutrient loading mitigation potential for this bioenergy concept. Our research demonstrates how an integrated planning process, where considerations about the total environment are integrated into the design and decision processes, can support the development of this kind of holistic bioenergy solution. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Aircraft conceptual design - an adaptable parametric sizing methodology

    NASA Astrophysics Data System (ADS)

    Coleman, Gary John, Jr.

    Aerospace is a maturing industry with successful and refined baselines which work well for traditional baseline missions, markets and technologies. However, when new markets (space tourism), new constraints (environmental) or new technologies (composites, natural laminar flow) emerge, the conventional solution is not necessarily best for the new situation. This raises the question: how does a design team quickly screen and compare novel solutions to conventional solutions for new aerospace challenges? The answer is rapid and flexible conceptual design parametric sizing. In the product design life-cycle, parametric sizing is the first step in screening the total vehicle in terms of mission, configuration and technology to quickly assess first-order design and mission sensitivities. During this phase, various missions and technologies are assessed and the designer identifies design solutions, in terms of concepts and configurations, that meet combinations of mission and technology. This research contributes to the state-of-the-art in aircraft parametric sizing through (1) development of a dedicated conceptual design process and disciplinary methods library, (2) development of a novel and robust parametric sizing process based on 'best-practice' approaches found in the process and disciplinary methods library, and (3) application of the parametric sizing process to a variety of design missions (transonic, supersonic and hypersonic transports), different configurations (tail-aft, blended wing body, strut-braced wing, hypersonic blended bodies, etc.), and different technologies (composite, natural laminar flow, thrust vectored control, etc.), in order to demonstrate the robustness of the methodology and unearth first-order design sensitivities to current and future aerospace design problems. This research demonstrates the importance of this early design step in selecting the correct combination of mission, technologies and configuration to meet current aerospace challenges. The overarching goal is to avoid the recurring situation of optimizing an already ill-fated solution.

  10. The Tail of BPM

    NASA Astrophysics Data System (ADS)

    Kruba, Steve; Meyer, Jim

    Business process management suites (BPMS's) represent one of the fastest growing segments in the software industry as organizations automate their key business processes. As this market matures, it is interesting to compare it to Chris Anderson's 'Long Tail.' Although the 2004 "Long Tail" article in Wired magazine was primarily about the media and entertainment industries, it has since been applied (and perhaps misapplied) to other markets. Analysts describe a "Tail of BPM" market that is, perhaps, several times larger than the traditional BPMS product market. This paper will draw comparisons between the concepts in Anderson's article (and subsequent book) and the BPM solutions market.

  11. Using a Systematic Approach in the Analysis of the Factors That Influence On a Form Formation of Buildings of Higher Educational Establishments

    NASA Astrophysics Data System (ADS)

    Martyniv, Oleksandra; Kinasz, Roman

    2017-10-01

    This article covers a set of basic factors that influence the architectural and spatial design of buildings for higher educational establishments (hereinafter universities). For this purpose, a systematization of the factors influencing university architecture was conducted and is presented. The article concludes by proposing a concept that considers universities as a hierarchical system whose elements act as factors of influence and, through their interacting influence, lead to the main goal, namely the formation of a new university building.

  12. Possible Solutions as a Concept in Behavior Change Interventions.

    PubMed

    Mahoney, Diane E

    2018-04-24

    Nurses are uniquely positioned to implement behavior change interventions. Yet, nursing interventions have traditionally resulted from nurses' problem-solving rather than allowing the patient to self-generate possible solutions for attaining specific health outcomes. The purpose of this review is to clarify the meaning of possible solutions in behavior change interventions. Walker and Avant's method of concept analysis serves as the framework for examining possible solutions. Possible solutions can be defined as continuous strategies initiated by patients and families to overcome existing health problems. As nurses engage in behavior change interventions, supporting patients and families in problem-solving will optimize health outcomes and transform clinical practice. © 2018 NANDA International, Inc.

  13. A flexible architecture for advanced process control solutions

    NASA Astrophysics Data System (ADS)

    Faron, Kamyar; Iourovitski, Ilia

    2005-05-01

    Advanced Process Control (APC) is now mainstream practice in the semiconductor manufacturing industry. Over the past decade and a half, APC has evolved from a "good idea" and "wouldn't it be great" concept to mandatory manufacturing practice. APC developments have primarily dealt with two major thrusts, algorithms and infrastructure, and often the line between them has been blurred. The algorithms have evolved from very simple single-variable solutions to sophisticated and cutting-edge adaptive multivariable (input and output) solutions. Spending patterns in recent times have demanded that the economics of a comprehensive APC infrastructure be completely justified for any and all cost-conscious manufacturers. There are studies suggesting integration costs as high as 60% of the total APC solution costs. Such cost-prohibitive figures clearly diminish the return on APC investments. This has limited the acceptance and development of pure APC infrastructure solutions for many fabs. Modern APC solution architectures must satisfy a wide array of requirements, from very manual R&D environments to very advanced and automated "lights out" manufacturing facilities. A majority of commercially available control solutions, and most in-house developed solutions, lack the important attributes of scalability, flexibility, and adaptability and hence require significant resources for integration, deployment, and maintenance. Many APC improvement efforts have been abandoned or delayed due to legacy systems and inadequate architectural design. Recent advancements (Service Oriented Architectures) in the software industry have delivered ideal technologies for delivering scalable, flexible, and reliable solutions that can seamlessly integrate into any fab's existing systems and business practices. In this publication we evaluate the various attributes of the architectures required by fabs and illustrate the benefits of a Service Oriented Architecture in satisfying these requirements. Blue Control Technologies has developed an advanced service-oriented-architecture Run-to-Run control system which addresses these requirements.

  14. Nature-Based Solutions in the EU: Innovating with nature to address social, economic and environmental challenges.

    PubMed

    Faivre, Nicolas; Fritz, Marco; Freitas, Tiago; de Boissezon, Birgit; Vandewoestijne, Sofie

    2017-11-01

    Contemporary societies are facing a broad range of challenges, from pressures on human health and well-being to natural capital depletion, and the security of food, water and energy. These challenges are deeply intertwined with global processes, such as climate change and with local events such as natural disasters. The EU's research & innovation (R&I) policy is now seeking to address these challenges from a new perspective, with Nature-Based Solutions, and turn them into innovation opportunities that optimise the synergies between nature, society and the economy. Nature-Based Solutions can be an opportunity for innovation, and are here promoted by both policymakers and practitioners as a cost-effective way of creating a greener, more sustainable, and more competitive economy. Since 2013, the European Commission has devoted particular attention to Nature-Based Solutions through consultations and dialogues that sought to make the concept of these solutions more concrete and to define the concept's place within the spectrum of ecosystem-based approaches. In 2014, the Commission launched an expert group, which conducted further analysis, and made recommendations to help increase the use of Nature-Based Solutions and bring nature back into cities. In 2015, a survey was conducted on citizens' views and perceptions of 'Nature in Cities' to provide further insight for future work. Based on these elements and on results from running EU projects, the Commission has developed an R&I agenda for Nature-Based Solutions and has published targeted calls for proposals for large-scale demonstration projects in this field in 2016 and 2017. Additional R&I actions at EU level that promote systemic Nature-Based Solutions and their benefits to cities and territories are planned with the aim to improve the implementation capacity and evidence base for deploying Nature-Based Solutions and developing corresponding future markets. They are also expected to foster an interdisciplinary R&I and stakeholder community and the exchange of good practices in this field, as well as help shaping and implementing international R&I agendas on Nature-Based Solutions. Copyright © 2017. Published by Elsevier Inc.

  15. Increasing Capacity Exploitation in Food Supply Chains Using Grid Concepts

    NASA Astrophysics Data System (ADS)

    Volk, Eugen; Müller, Marcus; Jacob, Ansger; Racz, Peter; Waldburger, Martin

    Food supply chains today are characterized by fixed trade relations with long term contracts established between heterogeneous supply chain companies. Production and logistics capacities of these companies are often utilized in an economically inefficient manner only. In addition, increased consumer awareness in food safety issues renders supply chain management even more challenging, since integrated tracking and tracing along the whole food supply chain is needed. Facing these issues of supply chain management complexity and completely documented product quality, this paper proposes a full lifecycle solution for dynamic capacity markets based on concepts used in the field of Grid [1], like management of Virtual Organization (VO) combined with Service Level Agreement (SLA). The solution enables the cost-efficient utilization of real world capacities (e.g., production capacities or logistics facilities) by using a simple, browser-based portal. Users are able to enter into product-specific negotiations with buyers and suppliers of a food supply chain, and to obtain real-time access to product information including SLA evaluation reports. Thus, business opportunities in wider market access, process innovation, and trustworthy food products are offered for participating supply chain companies.

  16. Registered nurses' constructed meaning of concepts of solution and their use in clinical practice

    NASA Astrophysics Data System (ADS)

    Wilkes, Lesley M.; Batts, Judith E.

    1991-12-01

    Since the introduction of nursing into tertiary institutions in Australia in 1975, there has been increasing interest in the teaching of physical science to nurses. Various courses in physical science for nurse students have been developed. They vary in length and content but there is agreement that concepts taught should be closely related to nursing applications. The choice of relevant concepts tends to be made by individual curriculum developers. This paper reports an examination of the use of physical science concepts and their relevance from the perspective of registered nurses practising in general ward areas. Inherent in this study is the premise that for registered nurses to have ideas of the physical science underlying their practice they must have constructed meaning first for these concepts. Specific chemical concepts related to solutions are discussed in these terms.

  17. Image processing via VLSI: A concept paper

    NASA Technical Reports Server (NTRS)

    Nathan, R.

    1982-01-01

    Implementing specific image processing algorithms via very large scale integrated systems offers a potent solution to the problem of handling high data rates. Two algorithms stand out as being particularly critical -- geometric map transformation and filtering or correlation. These two functions form the basis for data calibration, registration and mosaicking. VLSI presents itself as an inexpensive ancillary function to be added to almost any general purpose computer, and if the geometry and filter algorithms are implemented in VLSI, the processing rate bottleneck would be significantly relieved. An approach is developed that identifies the image processing functions that limit present systems in meeting future throughput needs, translates these functions to algorithms, implements them via VLSI technology, and interfaces the hardware to a general purpose digital computer.
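
    The two critical functions named above, geometric map transformation and filtering/correlation, can be prototyped in software before being committed to hardware. The sketch below is a generic software stand-in, not a description of the VLSI design: it resamples an image onto a shifted grid and applies a small correlation filter.

```python
import numpy as np
from scipy.ndimage import map_coordinates
from scipy.signal import correlate2d

image = np.random.rand(128, 128)

# geometric map transformation: resample the image on a warped grid
rows, cols = np.meshgrid(np.arange(128), np.arange(128), indexing="ij")
warped = map_coordinates(image, [rows + 2.5, cols - 1.0], order=1, mode="nearest")

# filtering / correlation: a small smoothing kernel applied by correlation
kernel = np.full((3, 3), 1.0 / 9.0)
filtered = correlate2d(image, kernel, mode="same", boundary="symm")
```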

  18. Progression in High School Students' (Aged 16-18) Conceptualizations about Chemical Reactions in Solution.

    ERIC Educational Resources Information Center

    Boo, Hong-Kwen; Watson, J. R.

    2001-01-01

    Explores the development over time of students' understandings of the concept of chemical reaction in the context of two familiar reactions in solution. Based on interviews (n=48), results show that students made some progress in their understanding of the concept of chemical reaction but some fundamental misconceptions remained. (Author/MM)

  19. Investigating the Effectiveness of Teaching Methods Based on a Four-Step Constructivist Strategy

    NASA Astrophysics Data System (ADS)

    Çalik, Muammer; Ayas, Alipaşa; Coll, Richard K.

    2010-02-01

    This paper reports on an investigation of the effectiveness of an intervention using several different methods for teaching solution chemistry. The teaching strategy comprised a four-step approach derived from a constructivist view of learning. A sample consisting of 44 students (18 boys and 26 girls) was selected purposively from two different Grade 9 classes in the city of Trabzon, Turkey. Data collection employed a purpose-designed 'solution chemistry concept test', consisting of 17 items, with the quantitative data from the survey supported by qualitative interview data. The findings suggest that using different methods embedded within the four-step constructivist-based teaching strategy enables students to refute some alternative conceptions, but does not completely eliminate student alternative conceptions for solution chemistry.

  20. Possible Concepts for Waterproofing of Norwegian TBM Railway Tunnels

    NASA Astrophysics Data System (ADS)

    Dammyr, Øyvind; Nilsen, Bjørn; Thuro, Kurosch; Grøndal, Jørn

    2014-05-01

    The aim of this paper is to evaluate and compare the durability, life expectancy and maintenance needs of traditional Norwegian waterproofing concepts to the generally more rigid waterproofing concepts seen in other European countries. The focus will be on solutions for future Norwegian tunnel boring machine railway tunnels. Experiences from operation of newer and older tunnels with different waterproofing concepts have been gathered and analyzed. In the light of functional requirements for Norwegian rail tunnels, some preliminary conclusions about suitable concepts are drawn. Norwegian concepts such as polyethylene panels and lightweight concrete segments with membrane are ruled out. European concepts involving double shell draining systems (inner shell of cast concrete with membrane) and single shell undrained systems (waterproof concrete segments) are generally evaluated as favorable. Sprayable membranes and waterproof/insulating shotcrete are welcomed innovations, but more research is needed to verify their reliability and cost effectiveness compared to the typical European concepts. Increasing traffic and reliance on public transport systems in Norway result in high demand for durable and cost effective solutions.

  1. The laboratory demonstration and signal processing of the inverse synthetic aperture imaging ladar

    NASA Astrophysics Data System (ADS)

    Gao, Si; Zhang, ZengHui; Xu, XianWen; Yu, WenXian

    2017-10-01

    This paper presents a coherent inverse synthetic-aperture imaging ladar (ISAL) system for obtaining high resolution images. A balanced coherent optics system is built in the laboratory with a binary phase coded modulation transmit waveform, which differs from the conventional chirp. A complete digital signal processing solution is proposed, including both a quality phase gradient autofocus (QPGA) algorithm and a cubic phase function (CPF) algorithm. High-resolution, well-focused ISAL images of retro-reflecting targets are shown to validate the concepts. It is shown that high resolution images can be achieved and that the influence of platform vibrations, involving both targets and radar, can be automatically compensated by the distinctive laboratory system and digital signal processing.
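
    The receive side of a binary phase coded waveform reduces to matched filtering, i.e. pulse compression by correlation with the transmitted code. The following sketch is a generic illustration, not the paper's processing chain: a Barker-13 code is used as a stand-in transmit waveform, and the echo is located from the correlation peak.

```python
import numpy as np

# Barker-13 binary phase code as a stand-in transmit waveform
code = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

# simulated echo: the code delayed by 40 samples, buried in noise
rx = np.zeros(200)
rx[40:40 + code.size] = code
rx += 0.2 * np.random.randn(rx.size)

# matched filter = correlation with the (time-reversed) code
compressed = np.correlate(rx, code, mode="same")
peak = np.argmax(np.abs(compressed))   # index of the compressed echo
```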

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Restivo, M.

    SRNL Environmental and Chemical Process Technology (E&CPT) was requested to perform testing of vacuum pumps per a verbal request from the Customer, SRNL Hydrogen Processing Technology. Tritium Operations is currently having difficulties procuring the Normetex™ Model 15 m3/hr (9 CFM) vacuum pump (formerly Normetex Pompes, now Eumeca SARL). One possible solution proposed by Hydrogen Processing Technology personnel is to use two Senior Aerospace Metal Bellows MB-601 vacuum pumps piped with the heads in series, and the pumps in series (Figure 1 below). This memorandum documents the ultimate vacuum testing that was performed to determine if this concept was a viable alternate vacuum pump strategy. This testing dovetails with previous pump evaluations documented in references 1 and 2.

  3. Direction of the Rational Use of Water at the Facilities for Growing Poultry

    NASA Astrophysics Data System (ADS)

    Potseluev, A. A.; Nazarov, I. V.; Porotkova, A. K.; Volovikova, N. V.

    2018-01-01

    The article notes the effect of water use in the technological process of automatic watering of agricultural poultry on the quality and quantity of outputs. It also discusses the requirements for the quality of the water used, the regimes of its consumption by the poultry, and the role of mechanization of the automatic watering process in the rational use of the water resource and in the processing and reuse of contaminated wastes. Within the framework of this concept, we propose constructive technological solutions for systems and means of automatic watering of agricultural poultry that provide for the rational use of water as an important element of the vital activity of agricultural poultry.

  4. Using concepts from biology to improve problem-solving methods

    NASA Astrophysics Data System (ADS)

    Goodman, Erik D.; Rothwell, Edward J.; Averill, Ronald C.

    2011-06-01

    Observing nature has been a cornerstone of engineering design. Today, engineers look not only at finished products, but imitate the evolutionary process by which highly optimized artifacts have appeared in nature. Evolutionary computation began by capturing only the simplest ideas of evolution, but today, researchers study natural evolution and incorporate an increasing number of concepts in order to evolve solutions to complex engineering problems. At the new BEACON Center for the Study of Evolution in Action, studies in the lab and field and in silico are laying the groundwork for new tools for evolutionary engineering design. This paper, which accompanies a keynote address, describes various steps in development and application of evolutionary computation, particularly as regards sensor design, and sets the stage for future advances.
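
    To make the "simplest ideas of evolution" concrete, the sketch below shows a bare-bones genetic algorithm (selection, crossover, mutation) on a toy objective. It is an illustration only; the objective, operators, and parameters are assumed and do not represent the BEACON work.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(pop):
    """Toy merit function standing in for, e.g., a sensor-design objective."""
    return -np.sum((pop - 0.3) ** 2, axis=1)        # maximum at genes = 0.3

pop = rng.uniform(0.0, 1.0, size=(40, 5))           # 40 candidate designs, 5 genes
for generation in range(100):
    f = fitness(pop)
    # tournament selection: keep the better of random pairs
    i, j = rng.integers(0, len(pop), (2, len(pop)))
    parents = np.where((f[i] > f[j])[:, None], pop[i], pop[j])
    # one-point crossover between consecutive parents
    cut = rng.integers(1, pop.shape[1], len(pop))
    mask = np.arange(pop.shape[1]) < cut[:, None]
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    # Gaussian mutation, clipped back into the design box
    pop = np.clip(children + 0.02 * rng.standard_normal(children.shape), 0.0, 1.0)

best = pop[np.argmax(fitness(pop))]
```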

  5. Combustion of liquid paint wastes in fluidized bed boiler as element of waste management system in the paint factory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soko, W.A.; Biaecka, B.

    1998-12-31

    In this paper, a solution to waste problems in the paint industry is presented by describing the combustion of liquid paint wastes in a fluidized bed boiler as part of the waste management system of a paint factory. Based on the Cleaner Production idea and the concept of integrating the design process with the future operation of equipment, some modifications of the factory's waste management scheme are discussed with the aim of reducing the quantity of toxic wastes. To verify this concept, combustion tests of paint production wastes and co-combustion tests of paint wastes with coal in an adapted industrial boiler were performed. Results of these tests are presented in the paper.

  6. Contributions of experimental protobiogenesis to the theory of evolution

    NASA Technical Reports Server (NTRS)

    Fox, S. W.

    1976-01-01

    Inferences from experiments in protobiogenesis are examined as a forward extension of the theory of evolutionary biology. A nondiscontinuous, intraconsistent theory of general evolution embracing both protobiology and biology is outlined. This overview emphasizes Darwinian selection in the later stages of evolution, and stereochemical molecular selection in some of its earlier stages. It incorporates the concept of limitation of the scope of evolution by internal constraints on variation, based on the argument that internally limiting constraints observed in experiments with molecules are operative in organisms, if chemical processes occur within biological processes and biological processes are assumed to be exponentializations of chemical processes. Major evolutionary events might have occurred by rapid self-assembly processes analogous to those observed in the formation of phase-separated microspheres from amorphous powder or supersaturated solutions.

  7. Design and Analysis of a Formation Flying System for the Cross-Scale Mission Concept

    NASA Technical Reports Server (NTRS)

    Cornara, Stefania; Bastante, Juan C.; Jubineau, Franck

    2007-01-01

    The ESA-funded "Cross-Scale" Technology Reference Study has been carried out with the primary aim of identifying and analysing a mission concept for the investigation of fundamental space plasma processes that involve dynamical non-linear coupling across multiple length scales. To fulfill this scientific mission goal, a constellation of spacecraft is required, flying in loose formations around the Earth and sampling three characteristic plasma scale distances simultaneously, with at least two satellites per scale: electron kinetic (10 km), ion kinetic (100-2000 km), magnetospheric fluid (3000-15000 km). The key Cross-Scale mission drivers identified are the number of S/C, the space segment configuration, the reference orbit design, the transfer and deployment strategy, the inter-satellite localization and synchronization process and the mission operations. This paper presents a comprehensive overview of the mission design and analysis for the Cross-Scale concept and outlines a technically feasible mission architecture for a multi-dimensional investigation of space plasma phenomena. The main effort has been devoted to applying a thorough mission-level trade-off approach and to accomplishing an exhaustive analysis, so as to allow the characterization of a wide range of mission requirements and design solutions.

  8. Self-assembly concepts for multicompartment nanostructures

    NASA Astrophysics Data System (ADS)

    Gröschel, André H.; Müller, Axel H. E.

    2015-07-01

    Compartmentalization is ubiquitous to many biological and artificial systems, be it for the separate storage of incompatible matter or to isolate transport processes. Advancements in the synthesis of sequential block copolymers offer a variety of tools to replicate natural design principles with tailor-made soft matter for the precise spatial separation of functionalities on multiple length scales. Here, we review recent trends in the self-assembly of amphiphilic block copolymers to multicompartment nanostructures (MCNs) under (semi-)dilute conditions, with special emphasis on ABC triblock terpolymers. The intrinsic immiscibility of connected blocks induces short-range repulsion into discrete nano-domains stabilized by a third, soluble block or molecular additive. Polymer blocks can be synthesized from an arsenal of functional monomers directing self-assembly through packing frustration or response to various fields. The mobility in solution further allows the manipulation of self-assembly processes into specific directions by clever choice of environmental conditions. This review focuses on practical concepts that direct self-assembly into predictable nanostructures, while narrowing particle dispersity with respect to size, shape and internal morphology. The growing understanding of underlying self-assembly mechanisms expands the number of experimental concepts providing the means to target and manipulate progressively complex superstructures.

  9. The empty OR-process analysis and a new concept for flexible and modular use in minimal invasive surgery.

    PubMed

    Eckmann, Christian; Olbrich, Guenter; Shekarriz, Hodjat; Bruch, Hans-Peter

    2003-01-01

    The reproducible advantages of minimal invasive surgery have led to a worldwide spread of these techniques. Nevertheless, the increasing use of technology causes problems in the operating room (OR). The workstation environment and workflow are handicapped by a great number of isolated solutions that demand a large amount of space. The Center of Excellence in Medical Technology (CEMET) was established in 2001 as an institution for a close cooperation between users, science, and manufacturers of medical devices in the State of Schleswig-Holstein, Germany. The future OR, as a major project, began with a detailed process analysis, which disclosed a large number of medical devices with different interfaces and poor standardisation as main problems. Smaller and more flexible devices are necessary, as well as functional modules located outside the OR. Only actuators should be positioned near the operation area. The future OR should include a flexible-room concept and less equipment than is in use currently. A uniform human-user interface is needed to control the OR environment. This article addresses the need for a clear workspace environment, intelligent-user interfaces, and flexible-room concept to improve the potentials in use of minimal invasive surgery.

  10. Exact least squares adaptive beamforming using an orthogonalization network

    NASA Astrophysics Data System (ADS)

    Yuen, Stanley M.

    1991-03-01

    The pros and cons of various classical and state-of-the-art methods in adaptive array processing are discussed, and the relevant concepts and historical developments are pointed out. A set of easy-to-understand equations for facilitating derivation of any least-squares-based algorithm is derived. Using this set of equations and incorporating all of the useful properties associated with various techniques, an efficient solution to the real-time adaptive beamforming problem is developed.
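
    A least-squares beamformer obtained through an explicit orthogonalization step can be sketched as follows: the snapshot matrix is QR-factorized (the orthogonalization), and the weights minimizing the residual against a reference signal follow by back-substitution. The array geometry, signal model, and noise level below are hypothetical and serve only to show the structure of the computation, not the paper's network.

```python
import numpy as np

rng = np.random.default_rng(1)

# simulated array snapshots: 8 sensors, 200 snapshots, one desired signal at 20 deg
n_sensors, n_snap = 8, 200
steering = np.exp(1j * np.pi * np.arange(n_sensors) * np.sin(np.deg2rad(20)))
desired = rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap)
X = np.outer(desired, steering) + 0.3 * (
    rng.standard_normal((n_snap, n_sensors)) + 1j * rng.standard_normal((n_snap, n_sensors))
)

# exact least-squares weights via an orthogonalization (QR) step:
# minimize || X w - desired || over w, using the Q/R factors of X
Q, R = np.linalg.qr(X)
w = np.linalg.solve(R, Q.conj().T @ desired)

beam_output = X @ w          # beamformer output approximating the desired signal
```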

  11. Roll-to-Roll printed large-area all-polymer solar cells with 5% efficiency based on a low crystallinity conjugated polymer blend

    NASA Astrophysics Data System (ADS)

    Gu, Xiaodan; Zhou, Yan; Gu, Kevin; Kurosawa, Tadanori; Yan, Hongping; Wang, Cheng; Toney, Micheal; Bao, Zhenan

    The challenge of continuous printing in high efficiency large-area organic solar cells is a key limiting factor for their widespread adoption. We present a materials design concept for achieving large-area, solution coated all-polymer bulk heterojunction (BHJ) solar cells with stable phase separation morphology between the donor and acceptor. The key concept lies in inhibiting strong crystallization of donor and acceptor polymers, thus forming intermixed, low crystallinity and mostly amorphous blends. Based on experiments using donors and acceptors with different degree of crystallinity, our results showed that microphase separated donor and acceptor domain sizes are inversely proportional to the crystallinity of the conjugated polymers. This methodology of using low crystallinity donors and acceptors has the added benefit of forming a consistent and robust morphology that is insensitive to different processing conditions, allowing one to easily scale up the printing process from a small scale solution shearing coater to a large-scale continuous roll-to-roll (R2R) printer. We were able to continuously roll-to-roll slot die print large area all-polymer solar cells with power conversion efficiencies of 5%, with combined cell area up to 10 cm2. This is among the highest efficiencies realized with R2R coated active layer organic materials on flexible substrate. DOE BRIDGE sunshot program. Office of Naval Research.

  12. Roll-to-Roll Printed Large-Area All-Polymer Solar Cells with 5% Efficiency Based on a Low Crystallinity Conjugated Polymer Blend

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gu, Xiaodan; Zhou, Yan; Gu, Kevin

    The challenge of continuous printing in high-efficiency large-area organic solar cells is a key limiting factor for their widespread adoption. We present a materials design concept for achieving large-area, solution-coated all-polymer bulk heterojunction solar cells with stable phase separation morphology between the donor and acceptor. The key concept lies in inhibiting strong crystallization of donor and acceptor polymers, thus forming intermixed, low crystallinity, and mostly amorphous blends. Based on experiments using donors and acceptors with different degree of crystallinity, the results show that microphase separated donor and acceptor domain sizes are inversely proportional to the crystallinity of the conjugated polymers. This particular methodology of using low crystallinity donors and acceptors has the added benefit of forming a consistent and robust morphology that is insensitive to different processing conditions, allowing one to easily scale up the printing process from a small-scale solution shearing coater to a large-scale continuous roll-to-roll (R2R) printer. Large-area all-polymer solar cells are continuously roll-to-roll slot die printed with power conversion efficiencies of 5%, with combined cell area up to 10 cm2. This is among the highest efficiencies realized with R2R-coated active layer organic materials on flexible substrate.

  13. Roll-to-Roll Printed Large-Area All-Polymer Solar Cells with 5% Efficiency Based on a Low Crystallinity Conjugated Polymer Blend

    DOE PAGES

    Gu, Xiaodan; Zhou, Yan; Gu, Kevin; ...

    2017-03-07

    The challenge of continuous printing in high-efficiency large-area organic solar cells is a key limiting factor for their widespread adoption. We present a materials design concept for achieving large-area, solution-coated all-polymer bulk heterojunction solar cells with stable phase separation morphology between the donor and acceptor. The key concept lies in inhibiting strong crystallization of donor and acceptor polymers, thus forming intermixed, low crystallinity, and mostly amorphous blends. Based on experiments using donors and acceptors with different degree of crystallinity, the results show that microphase separated donor and acceptor domain sizes are inversely proportional to the crystallinity of the conjugated polymers. This particular methodology of using low crystallinity donors and acceptors has the added benefit of forming a consistent and robust morphology that is insensitive to different processing conditions, allowing one to easily scale up the printing process from a small-scale solution shearing coater to a large-scale continuous roll-to-roll (R2R) printer. Large-area all-polymer solar cells are continuously roll-to-roll slot die printed with power conversion efficiencies of 5%, with combined cell area up to 10 cm2. This is among the highest efficiencies realized with R2R-coated active layer organic materials on flexible substrate.

  14. Flow-Directed Crystallization for Printed Electronics.

    PubMed

    Qu, Ge; Kwok, Justin J; Diao, Ying

    2016-12-20

    The solution printability of organic semiconductors (OSCs) represents a distinct advantage for materials processing, enabling low-cost, high-throughput, and energy-efficient manufacturing with new form factors that are flexible, stretchable, and transparent. While the electronic performance of OSCs is not comparable to that of crystalline silicon, the solution processability of OSCs allows them to complement silicon by tackling challenging aspects for conventional photolithography, such as large-area electronics manufacturing. Despite this, controlling the highly nonequilibrium morphology evolution during OSC printing remains a challenge, hindering the achievement of high electronic device performance and the elucidation of structure-property relationships. Many elegant morphological control methodologies have been developed in recent years including molecular design and novel processing approaches, but few have utilized fluid flow to control morphology in OSC thin films. In this Account, we discuss flow-directed crystallization as an effective strategy for controlling the crystallization kinetics during printing of small molecule and polymer semiconductors. Introducing the concept of flow-directed crystallization to the field of printed electronics is inspired by recent advances in pharmaceutical manufacturing and flow processing of flexible-chain polymers. Although flow-induced crystallization is well studied in these areas, previous findings may not apply directly to the field of printed electronics where the molecular structures (i.e., rigid π-conjugated backbone decorated with flexible side chains) and the intermolecular interactions (i.e., π-π interactions, quadrupole interactions) of OSCs differ substantially from those of pharmaceuticals or flexible-chain polymers. Another critical difference is the important role of solvent evaporation in open systems, which defines the flow characteristics and determines the crystallization kinetics and pathways. In other words, flow-induced crystallization is intimately coupled with the mass transport processes driven by solvent evaporation during printing. In this Account, we will highlight these distinctions of flow-directed crystallization for printed electronics. In the context of solution printing of OSCs, the key issue that flow-directed crystallization addresses is the kinetics mismatch between crystallization and various transport processes during printing. We show that engineering fluid flows can tune the kinetics of OSC crystallization by expediting the nucleation and crystal growth processes, significantly enhancing thin film morphology and device performance. For small molecule semiconductors, nucleation can be enhanced and patterned by directing the evaporative flux via contact line engineering, and defective crystal growth can be alleviated by enhancing mass transport to yield significantly improved coherence length and reduced grain boundaries. For conjugated polymers, extensional and shear flow can expedite nucleation through flow-induced conformation change, facilitating the control of microphase separation, degree of crystallinity, domain alignment, and percolation. Although the nascent concept of flow-directed solution printing has not yet been widely adopted in the field of printed electronics, we anticipate that it can serve as a platform technology in the near future for improving device performance and for systematically tuning thin film morphology to construct structure-property relationships. 
From a fundamental perspective, it is imperative to develop a better understanding of the effects of fluid flow and mass transport on OSC crystallization as these processes are ubiquitous across all solution processing techniques and can critically impact charge transport properties.

  15. Critical time scales for advection-diffusion-reaction processes.

    PubMed

    Ellery, Adam J; Simpson, Matthew J; McCue, Scott W; Baker, Ruth E

    2012-04-01

    The concept of local accumulation time (LAT) was introduced by Berezhkovskii and co-workers to give a finite measure of the time required for the transient solution of a reaction-diffusion equation to approach the steady-state solution [A. M. Berezhkovskii, C. Sample, and S. Y. Shvartsman, Biophys. J. 99, L59 (2010); A. M. Berezhkovskii, C. Sample, and S. Y. Shvartsman, Phys. Rev. E 83, 051906 (2011)]. Such a measure is referred to as a critical time. Here, we show that LAT is, in fact, identical to the concept of mean action time (MAT) that was first introduced by McNabb [A. McNabb and G. C. Wake, IMA J. Appl. Math. 47, 193 (1991)]. Although McNabb's initial argument was motivated by considering the mean particle lifetime (MPLT) for a linear death process, he applied the ideas to study diffusion. We extend the work of these authors by deriving expressions for the MAT for a general one-dimensional linear advection-diffusion-reaction problem. Using a combination of continuum and discrete approaches, we show that MAT and MPLT are equivalent for certain uniform-to-uniform transitions; these results provide a practical interpretation for MAT by directly linking the stochastic microscopic processes to a meaningful macroscopic time scale. We find that for more general transitions, the equivalence between MAT and MPLT does not hold. Unlike other critical time definitions, we show that it is possible to evaluate the MAT without solving the underlying partial differential equation (pde). This makes MAT a simple and attractive quantity for practical situations. Finally, our work explores the accuracy of certain approximations derived using MAT, showing that useful approximations for nonlinear kinetic processes can be obtained, again without treating the governing pde directly.
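
    For reference, a common way to write the quantity discussed above (a sketch, in terms of the fractional-approach function F and assuming a well-defined steady state) is:

```latex
% Mean action time (MAT) at position x: F measures the fractional approach
% of the transient solution c(x,t) to the steady state c_\infty(x),
% starting from the initial condition c_0(x).
\begin{align}
F(x,t) &= 1 - \frac{c(x,t) - c_\infty(x)}{c_0(x) - c_\infty(x)}, \\
T(x)   &= \int_0^\infty t \,\frac{\partial F}{\partial t}\,\mathrm{d}t
        = \int_0^\infty \bigl[\,1 - F(x,t)\,\bigr]\,\mathrm{d}t .
\end{align}
```

    The second equality follows by integration by parts, provided t[1 - F(x,t)] vanishes as t grows large, which holds for exponentially decaying transients.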

  16. [Barriers to Digitalisation of Healthcare in Germany: A Survey of Experts].

    PubMed

    Nohl-Deryk, Pascal; Brinkmann, Jesaja Kenneth; Gerlach, Ferdinand Michael; Schreyögg, Jonas; Achelrod, Dmitrij

    2018-01-04

    Digital health is a growing area in healthcare with huge potential. Nevertheless, the degree of digitalization in German healthcare is low when compared internationally and with other German industries. Despite political efforts, certain barriers seem to strongly impede the process of digitalization in healthcare. We surveyed 18 representative healthcare experts from various sectors with semi-structured interviews on barriers and solutions for digital health. Thematic analysis by Braun and Clarke was used for interpretation. The interviewees identified both stakeholder-specific and cross-stakeholder barriers. Self-regulatory bodies and the medical profession were found to lack willingness and organizational structure for digitalization. Lack of evidence and missing interoperability represented primary obstacles, while current legislation and financial regulations were rarely mentioned. In particular, infrastructure expansion and interoperability would require coordinated state intervention. Positive communication on the possibilities and benefits of digital solutions was also considered important. A strong political will and an overarching strategy accompanied by a communication concept seem to be necessary in order for digital health to succeed. Regarding legislation, binding specifications, deadlines and sanctions may be needed for self-regulatory bodies, while also involving users in the development process at an early stage and creating positive incentives for using digital solutions. © Georg Thieme Verlag KG Stuttgart · New York.

  17. All solution-processed micro-structured flexible electrodes for low-cost light-emitting pressure sensors fabrication.

    PubMed

    Shimotsu, Rie; Takumi, Takahiro; Vohra, Varun

    2017-07-31

    Recent studies have demonstrated the advantage of developing pressure-sensitive devices with light-emitting properties for direct visualization of pressure distribution, with potential applications to next-generation touch panels and human-machine interfaces. To ensure that this technology is available to everyone, its production cost should be kept as low as possible. Here, simple device concepts, namely pressure-sensitive flexible hybrid electrodes and an OLED architecture, are used to produce low-cost resistive or light-emitting pressure sensors. Additionally, integrating solution-processed self-assembled micro-structures into the flexible hybrid electrodes composed of an elastomer and conductive materials results in enhanced device performance in terms of either pressure or spatial distribution sensitivity. For instance, based on the pressure applied, the measured resistances of the pressure sensors range from a few MΩ down to 500 Ω. On the other hand, unlike their evaporated equivalents, the combination of solution-processed flexible electrodes with an inverted OLED architecture displays bright green emission when a pressure over 200 kPa is applied. At a bias of 3 V, their luminance can be tuned by applying a higher pressure of 500 kPa. Consequently, features such as fingernails and fingertips can be clearly distinguished from one another in these long-lasting low-cost devices.

  18. Analysis and Test Correlation of Proof of Concept Box for Blended Wing Body-Low Speed Vehicle

    NASA Technical Reports Server (NTRS)

    Spellman, Regina L.

    2003-01-01

    The Low Speed Vehicle (LSV) is a 14.2% scale remotely piloted vehicle of the revolutionary Blended Wing Body concept. The design of the LSV includes an all composite airframe. Due to internal manufacturing capability restrictions, room temperature layups were necessary. An extensive materials testing and manufacturing process development effort was undertaken to establish a process that would achieve the high modulus/low weight properties required to meet the design requirements. The analysis process involved a loads development effort that incorporated aero loads to determine internal forces that could be applied to a traditional FEM of the vehicle and to conduct detailed component analyses. A new tool, Hypersizer, was added to the design process to address various composite failure modes and to optimize the skin panel thickness of the upper and lower skins for the vehicle. The analysis required an iterative approach as material properties were continually changing. As a part of the material characterization effort, test articles, including a proof of concept wing box and a full-scale wing, were fabricated. The proof of concept box was fabricated based on very preliminary material studies and tested in bending, torsion, and shear. The box was then tested to failure under shear. The proof of concept box was also analyzed using Nastran and Hypersizer. The results of both analyses were scaled to determine the predicted failure load. The test results were compared to both the Nastran and Hypersizer analytical predictions. The actual failure occurred at 899 lbs. The failure was predicted at 1167 lbs based on the Nastran analysis. The Hypersizer analysis predicted a lower failure load of 960 lbs. The Nastran analysis alone was not sufficient to predict the failure load because it does not identify local composite failure modes. This analysis has traditionally been done using closed form solutions. Although Hypersizer is typically used as an optimizer for the design process, the failure prediction was used to help gain acceptance and confidence in this new tool. The correlated models and process were to be used to analyze the full BWB-LSV airframe design. The analysis and correlation with test results of the proof of concept box are presented here, including the comparison of the Nastran and Hypersizer results.

  19. Retardation of mobile radionuclides in granitic rock fractures by matrix diffusion

    NASA Astrophysics Data System (ADS)

    Hölttä, P.; Poteri, A.; Siitari-Kauppi, M.; Huittinen, N.

    Transport of iodide and sodium has been studied by means of block fracture and core column experiments to evaluate the simplified radionuclide transport concept. The objectives were to examine the processes causing retention in solute transport, especially matrix diffusion, and to estimate their importance during transport at different scales and flow conditions. Block experiments were performed using a Kuru Grey granite block having a horizontally planar natural fracture. Core columns were constructed from cores drilled orthogonal to the fracture of the granite block. Several tracer tests were performed using uranine, 131I and 22Na as tracers at water flow rates of 0.7-50 μL min-1. Transport of tracers was modelled by applying the advection-dispersion model based on generalized Taylor dispersion, augmented with matrix diffusion. Scoping calculations were combined with experiments to test the model concepts. Two different experimental configurations could be modelled by applying consistent transport processes and parameters. The processes, advection-dispersion and matrix diffusion, were conceptualized with sufficient accuracy to replicate the experimental results. The effects of matrix diffusion were demonstrated on the slightly sorbing sodium and mobile iodide breakthrough curves.

  20. Advances in microfluidics for drug discovery.

    PubMed

    Lombardi, Dario; Dittrich, Petra S

    2010-11-01

    Microfluidics is considered an enabling technology for the development of unconventional and innovative methods in the drug discovery process. The concept of micrometer-sized reaction systems in the form of continuous flow reactors, microdroplets or microchambers is intriguing, and the versatility of the technology fits well with the requirements of drug synthesis, drug screening and drug testing. In this review article, we introduce key microfluidic approaches to the drug discovery process, highlighting the latest and most promising achievements in this field, mainly from the years 2007-2010. Despite high expectations of microfluidic approaches to several stages of the drug discovery process, microfluidic technology has so far not been able to significantly replace conventional drug discovery platforms. Our aim is to identify the bottlenecks that have impeded the transfer of microfluidics into routine platforms for drug discovery and to show some recent solutions to overcome these hurdles. Although most microfluidic approaches are still applied only in proof-of-concept studies, creative microfluidic research in the past years has demonstrated unprecedented capabilities of microdevices, and generally applicable, robust and reliable microfluidic platforms seem to be within reach.

  1. A fully vectorized numerical solution of the incompressible Navier-Stokes equations. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Patel, N.

    1983-01-01

    A vectorizable algorithm is presented for the implicit finite difference solution of the incompressible Navier-Stokes equations in general curvilinear coordinates. The unsteady Reynolds-averaged Navier-Stokes equations are solved in two dimensions and in non-conservative primitive-variable form. A two-layer algebraic eddy viscosity turbulence model is used to incorporate the effects of turbulence. Two momentum equations and a Poisson pressure equation, which is obtained by taking the divergence of the momentum equations and satisfying the continuity equation, are solved simultaneously at each time step. An elliptic grid generation approach is used to generate a boundary-conforming coordinate system about an airfoil. The governing equations are expressed in terms of the curvilinear coordinates and are solved on a uniform rectangular computational domain. A checkerboard SOR scheme, which can effectively utilize the computer architectural concept of vector processing, is used for the iterative solution of the governing equations.
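    The appeal of the checkerboard (red-black) ordering is that grid points of one color have only opposite-color neighbours, so each half-sweep is a fully vectorizable array operation. The short NumPy sketch below applies the idea to a model Poisson problem on a uniform Cartesian grid; it is only an illustration of the ordering, not the thesis's curvilinear-coordinate solver, and the grid size and relaxation factor are arbitrary.

      import numpy as np

      def redblack_sor(f, h, omega=1.8, sweeps=500):
          """Red-black SOR for the Poisson equation lap(u) = f on the unit square
          with u = 0 on the boundary; each color is updated as one masked array op."""
          n = f.shape[0]
          u = np.zeros_like(f)
          ii, jj = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
          interior = (ii > 0) & (ii < n - 1) & (jj > 0) & (jj < n - 1)
          for _ in range(sweeps):
              for color in (0, 1):                      # 0 = "red" points, 1 = "black"
                  mask = interior & ((ii + jj) % 2 == color)
                  nbrs = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                          np.roll(u, 1, 1) + np.roll(u, -1, 1))
                  gauss_seidel = 0.25 * (nbrs - h * h * f)
                  u[mask] += omega * (gauss_seidel[mask] - u[mask])
          return u

      n = 33
      u = redblack_sor(np.ones((n, n)), h=1.0 / (n - 1))
      print(f"centre value of the discrete solution: {u[n // 2, n // 2]:.5f}")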

  2. Flow, Transport, and Reaction in Porous Media: Percolation Scaling, Critical-Path Analysis, and Effective Medium Approximation

    NASA Astrophysics Data System (ADS)

    Hunt, Allen G.; Sahimi, Muhammad

    2017-12-01

    We describe the most important developments in the application of three theoretical tools to modeling the morphology of porous media and the flow and transport processes in them. One tool is percolation theory. Although it was over 40 years ago that the possibility of using percolation theory to describe flow and transport processes in porous media was first raised, new models and concepts, as well as new variants of the original percolation model, are still being developed for various applications to flow phenomena in porous media. The other two approaches, closely related to percolation theory, are critical-path analysis, which is applicable when porous media are highly heterogeneous, and the effective medium approximation, the "poor man's percolation", which provides a simple and, under certain conditions, quantitatively correct description of transport in porous media in which percolation-type disorder is relevant. Applications to topics in the geosciences include predictions of hydraulic conductivity and air permeability, of solute and gas diffusion, which are particularly important in ecohydrological applications and land-surface interactions, and of multiphase flow in porous media, as well as non-Gaussian solute transport and flow morphologies associated with imbibition into unsaturated fractures. We describe new applications of percolation theory of solute transport to chemical weathering and soil formation, geomorphology, and elemental cycling through the terrestrial Earth surface. Wherever quantitatively accurate predictions of such quantities are relevant, so are the techniques presented here. Whenever possible, the theoretical predictions are compared with the relevant experimental data. In practically all the cases, the agreement between the theoretical predictions and the data is excellent. Also discussed are possible future directions in the application of such concepts to many other phenomena in geosciences.
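    To make two of these tools concrete, the sketch below estimates the controlling conductance of a broadly distributed bond-conductance network in both ways: critical-path analysis keeps bonds from the largest conductance downward until the retained fraction reaches the percolation threshold, and the Kirkpatrick effective medium approximation solves its usual self-consistency condition. The distribution, threshold and coordination number are illustrative choices of ours, not values from the paper.

      import numpy as np
      from scipy.optimize import brentq

      rng = np.random.default_rng(0)
      g = rng.lognormal(mean=0.0, sigma=3.0, size=200_000)   # broad conductance distribution

      # Critical-path analysis: the conductance reached when the retained bond
      # fraction equals the bond percolation threshold controls the permeability.
      p_c = 0.2488                                           # simple cubic bond threshold
      g_crit = np.quantile(g, 1.0 - p_c)

      # Kirkpatrick effective medium approximation for coordination number z:
      # solve <(g - g_m) / (g + (z/2 - 1) g_m)> = 0 for the effective conductance g_m.
      z = 6.0
      residual = lambda gm: np.mean((g - gm) / (g + (z / 2.0 - 1.0) * gm))
      g_ema = brentq(residual, g.min(), g.max())

      print(f"critical-path estimate g_c ~ {g_crit:.3g}, effective-medium estimate g_m ~ {g_ema:.3g}")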

  3. Magnetic levitation assisted aircraft take-off and landing (feasibility study - GABRIEL concept)

    NASA Astrophysics Data System (ADS)

    Rohacs, Daniel; Rohacs, Jozsef

    2016-08-01

    The Technology Roadmap 2013 developed by the International Air Transport Association envisions the option of flying without an undercarriage to be in operation by 2032. Preliminary investigations clearly indicate that magnetic levitation technology (MagLev) might be an appealing solution to assist aircraft take-off and landing. The EU-supported research project, abbreviated as GABRIEL, dealt with (i) the concept development, (ii) the identification, evaluation and selection of the deployable magnetic levitation technology, (iii) the definition of the core system elements (including the required aircraft modifications, the ground-based system and airport elements, and the rendezvous control system), (iv) the analysis of the safety and security aspects, (v) the concept validation and (vi) the estimation of the impact of the proposed concept in terms of aircraft weight, noise, emissions, and cost-benefit. All results introduced here are compared to a medium-size hypothetical passenger aircraft (identical to an Airbus A320). This paper gives a systematic overview of (i) the applied methods, (ii) the investigation of the possible use of magnetic levitation technology to assist the commercial aircraft take-off and landing processes and (iii) the demonstrations and validations showing the feasibility of the radically new concept. All major results are outlined.

  4. Colors, humors and evil eye: indigenous classification and treatment of childhood diarrhea in highland Guatemala.

    PubMed

    Burleigh, E; Dardano, C; Cruz, J R

    1990-11-01

    Focal group interviews on indigenous perceptions and reported management of childhood diarrhea were conducted in 1987-88 in Guatemala as a part of a prospective epidemiological field study of chronic diarrhea. Six cognitive schemata were identified, each with specific causes, a linked progression of concepts, symptoms, signs, and diagnostic characteristics. Nearly all were related to the humoral theory of disease, including the concept of evil eye. Diarrheal disease was conceptualized in the village as a set of processes which could be either "hot" or "cold" rather than as an unchanging single-symptom entity occupying only one spot on the humoral continuum. Clarification of the temporal relationship between concepts was found to be essential to the understanding of these indigenously-defined schemata. Stool color reflecting humoral theory was the primary concept used in household-level diagnosis. Reported behavior associated with these cognitive schemata (traditional treatments, pharmaceutical and dietary management) showed remarkable constancy, and adhered for the most part to the humoral concept of equilibrium. These included the use of oral rehydration solutions (ORS) and liquids. The applied importance of humoral theory to home-based use of ORS is discussed briefly as is the indigenous definition of dehydration.

  5. Unified Software Solution for Efficient SPR Data Analysis in Drug Research

    PubMed Central

    Dahl, Göran; Steigele, Stephan; Hillertz, Per; Tigerström, Anna; Egnéus, Anders; Mehrle, Alexander; Ginkel, Martin; Edfeldt, Fredrik; Holdgate, Geoff; O’Connell, Nichole; Kappler, Bernd; Brodte, Annette; Rawlins, Philip B.; Davies, Gareth; Westberg, Eva-Lotta; Folmer, Rutger H. A.; Heyse, Stephan

    2016-01-01

    Surface plasmon resonance (SPR) is a powerful method for obtaining detailed molecular interaction parameters. Modern instrumentation with its increased throughput has enabled routine screening by SPR in hit-to-lead and lead optimization programs, and SPR has become a mainstream drug discovery technology. However, the processing and reporting of SPR data in drug discovery are typically performed manually, which is both time-consuming and tedious. Here, we present the workflow concept, design and experiences with a software module relying on a single, browser-based software platform for the processing, analysis, and reporting of SPR data. The efficiency of this concept lies in the immediate availability of end results: data are processed and analyzed upon loading the raw data file, allowing the user to immediately quality control the results. Once completed, the user can automatically report those results to data repositories for corporate access and quickly generate printed reports or documents. The software module has resulted in a very efficient and effective workflow through saved time and improved quality control. We discuss these benefits and show how this process defines a new benchmark in the drug discovery industry for the handling, interpretation, visualization, and sharing of SPR data. PMID:27789754
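    The per-sensorgram end results such a pipeline reports typically come from a 1:1 interaction model; the sketch below generates an idealised association/dissociation trace with that textbook model and reports the derived affinity. It is a generic illustration with hypothetical rate constants, not the platform's actual fitting code.

      import numpy as np

      def sensorgram_1to1(t, t_inj, conc, kon, koff, rmax):
          """1:1 Langmuir binding: association while analyte flows (t <= t_inj),
          exponential dissociation afterwards."""
          req = rmax * conc / (conc + koff / kon)       # steady-state response
          kobs = kon * conc + koff
          assoc = req * (1.0 - np.exp(-kobs * t))
          r_end = req * (1.0 - np.exp(-kobs * t_inj))
          dissoc = r_end * np.exp(-koff * (t - t_inj))
          return np.where(t <= t_inj, assoc, dissoc)

      kon, koff = 1.0e5, 1.0e-3                         # 1/(M s), 1/s (hypothetical)
      t = np.linspace(0.0, 600.0, 601)                  # s
      r = sensorgram_1to1(t, t_inj=300.0, conc=50e-9, kon=kon, koff=koff, rmax=100.0)
      print(f"KD = koff/kon = {koff / kon:.1e} M, response at end of injection = {r[300]:.1f} RU")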

  6. A novel physical eco-hydrological model concept for preferential flow based on experimental applications.

    NASA Astrophysics Data System (ADS)

    Jackisch, Conrad; van Schaik, Loes; Graeff, Thomas; Zehe, Erwin

    2014-05-01

    Preferential flow through macropores often determines hydrological characteristics, especially regarding runoff generation and fast transport of solutes. Macropore settings can nevertheless be very different in nature and dynamics, depending on their origin. While biogenic structures follow activity cycles (e.g. earthworms) and population conditions (e.g. roots), pedogenic and geogenic structures may depend on water stress (e.g. cracks) or large events (e.g. flushed voids between skeleton and soil pipes) or may simply persist (e.g. the bedrock interface). On the one hand, such dynamic site characteristics can be observed in seasonal changes in a site's reaction to precipitation. On the other hand, sprinkling experiments accompanied by tracers or time-lapse 3D ground-penetrating radar are suitable tools to determine infiltration patterns and macropore configuration. However, model representation of the macropore-matrix system is still problematic, because models either rely on effective parameters (assuming a well-mixed state) or on explicit advection that strongly simplifies or neglects interaction with the diffusive flow domain. Motivated by the dynamic nature of macropores, we present a novel model approach for interacting diffusive and advective transport of water, solutes and energy in structured soils. It relies solely on scale- and process-aware observables. A representative set of macropores (data from sprinkling experiments) determines the process model scale through 1D advective domains. These are connected to a 2D matrix domain defined by pedo-physical retention properties. Water is represented as particles. Diffusive flow is governed by a 2D random walk of these particles, while advection may take place in the macropore domain. Macropore-matrix interaction is computed as dissipation of the advective momentum of a particle by the drag it experiences from the matrix domain. Through a representation of matrix and macropores as connected diffusive and advective domains for water transport, we open up double-domain concepts linking pore-scale physics to preferential macro-scale fingerprints without effective parameterisation or mixing assumptions. Moreover, solute transport, energy balance aspects and lateral heterogeneity in soil moisture distribution are intrinsically captured. In addition, macropore and matrix domain settings may change over time based on physical and stochastic observations. The representativity concept allows scalability from the plot scale to the lower mesoscale.
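    A minimal sketch of the particle idea, under simplifying assumptions of our own (uniform matrix diffusivity, a single vertical macropore band with constant advective velocity, and no momentum-dissipation term): matrix water is moved by a 2D random walk, while particles that currently sit inside the macropore band are additionally advected downward.

      import numpy as np

      rng = np.random.default_rng(1)
      n_particles, n_steps, dt = 5_000, 200, 10.0       # particles, steps, s
      D = 1.0e-7                                        # matrix diffusivity, m^2/s (assumed)
      v_macro = 5.0e-4                                  # macropore advection velocity, m/s (assumed)
      macro_x = (0.095, 0.105)                          # horizontal extent of the macropore band, m

      # particle positions: columns are (x, depth); all particles start at the surface
      pos = np.column_stack([rng.uniform(0.0, 0.2, n_particles), np.zeros(n_particles)])

      for _ in range(n_steps):
          # diffusive transport in the matrix domain as a 2D random walk
          pos += rng.normal(0.0, np.sqrt(2.0 * D * dt), size=pos.shape)
          # particles inside the macropore band are advected downward as well
          in_macro = (pos[:, 0] > macro_x[0]) & (pos[:, 0] < macro_x[1])
          pos[in_macro, 1] += v_macro * dt

      print(f"mean particle depth {pos[:, 1].mean():.3f} m, maximum {pos[:, 1].max():.3f} m")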

  7. Product Lifecycle Management and the Quest for Sustainable Space Exploration Solutions

    NASA Technical Reports Server (NTRS)

    Caruso, Pamela W.; Dumbacher, Daniel L.; Grieves, Michael

    2011-01-01

    Product Lifecycle Management (PLM) is an outcome of lean thinking to eliminate waste and increase productivity. PLM is inextricably tied to the systems engineering business philosophy, coupled with a methodology by which personnel, processes and practices, and information technology combine to form an architecture platform for product design, development, manufacturing, operations, and decommissioning. In this model, which is being implemented by the Marshall Space Flight Center (MSFC) Engineering Directorate, total lifecycle costs are important variables for critical decision-making. With the ultimate goal to deliver quality products that meet or exceed requirements on time and within budget, PLM is a powerful concept to shape everything from engineering trade studies and testing goals, to integrated vehicle operations and retirement scenarios. This briefing will demonstrate how the MSFC Engineering Directorate is implementing PLM as part of an overall strategy to deliver safe, reliable, and affordable space exploration solutions and how that strategy aligns with the Agency and Center systems engineering policies and processes. Sustainable space exploration solutions demand that all lifecycle phases be optimized, and engineering the next generation space transportation system requires a paradigm shift such that digital tools and knowledge management, which are central elements of PLM, are used consistently to maximum effect. Adopting PLM, which has been used by the aerospace and automotive industry for many years, for spacecraft applications provides a foundation for strong, disciplined systems engineering and accountable return on investment. PLM enables better solutions using fewer resources by making lifecycle considerations in an integrative decision-making process.

  8. Life's Biological Chemistry: A Destiny or Destination Starting from Prebiotic Chemistry?

    PubMed

    Krishnamurthy, Ramanarayanan

    2018-06-05

    Research into understanding the origins -and evolution- of life has long been dominated by the concept of taking clues from extant biology and extrapolating its molecules and pathways backwards in time. This approach has also guided the search for solutions to the problem of how contemporary biomolecules would have arisen directly from prebiotic chemistry on early earth. However, the continuing difficulties in finding universally convincing solutions in connecting prebiotic chemistry to biological chemistry should give us pause, and prompt us to rethink this concept of treating extant life's chemical processes as the sole end goal and, therefore, focusing only -and implicitly- on the respective extant chemical building blocks. Rather, it may be worthwhile "to set aside the goal" and begin with what would have been plausible prebiotic reaction mixtures (which may have no obvious or direct connection to life's chemical building blocks and processes) - and allow their chemistries and interactions, under different geochemical constraints, to guide and illuminate as to what processes and systems can emerge. Such a conceptual approach gives rise to the prospect that chemistry of life-as-we-know-it is not the only result (not a "destiny"), but one that has emerged among many potential possibilities (a "destination"). This postulate, in turn, could impact the way we think about chemical signatures and criteria used in the search for alternative and extraterrestrial "life". As a bonus, we may discover the chemistries and pathways naturally that led to the emergence of life as we know it. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Parametric Evaluation of Interstellar Exploration Mission Concepts

    NASA Technical Reports Server (NTRS)

    Adams, Robert B.

    2017-01-01

    One persistent difficulty in evaluating the myriad advanced propulsion concepts proposed over the last 60 years is a true apples-to-apples comparison of the expected gain in performance. This analysis is complicated by numerous factors, including multiple missions of interest to the advanced propulsion community, the lack of a credible closed-form solution for 'medium thrust' trajectories, and the lack of detailed design data, for most proposed concepts, that would lend credibility to engine performance estimates. This paper describes a process for making fair comparisons of different propulsion concepts for multiple missions over a wide range of performance values; the figure below illustrates this process. The paper describes the process in detail and outlines the status so far in compiling the required data. Parametric data for several missions are calculated and plotted against specific power versus specific impulse scatter plots of expected propulsion system performance. The overlay between required performance, as defined by the trajectory parametrics, and expected performance, as defined in the literature for major categories of propulsion systems, clearly defines which propulsion systems are the most apt for a given mission. The application of the Buckingham Pi theorem to general parameters for interstellar exploration (mission time, mass, specific impulse, specific power, distance, propulsion source energy/mass, etc.) yields a number of dimensionless variables. The relationships of these variables can then be explored before application to a particular mission. As in the fields of fluid mechanics and heat transfer, the use of the Buckingham Pi theorem results in new variables that make apples-to-apples comparisons possible.
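    As a hedged illustration of how the Buckingham Pi reduction works for such mission parameters (a reduced variable set chosen here for clarity, not the paper's actual groups): with trip time t, trip distance d, exhaust velocity v_e (proportional to specific impulse) and specific power \alpha, only length and time appear as base dimensions, so four variables minus two dimensions leave two independent dimensionless groups, e.g.

      \Pi_1 = \frac{v_e\,t}{d}, \qquad \Pi_2 = \frac{\alpha\,t}{v_e^{2}},
      \qquad [v_e] = \mathrm{L\,T^{-1}}, \quad [\alpha] = \mathrm{L^{2}\,T^{-3}},

    so that mission comparisons can be drawn in the (\Pi_1, \Pi_2) plane rather than separately for every propulsion concept.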

  10. IT Solution concept development for tracking and analyzing the labor effectiveness of employees

    NASA Astrophysics Data System (ADS)

    Ilin, Igor; Shirokova, Svetlana; Lepekhin, Aleksandr

    2018-03-01

    Labor efficiency and productivity of employees are important aspects of the environment within any type of organization. This is a particularly crucial factor for companies whose operations involve physical labor, such as construction companies. Productivity and efficiency are both complicated concepts, and a wide variety of methods and approaches to their analysis can be implemented within an organization. It is therefore important to choose methods that not only analyze an employee's key performance indicators but also take into account personal indicators, which may affect performance even more than professional skills. For this analysis task it is important to build an IT solution for tracking and analyzing labor effectiveness. The concept for designing this IT solution is proposed in the current research.

  11. [Phenomenology of multiculturalism and intercultural pluralism].

    PubMed

    Hoyos, Guillermo

    2012-01-01

    Multiculturalism is defined as the combination, within a given territory, of a social unit and a cultural plurality by way of exchanges and communications among actors who use different categories of expression, analysis and interpretation. A multiculturalist project should not promote a society that is split up into closed groups; on the contrary, it should set forth policies based on communication and cooperation processes among the cultural communities. To understand this concept, we will present the ontological basis of this phenomenon in the search for a communicational solution, with our starting point being a phenomenological description of the way in which multiculturalism manifests to us in life; we then deepen our account of the meaning of the phenomenon, and finally offer a pluralist solution to the problems and challenges cultural differences bring about.

  12. Use of multiple cluster analysis methods to explore the validity of a community outcomes concept map.

    PubMed

    Orsi, Rebecca

    2017-02-01

    Concept mapping is now a commonly-used technique for articulating and evaluating programmatic outcomes. However, research regarding validity of knowledge and outcomes produced with concept mapping is sparse. The current study describes quantitative validity analyses using a concept mapping dataset. We sought to increase the validity of concept mapping evaluation results by running multiple cluster analysis methods and then using several metrics to choose from among solutions. We present four different clustering methods based on analyses using the R statistical software package: partitioning around medoids (PAM), fuzzy analysis (FANNY), agglomerative nesting (AGNES) and divisive analysis (DIANA). We then used the Dunn and Davies-Bouldin indices to assist in choosing a valid cluster solution for a concept mapping outcomes evaluation. We conclude that the validity of the outcomes map is high, based on the analyses described. Finally, we discuss areas for further concept mapping methods research. Copyright © 2016 Elsevier Ltd. All rights reserved.
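    The workflow described above, running several clustering algorithms and then choosing among solutions with validity indices, can be mirrored outside R as well; the Python sketch below is a scikit-learn analogue (synthetic point coordinates standing in for the concept-map statements, and only the Davies-Bouldin index, which ships with scikit-learn), not the study's own code.

      import numpy as np
      from sklearn.cluster import AgglomerativeClustering
      from sklearn.metrics import davies_bouldin_score

      rng = np.random.default_rng(42)
      # stand-in for the 2D point coordinates of concept-map statements
      X = np.vstack([rng.normal(loc=c, scale=0.5, size=(30, 2)) for c in (0, 3, 6, 9)])

      scores = {}
      for k in range(2, 9):
          labels = AgglomerativeClustering(n_clusters=k, linkage="average").fit_predict(X)
          scores[k] = davies_bouldin_score(X, labels)      # lower is better

      best_k = min(scores, key=scores.get)
      print({k: round(s, 3) for k, s in scores.items()})
      print(f"the Davies-Bouldin index favours k = {best_k}")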

  13. Study of waste management towards sustainable green campus in Universitas Gadjah Mada

    NASA Astrophysics Data System (ADS)

    Setyowati, Mega; Kusumawanto, Arif; Prasetya, Agus

    2018-05-01

    Waste management is part of the green campus achievement program. Universitas Gadjah Mada (UGM) has a Standard Operating Procedure for managing the waste it produces. Waste produced by each building or work unit is temporarily accommodated in a waste depot before being dumped into the landfill. This research aims to study the waste management system at UGM in relation to the concept of a green campus. The green campus concept of improving the efficiency of waste management needs to be supported by various parties; the success of the green campus program relies on an integrated approach and a sustainable implementation that involves the university's stakeholders. In actualizing the concept of a green campus, the university has its own waste processing system: organic waste is processed into compost, while plastic waste is converted into alternative fuel. Overall, however, the waste management system that UGM operates is ineffective and inefficient, as shown by the fact that much waste is still dumped into the landfill. UGM provides a laboratory specialized in processing the waste the university produces, which is planned to reduce the amount of waste dumped into the landfill. According to the results, vermicomposting technology, the manufacture of liquid fertilizer from leachate, and the manufacture of a composite from a mixture of leaves and paper were offered as solutions.

  14. Simulating the control of molecular reactions via modulated light fields: from gas phase to solution

    NASA Astrophysics Data System (ADS)

    Thallmair, Sebastian; Keefer, Daniel; Rott, Florian; de Vivie-Riedle, Regina

    2017-04-01

    Over the past few years quantum control has proven to be very successful in steering molecular processes. By combining theory with experiment, even highly complex control aims were realized in the gas phase. In this topical review, we illustrate the past achievements on several examples in the molecular context. The next step for the quantum control of chemical processes is to translate the fruitful interplay between theory and experiment to the condensed phase and thus to the regime where chemical synthesis can be supported. On the theory side, increased efforts to include solvent effects in quantum control simulations were made recently. We discuss two major concepts, namely an implicit description of the environment via the density matrix algorithm and an explicit inclusion of solvent molecules. By application to chemical reactions, both concepts conclude that despite environmental perturbations leading to more complex control tasks, efficient quantum control in the condensed phase is still feasible.

  15. Rigorous GNSS network solutions of unlimited size

    NASA Astrophysics Data System (ADS)

    Boomkamp, H.; Iag Working Group 1. 1. 1

    2010-12-01

    The session description states that rigorous estimation processes for millions of parameters are computationally impossible. A more accurate observation would be that such solutions exceed the capacity of current Analysis Centres by several orders of magnitude, as was already discussed during the IGS Workshop of 2004. We can, however, make processing elements that are smaller and simpler than conventional Analysis Centres, until we have a “centre” that can be replicated in arbitrary numbers, at zero cost. In practice this means that the processing element is reduced to a single, automated computer application that can run anywhere. These analysis elements are connected via the internet into a scalable grid computing scheme that can handle GNSS networks of any size. The approach is not fundamentally different from current combination solutions among a network of Analysis Centres, but it refines the granularity of the network elements in order to reduce system complexity and eliminate cost. The Dancer project of IAG Working Group 1 has developed a JXTA peer-to-peer application for this purpose. Dancer splits a conventional batch least-squares process into as many interacting subtasks as there are receivers. Each task can then run on a local PC at a permanent GNSS site, or anywhere else. All Dancer instances find the same global solution for satellite orbits, clocks and Earth rotation parameters via an efficient vector averaging method called square dancing. The hardware requirements for a single Dancer process do not exceed those of, e.g., current mobile phone applications, so that future generations of GNSS receivers may be able to run such a task as an embedded process. This leads to the concept of “smart receivers” that no longer require any post-processing infrastructure. Instead, they need an internet connection to join thousands of other smart receivers in a global network solution. The key algorithms, project status and further deployment of the Dancer system will be presented. A brief summary is also given of two follow-on projects, called Digger (distributed computing for global geodetic reprocessing) and Dart (Dancer real-time). For more details, see www.GPSdancer.com.
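    To give a flavour of the peer-to-peer averaging idea (a generic gossip-averaging sketch, not the actual "square dancing" algorithm or the Dancer code), each receiver below starts with its own local estimate of the shared parameter vector, and repeated pairwise averaging drives every copy toward the common mean that a centralised combination would produce.

      import numpy as np

      rng = np.random.default_rng(7)
      n_receivers, n_params = 50, 8

      # each receiver's local estimate of the shared parameters (orbits, clocks, ERPs, ...)
      local = rng.normal(size=(n_receivers, n_params))
      target = local.mean(axis=0)                  # the centralised combination result

      for _ in range(2_000):                       # random pairwise "gossip" exchanges
          i, j = rng.choice(n_receivers, size=2, replace=False)
          local[i] = local[j] = 0.5 * (local[i] + local[j])

      print(f"max deviation from the centralised mean: {np.abs(local - target).max():.2e}")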

  16. Solution Concepts for Distributed Decision-Making without Coordination

    NASA Technical Reports Server (NTRS)

    Beling, Peter A.; Patek, Stephen D.

    2005-01-01

    Consider a single-stage problem in which we have a group of N agents who are attempting to minimize the expected cost of their joint actions, without the benefit of communication or a pre-established protocol but with complete knowledge of the expected cost of any joint set of actions for the group. We call this situation a static coordination problem. The central issue in defining an appropriate solution concept for static coordination problems is how to deal with the fact that, if the agents are faced with a set of multiple (mixed) strategies that are equally attractive in terms of cost, a failure of coordination may lead to an expected cost value that is worse than that of any of the strategies in the set. In this proposal, we describe the notion of a general coordination problem, describe initial efforts at developing a solution concept for static coordination problems, and then outline a research agenda that centers on activities that will be the basis for obtaining a complete understanding of solutions to static coordination problems.
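    A tiny numerical illustration of the failure mode described above, with a hypothetical cost matrix of our own rather than anything from the report: two agents each choose between two actions, the two matching joint actions are equally cheap, but if each agent independently randomizes over its equally attractive options the expected cost is strictly worse than either coordinated choice.

      import numpy as np

      # joint cost for agent 1's action (rows) and agent 2's action (columns)
      cost = np.array([[0.0, 1.0],
                       [1.0, 0.0]])

      # (0,0) and (1,1) both cost 0 and are equally attractive, so without
      # communication each agent may mix 50/50 over its two actions
      p1 = np.array([0.5, 0.5])
      p2 = np.array([0.5, 0.5])
      expected_cost = p1 @ cost @ p2

      print(f"coordinated cost: {cost[0, 0]:.1f}, uncoordinated expected cost: {expected_cost:.2f}")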

  17. Flexible processing and the design of grammar.

    PubMed

    Sag, Ivan A; Wasow, Thomas

    2015-02-01

    We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This "sign-based" conception of grammar has provided precise solutions to the key problems long thought to motivate movement-based analyses, has supported three decades of computational research developing large-scale grammar implementations, and is now beginning to play a role in computational psycholinguistics research that explores the use of underspecification in the incremental computation of partial meanings.

  18. Building a common pipeline for rule-based document classification.

    PubMed

    Patterson, Olga V; Ginter, Thomas; DuVall, Scott L

    2013-01-01

    Instance-based classification of clinical text is a widely used natural language processing task employed as a step for patient classification, document retrieval, or information extraction. Rule-based approaches rely on concept identification and context analysis in order to determine the appropriate class. We propose a five-step process that enables even small research teams to develop simple but powerful rule-based NLP systems by taking advantage of a common UIMA AS based pipeline for classification. Our proposed methodology coupled with the general-purpose solution provides researchers with access to the data locked in clinical text in cases of limited human resources and compact timelines.
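    The concept-identification-plus-context core of such a rule-based classifier fits in a few lines; the example below is a deliberately minimal, hypothetical stand-in with simple keyword and negation rules, not the UIMA AS pipeline the authors describe.

      import re

      NEGATION = re.compile(r"\b(no|denies|without|negative for)\b", re.IGNORECASE)
      CONCEPT = re.compile(r"\b(pneumonia|infiltrate)\b", re.IGNORECASE)

      def classify_document(text: str) -> str:
          """Rule-based instance classification: find target concepts, then check a
          small preceding context window for negation before assigning the class."""
          for match in CONCEPT.finditer(text):
              window = text[max(0, match.start() - 40):match.start()]
              if not NEGATION.search(window):
                  return "POSITIVE"          # affirmed concept mention found
          return "NEGATIVE"                  # no mention, or every mention negated

      print(classify_document("Chest x-ray shows a right lower lobe infiltrate."))
      print(classify_document("The patient denies cough and is negative for pneumonia."))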

  19. Biomass Feedstock and Conversion Supply System Design and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobson, Jacob J.; Roni, Mohammad S.; Lamers, Patrick

    Idaho National Laboratory (INL) supports the U.S. Department of Energy's bioenergy research program. As part of that program, INL investigates the economics and sustainability of feedstock logistics for these fuels. A series of reports published between 2000 and 2013 documented feedstock logistics costs, but each report was tailored to a specific feedstock and conversion process. Although the reports differ in terms of conversion, some steps in the feedstock logistics chain are the same for every conversion process, so the reports contain much similar information. A single report can therefore capture the commonality in the feedstock logistics process while discussing the logistics costs for different conversion processes; this report is designed to do so, eliminating the need to write a conversion-specific design report. Previous work established the current costs based on conventional equipment and processes. The 2012 programmatic target was to demonstrate a delivered biomass logistics cost of $55/dry ton for woody biomass delivered to a fast pyrolysis conversion facility. The goal was achieved by applying field and process demonstration unit-scale data from harvest, collection, storage, preprocessing, handling, and transportation operations in INL's biomass logistics model. The goal of the 2017 Design Case is to enable expansion of biofuels production beyond highly productive resource areas by breaking the reliance of cost-competitive biofuel production on a single, low-cost feedstock. The 2017 programmatic target is to supply feedstock to the conversion facility that meets the in-feed conversion process quality specifications at a total logistics cost of $80/dry ton. The $80/dry ton target encompasses total delivered feedstock cost, including both grower payment and logistics costs, while meeting all conversion in-feed quality targets. The 2012 $55/dry ton target included only logistics costs, with a limited focus on biomass quantity and quality, and did not include a grower payment. The 2017 Design Case explores two approaches to addressing the logistics challenge: one is an agronomic solution based on blending and integrated landscape management, and the second is a logistics solution based on distributed biomass preprocessing depots. The concept behind blended feedstocks and integrated landscape management is to gain access to more regional feedstock at lower access fees (i.e., grower payment) and to reduce preprocessing costs by blending high-quality feedstocks with marginal-quality feedstocks. Blending has long been used in the grain industry; in the biofuel industry, however, blended feedstocks are a relatively new concept. The blended feedstock strategy relies on the availability of multiple feedstock sources that are blended using a least-cost formulation within an economical supply radius, which, in turn, decreases the grower payment by reducing the amount of any single biomass required. This report introduces the concepts of blending and integrated landscape management and justifies their importance in meeting the 2017 programmatic goals.
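    The least-cost blending formulation mentioned above is, at its simplest, a small linear program; the sketch below uses entirely hypothetical feedstocks, prices and a single quality limit, chosen only to show the structure, and picks blend fractions that minimise delivered cost subject to an ash-content specification.

      from scipy.optimize import linprog

      # three hypothetical feedstocks: delivered cost ($/dry ton) and ash content (%)
      cost = [70.0, 55.0, 40.0]            # e.g. clean chips, energy crop, residues
      ash = [0.5, 4.0, 6.0]
      ash_max = 3.0                        # hypothetical in-feed quality specification

      # minimise blend cost subject to: blend ash <= ash_max, fractions sum to 1
      res = linprog(c=cost,
                    A_ub=[ash], b_ub=[ash_max],
                    A_eq=[[1.0, 1.0, 1.0]], b_eq=[1.0],
                    bounds=[(0.0, 1.0)] * 3)

      print("blend fractions:", [round(x, 3) for x in res.x])
      print("blend cost ($/dry ton):", round(res.fun, 2))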

  20. Indicators to facilitate the early identification of patients with major depressive disorder in need of highly specialized care: A concept mapping study.

    PubMed

    van Krugten, F C W; Goorden, M; van Balkom, A J L M; Spijker, J; Brouwer, W B F; Hakkaart-van Roijen, L

    2018-04-01

    Early identification of the subgroup of patients with major depressive disorder (MDD) in need of highly specialized care could enhance personalized intervention. This, in turn, may reduce the number of treatment steps needed to achieve and sustain an adequate treatment response. The aim of this study was to identify patient-related indicators that could facilitate the early identification of the subgroup of patients with MDD in need of highly specialized care. Initial patient indicators were derived from a systematic review. Subsequently, a structured conceptualization methodology known as concept mapping was employed to complement the initial list of indicators by clinical expertise and develop a consensus-based conceptual framework. Subject-matter experts were invited to participate in the subsequent steps (brainstorming, sorting, and rating) of the concept mapping process. A final concept map solution was generated using nonmetric multidimensional scaling and agglomerative hierarchical cluster analyses. In total, 67 subject-matter experts participated in the concept mapping process. The final concept map revealed the following 10 major clusters of indicators: 1-depression severity, 2-onset and (treatment) course, 3-comorbid personality disorder, 4-comorbid substance use disorder, 5-other psychiatric comorbidity, 6-somatic comorbidity, 7-maladaptive coping, 8-childhood trauma, 9-social factors, and 10-psychosocial dysfunction. The study findings highlight the need for a comprehensive assessment of patient indicators in determining the need for highly specialized care, and suggest that the treatment allocation of patients with MDD to highly specialized mental healthcare settings should be guided by the assessment of clinical and nonclinical patient factors. © 2018 Wiley Periodicals, Inc.

  1. Data-Intensive Science Meets Inquiry-Driven Pedagogy: Interactive Big Data Exploration, Threshold Concepts, and Liminality

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Nair, U. S.; Word, A.

    2014-12-01

    Threshold concepts in any discipline are the core concepts an individual must understand in order to master a discipline. By their very nature, these concepts are troublesome, irreversible, integrative, bounded, discursive, and reconstitutive. Although grasping threshold concepts can be extremely challenging for each learner as s/he moves through stages of cognitive development relative to a given discipline, the learner's grasp of these concepts determines the extent to which s/he is prepared to work competently and creatively within the field itself. The movement of individuals from a state of ignorance of these core concepts to one of mastery occurs not along a linear path but in iterative cycles of knowledge creation and adjustment in liminal spaces - conceptual spaces through which learners move from the vaguest awareness of concepts to mastery, accompanied by understanding of their relevance, connectivity, and usefulness relative to questions and constructs in a given discipline. With the explosive growth of data available in atmospheric science, driven largely by satellite Earth observations and high-resolution numerical simulations, paradigms such as that of data-intensive science have emerged. These paradigm shifts are based on the growing realization that current infrastructure, tools and processes will not allow us to analyze and fully utilize the complex and voluminous data that is being gathered. In this emerging paradigm, the scientific discovery process is driven by knowledge extracted from large volumes of data. In this presentation, we contend that this paradigm naturally lends to inquiry-driven pedagogy where knowledge is discovered through inductive engagement with large volumes of data rather than reached through traditional, deductive, hypothesis-driven analyses. In particular, data-intensive techniques married with an inductive methodology allow for exploration on a scale that is not possible in the traditional classroom with its typical problem sets and static, limited data samples. In addition, we identify existing gaps and possible solutions for addressing the infrastructure and tools as well as a pedagogical framework through which to implement this inductive approach.

  2. An Analysis of 16-17-Year-Old Students' Understanding of Solution Chemistry Concepts Using a Two-Tier Diagnostic Instrument

    ERIC Educational Resources Information Center

    Adadan, Emine; Savasci, Funda

    2012-01-01

    This study focused on the development of a two-tier multiple-choice diagnostic instrument, which was designed and then progressively modified, and implemented to assess students' understanding of solution chemistry concepts. The results of the study are derived from the responses of 756 Grade 11 students (age 16-17) from 14 different high schools…

  3. Buckling of a circular plate made of a shape memory alloy due to a reverse thermoelastic martensite transformation

    NASA Astrophysics Data System (ADS)

    Movchan, A. A.; Sil'chenko, L. G.

    2008-02-01

    We solve the axisymmetric buckling problem for a circular plate made of a shape memory alloy undergoing reverse martensite transformation under the action of a compressing load, which occurs after the direct martensite transformation under the action of a generally different (extending or compressing) load. The problem was solved without any simplifying assumptions concerning the transverse dimension of the supplementary phase transition region related to buckling. The mathematical problem was reduced to a nonlinear eigenvalue problem. An algorithm for solving this problem was proposed. It was shown that the critical buckling load under the reverse transition, which is obtained by taking into account the evolution of the phase strains, can be many times lower than the same quantity obtained under the assumption that the material behavior is elastic even for the least (martensite) values of the elastic moduli. The critical buckling force decreases with increasing modulus of the load applied at the preliminary stage of direct transition and weakly depends on whether this load was extending or compressing. In shape memory alloys (SMA), mutually related processes of strain and direct (from the austenitic into the martensite phase) or reverse thermoelastic phase transitions may occur. The direct transition occurs under cooling and (or) an increase in stresses and is accompanied by a significant decrease (nearly by a factor of three in titan nickelide) of the Young modulus. If the direct transition occurs under the action of stresses with nonzero deviator, then it is accompanied by accumulation of macroscopic phase strains, whose intensity may reach 8%. Under the reverse transition, which occurs under heating and (or) unloading, the moduli increase and the accumulated strain is removed. For plates compressed in their plane, in the case of uniform temperature distribution over the thickness, one can separate trivial processes under which the strained plate remains plane and the phase ratio has a uniform distribution over the thickness. For sufficiently high compressing loads, the trivial process of uniform compression may become unstable in the sense that, for small perturbations of the plate deflection, temperature, the phase ratio, or the load, the difference between the corresponding perturbed process and the unperturbed process may be significant. The results of several experiments concerning the buckling of SMA elements are given in [1, 2], and the statement and solution of the corresponding boundary value problems can be found in [3-11]. The experimental studies [2] and several analytic solutions obtained for the Shanley column [3, 4], rods [5-7], rectangular plates under direct [8] and reverse [9] transitions showed that the processes of thermoelastic phase transitions can significantly (by several times) decrease the critical buckling loads compared with their elastic values calculated for the less rigid martensite state of the material. Moreover, buckling does not occur in the one-phase martensite state in which the elastic moduli are minimal but in the two-phase state in which the values of the volume fractions of the austenitic and martensite phase are approximately equal to each other. This fact is most astonishing for buckling, studied in the present paper, under the reverse transition in which the Young modulus increases approximately half as much from the beginning of the phase transition to the moment of buckling. 
    In [3-9] and in the present paper, the static buckling criterion is used. Following this criterion, the critical load is defined to be the load such that a nontrivial solution of the corresponding quasistatic problem is possible under the action of this load. If, in the problems of stability of rods and SMA plates, small perturbations of the external load are added to small perturbations of the deflection (the critical force is independent of the amplitude of the latter), then the critical forces vary depending on the value of the perturbations of the external load [5, 8, 9]. Thus, in the case of small perturbations of the load, the problem of stability of SMA elements becomes indeterminate. The solution of the stability problem for SMA elements also depends on whether the small perturbations of the phase ratio and the phase strain tensor are taken into account. Accordingly, the problem of stability of SMA elements can be solved in the framework of several statements (concepts, hypotheses), which differ in the set of quantities whose perturbations are admissible (taken into account) in the process of solving the problem. The variety of these statements applied to the problem of buckling of SMA elements under the direct martensite transformation is briefly described in [4, 5]. But in the problem of buckling under the reverse transformation, some of these statements must be changed. The main question to be answered when solving the problem of stability of SMA elements is whether small perturbations of the phase ratio (the volume fraction of the martensite phase q) are taken into account, because this choice significantly affects the results of solving the stability problem. If, under the transition to the adjacent form of equilibrium, the phase ratio at all points of the body is assumed to remain the same, then we deal with the "fixed phase ratio" concept. The opposite approach can be classified as the "supplementary phase transition" concept (the transition that occurs under the passage to the adjacent form of equilibrium). It should be noted that, since SMA exhibit temperature hysteresis, the phase ratio in SMA can undergo only one-sided small variations; in the case of buckling under the reverse transformation, the variation in the volume fraction of the martensite phase cannot be positive. The phase ratio is not an independent variable, like the loads or the temperature; owing to the constitutive relations, its variations occur together with the temperature variations and, in the framework of connected models for a majority of SMA, together with variations in the actual stresses. Therefore, the presence or absence of variations in q is determined by the presence or absence of variations in the temperature, deflection, and load, as well as by the system of constitutive relations used in the particular problem. In the framework of unconnected models, which do not take the influence of the actual stresses on the phase ratio into account, the "fixed phase ratio" concept corresponds to the case of absence of temperature variations. Variations in the phase ratio may also be absent in connected models for specially chosen values of the variations in the temperature and (or) the external load, as well as for SMA of the CuMn type, for which the influence of the actual stresses on the phase composition is absent or negligible.
    In the framework of the "fixed phase ratio" hypothesis, the stability problem for SMA elements has a solution coinciding in form with the solution of the corresponding elastic problem, with the elastic moduli replaced by the corresponding functions of the phase ratio. In the framework of the "supplementary phase transition" concept, the result of solving the stability problem essentially depends on whether the small perturbations of the external loads are taken into account in the process of solving the problem. The point is that, when solving the problem in the connected setting, the supplementary phase transition region occupies, in general, not the entire cross-section of the plate but only part of it, and the location of the boundary of this region depends on the existence and the value of these small perturbations. More precisely, the existence of arbitrarily small perturbations of the actual load can result in finite changes of the configuration of the supplementary phase transition region and hence in a finite change of the critical values of the load. Here we must distinguish the "fixed load" hypothesis, in which no perturbations of the external loads are admitted, from the "variable load" hypothesis in the opposite case. The condition that there are no variations in the external loads implies additional equations for determining the boundary of the supplementary phase transition region. If the "supplementary phase transition" concept and the "fixed load" concept are used together, then the solution of the stability problem for SMA is uniquely determined, in the same sense as the solution of the elastic stability problem under the static approach. In the framework of the "variable load" concept, the result of solving the stability problem for SMA ceases to be unique, but one can find upper and lower bounds for the critical forces, which correspond to the two limiting configurations of the supplementary phase transition region: the upper bound corresponds to its total absence, so that the critical load coincides with that determined in the framework of the "fixed phase ratio" concept, and the lower bound corresponds to the case where the entire cross-section of the plate experiences the supplementary phase transition. The first version does not need any additional name, and the second version can be called the "all-round supplementary phase transition" hypothesis. In the present paper, the above concepts are illustrated by examples of solving problems on the axisymmetric buckling of a freely supported or rigidly fixed circular plate experiencing the reverse martensite transformation under the action of an external force uniformly distributed over the contour. We find analytic solutions in the framework of all the above-listed statements except for the case of free support under the "fixed load" concept, for which we obtain a numerical solution.

  4. NXE pellicle: offering a EUV pellicle solution to the industry

    NASA Astrophysics Data System (ADS)

    Brouns, Derk; Bendiksen, Aage; Broman, Par; Casimiri, Eric; Colsters, Paul; Delmastro, Peter; de Graaf, Dennis; Janssen, Paul; van de Kerkhof, Mark; Kramer, Ronald; Kruizinga, Matthias; Kuntzel, Henk; van der Meulen, Frits; Ockwell, David; Peter, Maria; Smith, Daniel; Verbrugge, Beatrijs; van de Weg, David; Wiley, Jim; Wojewoda, Noelie; Zoldesi, Carmen; van Zwol, Pieter

    2016-03-01

    Towards the end of 2014, ASML committed to provide an EUV pellicle solution to the industry. Last year, during SPIE Microlithography 2015, we introduced the NXE pellicle concept, a removable pellicle solution that is compatible with current and future patterned mask inspection methods. This paper shows how we took this concept to a complete EUV pellicle solution for the industry. We will highlight some technical design challenges we faced in developing the NXE pellicle and how we solved them. We will also present imaging results of pellicle exposures on a 0.33 NA NXE scanner system. In conjunction with the NXE pellicle, we will also present the supporting tooling we have developed to enable pellicle use.

  5. National Computer Security Conference Proceedings (12th): Information Systems Security: Solutions for Today - Concepts for Tomorrow Held in Baltimore, Maryland on 10-13 October 1989

    DTIC Science & Technology

    1989-10-13

    ...and other non-technical aspects of the system). System-wide Perspective. The system that is being designed and engineered must include not just the... specifications and is regarded as the lowest level (implementation) of detail. This decomposition follows the typical "top down" design methodology... formal verification process has contributed to the security and correctness of the TCB design and implementation. FORMAL METHODOLOGY DESCRIPTION. The...

  6. Parametric amplification in quasi-PT symmetric coupled waveguide structures

    NASA Astrophysics Data System (ADS)

    Zhong, Q.; Ahmed, A.; Dadap, J. I.; Osgood, R. M., Jr.; El-Ganainy, R.

    2016-12-01

    The concept of non-Hermitian parametric amplification was recently proposed as a means to achieve an efficient energy conversion throughout the process of nonlinear three wave mixing in the absence of phase matching. Here we investigate this effect in a waveguide coupler arrangement whose characteristics are tailored to introduce passive PT symmetry only for the idler component. By means of analytical solutions and numerical analysis, we demonstrate the utility of these novel schemes and obtain the optimal design conditions for these devices.

  7. On some properties of bone functional adaptation phenomenon useful in mechanical design.

    PubMed

    Nowak, Michał

    2010-01-01

    The paper discusses some unique properties of the trabecular bone functional adaptation phenomenon that are useful in mechanical design. On the basis of observations of the biological process and the principle of constant strain energy density on the surface of the structure, a generic structural optimisation system has been developed. Such an approach makes it possible to satisfy the mechanical theorem for the stiffest design, comprising the optimisation of size, shape and topology, using concepts known from biomechanical studies. A biomimetic solution of multiple-load problems is also presented.

  8. Managing Problems Before Problems Manage You.

    PubMed

    Grigsby, Jim

    2015-01-01

    Every day we face problems, both personal and professional, and our initial reaction determines how well we solve those problems. Whether a problem is minor or major, short-term or lingering, there are techniques we can employ to help manage the problem and the problem-solving process. This article, based on my book Don't Tick Off The Gators! Managing Problems Before Problems Manage You, presents 12 different concepts for managing problems, not "cookie cutter" solutions, but different ideas that you can apply as they fit your circumstances.

  9. Simulation-based process windows simultaneously considering two and three conflicting criteria in injection molding

    PubMed Central

    Rodríguez-Yáñez, Alicia Berenice; Méndez-Vázquez, Yaileen

    2014-01-01

    Process windows in injection molding are habitually built with only one performance measure in mind. In reality, a more realistic picture can be obtained when considering multiple performance measures at a time, especially in the presence of conflict. In this work, the construction of process windows for injection molding (IM) is undertaken considering two and three performance measures in conflict simultaneously. The best compromises between the criteria involved are identified through the direct application of the concept of Pareto-dominance in multiple criteria optimization. The aim is to provide a formal and realistic strategy to set processing conditions in IM operations. The resulting optimization approach is easily implementable in MS Excel. The solutions are presented graphically to facilitate their use in manufacturing plants. PMID:25530927
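    Pareto dominance, as used above to pick the best compromises among conflicting performance measures, is straightforward to compute; the sketch below (hypothetical process settings and two cost-type measures to be minimised) keeps only the non-dominated points that would define the efficient frontier of the process window.

      import numpy as np

      # rows = candidate processing conditions, columns = two conflicting performance
      # measures to be minimised (e.g. warpage and cycle time); values are hypothetical
      perf = np.array([[0.20, 31.0],
                       [0.25, 27.0],
                       [0.22, 35.0],
                       [0.40, 25.0],
                       [0.30, 30.0]])

      def pareto_front(points):
          """Return indices of non-dominated points (minimisation in every column)."""
          keep = []
          for i, p in enumerate(points):
              dominated = np.any(np.all(points <= p, axis=1) & np.any(points < p, axis=1))
              if not dominated:
                  keep.append(i)
          return keep

      print("non-dominated conditions:", pareto_front(perf))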

  10. Simulation-based process windows simultaneously considering two and three conflicting criteria in injection molding.

    PubMed

    Rodríguez-Yáñez, Alicia Berenice; Méndez-Vázquez, Yaileen; Cabrera-Ríos, Mauricio

    2014-01-01

    Process windows in injection molding are habitually built with only one performance measure in mind. In reality, a more realistic picture can be obtained when considering multiple performance measures at a time, especially in the presence of conflict. In this work, the construction of process windows for injection molding (IM) is undertaken considering two and three performance measures in conflict simultaneously. The best compromises between the criteria involved are identified through the direct application of the concept of Pareto-dominance in multiple criteria optimization. The aim is to provide a formal and realistic strategy to set processing conditions in IM operations. The resulting optimization approach is easily implementable in MS Excel. The solutions are presented graphically to facilitate their use in manufacturing plants.

  11. Prototype Flight Management Capabilities to Explore Temporal RNP Concepts

    NASA Technical Reports Server (NTRS)

    Ballin, Mark G.; Williams, David H.; Allen, Bonnie Danette; Palmer, Michael T.

    2008-01-01

    Next Generation Air Transportation System (NextGen) concepts of operation may require aircraft to fly planned trajectories in four dimensions: three spatial dimensions and time. A prototype 4D flight management capability is being developed by NASA to facilitate the development of these concepts. New trajectory generation functions extend today's flight management system (FMS) capabilities that meet a single Required Time of Arrival (RTA) to trajectory solutions that comply with multiple RTA constraints. When a solution is not possible, a constraint management capability relaxes constraints to achieve a trajectory solution that meets the most important constraints as specified by candidate NextGen concepts. New flight guidance functions provide continuous guidance to the aircraft's flight control system to enable it to fly specified 4D trajectories. Guidance options developed for research investigations include a moving time window with varying tolerances that are a function of proximity to imposed constraints, and guidance that recalculates the aircraft's planned trajectory as a function of the estimation of current compliance. Compliance tolerances are related to required navigation performance (RNP) through the extension of existing RNP concepts for lateral containment. A conceptual temporal RNP implementation and prototype display symbology are proposed.

  12. Onboard shuttle on-line software requirements system: Prototype

    NASA Technical Reports Server (NTRS)

    Kolkhorst, Barbara; Ogletree, Barry

    1989-01-01

    The prototype discussed here was developed as a proof of concept for a system that could support high volumes of requirements documents with integrated text and graphics; the solution proposed here could be extended to other projects whose goal is to place paper documents in an electronic system for viewing and printing purposes. The technical problems (such as conversion of documentation between word processors, management of a variety of graphics file formats, and difficulties involved in scanning integrated text and graphics) would be very similar for other systems of this type. Indeed, technological advances in areas such as scanning hardware and software and display terminals ensure that some of the problems encountered here will be solved in the near term (less than five years). Examples of these solvable problems include automated input of integrated text and graphics, errors in the recognition process, and the loss of image information which results from the digitization process. The solution developed for the Online Software Requirements System is modular and allows hardware and software components to be upgraded or replaced as industry solutions mature. The extensive commercial software content allows the NASA customer to apply resources to solving the problem and maintaining documents.

  13. Water reorientation in the hydration shells of hydrophilic and hydrophobic solutes

    NASA Astrophysics Data System (ADS)

    Laage, Damien; Stirnemann, Guillaume; Hynes, James T.

    2010-06-01

    We discuss some key aspects of our recent theoretical work on water reorientation dynamics, which is important in a wide range of phenomena, including aqueous phase chemical reactions, protein folding, and drug binding to proteins and DNA. It is shown that, contrary to the standard conception that these dynamics are diffusional, the reorientation of a water molecule occurs by sudden, large amplitude angular jumps. The mechanism involves the exchange of one hydrogen bond for another by the reorienting water, and the process can be fruitfully viewed as a chemical reaction. The results for reorientation times, which can be well described analytically, are discussed in the context of the molecular level interpretation of recent ultrafast infrared spectroscopic results, focusing on the concepts of structure making/breaking and solvent ‘icebergs’.

  14. An empirical evaluation of computerized tools to aid in enroute flight planning

    NASA Technical Reports Server (NTRS)

    Smith, Philip J.; Mccoy, C. Elaine; Layton, Charles

    1993-01-01

    The paper describes an experiment using the Flight Planning Testbed (FPT) in which 27 airline dispatchers were studied. Five general questions were addressed in the study: under what circumstances does the introduction of computer-generated suggestions (flight plans) influence the planning behavior of dispatchers; what is the nature of such influences; how beneficial are the general design concepts underlying FPT; how effective are the specific implementation decisions made in realizing these general design concepts; and how effectively do dispatchers evaluate situations requiring replanning and how effectively do they identify appropriate solutions to these situations? The study leaves little doubt that the introduction of computer-generated suggestions for solving a flight planning problem can have a marked impact on the cognitive processes of the user and on the ultimate plan selected.

  15. MapReduce in the Cloud: A Use Case Study for Efficient Co-Occurrence Processing of MEDLINE Annotations with MeSH.

    PubMed

    Kreuzthaler, Markus; Miñarro-Giménez, Jose Antonio; Schulz, Stefan

    2016-01-01

    Big data resources are difficult to process without a scaled hardware environment that is specifically adapted to the problem. The emergence of flexible cloud-based virtualization techniques promises solutions to this problem. This paper demonstrates how a billion lines can be processed in a reasonable amount of time in a cloud-based environment. Our use case addresses the accumulation of concept co-occurrence data in MEDLINE annotations as a series of MapReduce jobs, which can be scaled and executed in the cloud. Besides showing an efficient way of solving this problem, we generated an additional resource for the scientific community to be used for advanced text mining approaches.
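
    The record does not reproduce the jobs themselves; the single-machine sketch below (with made-up MeSH-style identifiers) only illustrates the map and reduce steps that such a co-occurrence computation distributes across the cloud:

    ```python
    # Single-machine illustration of the MapReduce pattern behind concept
    # co-occurrence counting; identifiers and record format are assumptions.
    from collections import Counter
    from itertools import combinations

    def map_phase(annotated_records):
        """Emit ((concept_a, concept_b), 1) for each unordered pair of concepts
        annotated on the same MEDLINE record."""
        for concepts in annotated_records:
            for pair in combinations(sorted(set(concepts)), 2):
                yield pair, 1

    def reduce_phase(keyed_counts):
        """Sum counts per key, as the reducers would do after the shuffle."""
        totals = Counter()
        for key, value in keyed_counts:
            totals[key] += value
        return totals

    records = [["D000001", "D000002"], ["D000001", "D000002", "D000003"]]
    print(reduce_phase(map_phase(records)))
    ```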

  16. Cognitive flexibility and undergraduate physiology students: increasing advanced knowledge acquisition within an ill-structured domain.

    PubMed

    Rhodes, Ashley E; Rozell, Timothy G

    2017-09-01

    Cognitive flexibility is defined as the ability to assimilate previously learned information and concepts to generate novel solutions to new problems. This skill is crucial for success within ill-structured domains such as biology, physiology, and medicine, where many concepts are simultaneously required for understanding a complex problem, yet the problem consists of patterns or combinations of concepts that are not consistently used or needed across all examples. To succeed within ill-structured domains, a student must possess a certain level of cognitive flexibility: rigid thought processes and prepackaged informational retrieval schemes relying on rote memorization will not suffice. In this study, we assessed the cognitive flexibility of undergraduate physiology students using a validated instrument entitled Student's Approaches to Learning (SAL). The SAL evaluates how deeply and in what way information is processed, as well as the investment of time and mental energy that a student is willing to expend by measuring constructs such as elaboration and memorization. Our results indicate that students who rely primarily on memorization when learning new information have a smaller knowledge base about physiological concepts, as measured by a prior knowledge assessment and unit exams. However, students who rely primarily on elaboration when learning new information have a more well-developed knowledge base about physiological concepts, which is displayed by higher scores on a prior knowledge assessment and increased performance on unit exams. Thus students with increased elaboration skills possibly possess a higher level of cognitive flexibility and are more likely to succeed within ill-structured domains. Copyright © 2017 the American Physiological Society.

  17. "Generality of mis-fit"? The real-life difficulty of matching scales in an interconnected world.

    PubMed

    Keskitalo, E Carina H; Horstkotte, Tim; Kivinen, Sonja; Forbes, Bruce; Käyhkö, Jukka

    2016-10-01

    A clear understanding of processes at multiple scales and levels is of special significance when conceiving strategies for human-environment interactions. However, the understanding and application of the scale concept often differ between administrative-political and ecological disciplines. These differences mirror major differences in potential solutions as to whether, and how, scales can be made congruent at all. As a result, opportunities for seeking "goodness-of-fit" between different concepts of governance should perhaps be reconsidered in light of a potential "generality of mis-fit." This article reviews the interdisciplinary considerations inherent in the concept of scale in its ecological, as well as administrative-political, significance and argues that issues of how to manage "mis-fit" should be given more emphasis in social-ecological research and management practices. These considerations are exemplified by the case of reindeer husbandry in Fennoscandia. Whilst an indigenous small-scale practice, reindeer husbandry involves multi-level ecological and administrative-political complexities, which we argue may arise in any multi-level system.

  18. Approach to an Affordable and Productive Space Transportation System

    NASA Technical Reports Server (NTRS)

    McCleskey, Carey M.; Rhodes, Russel E.; Lepsch, Roger A.; Henderson, Edward M.; Robinson, John W.

    2012-01-01

    This paper describes an approach for creating space transportation architectures that are affordable, productive, and sustainable. The architectural scope includes both flight and ground system elements, and focuses on their compatibility to achieve a technical solution that is operationally productive, and also affordable throughout its life cycle. Previous papers by the authors and other members of the Space Propulsion Synergy Team (SPST) focused on space flight system engineering methods, along with operationally efficient propulsion system concepts and technologies. This paper follows up previous work by using a structured process to derive examples of conceptual architectures that integrate a number of advanced concepts and technologies. The examples are not intended to provide a near-term alternative architecture to displace current near-term design and development activity. Rather, the examples demonstrate an approach that promotes early investments in advanced system concept studies and trades (flight and ground), as well as in advanced technologies with the goal of enabling highly affordable, productive flight and ground space transportation systems.

  19. Information Pre-Processing using Domain Meta-Ontology and Rule Learning System

    NASA Astrophysics Data System (ADS)

    Ranganathan, Girish R.; Biletskiy, Yevgen

    Around the globe, extraordinary numbers of documents are being created by Enterprises and by users outside these Enterprises. The documents created in the Enterprises constitute the main focus of the present chapter. These documents are used in numerous kinds of machine processing. When these documents are used for machine processing, a lack of semantics of the information in these documents may cause misinterpretation of the information, thereby inhibiting the productiveness of computer-assisted analytical work. Hence, it would be profitable for the Enterprises to use well-defined domain ontologies that serve as rich sources of semantics for the information in the documents. These domain ontologies can be created manually, semi-automatically or fully automatically. The focus of this chapter is to propose an intermediate solution which will enable relatively easy creation of these domain ontologies. The process of extracting and capturing domain ontologies from these voluminous documents requires extensive involvement of domain experts and application of methods of ontology learning that are substantially labor intensive; therefore, some intermediate solution which would assist in capturing domain ontologies must be developed. This chapter proposes such a solution: building a meta-ontology that serves as an intermediate information source for the main domain ontology and as a rapid approach to conceptualizing a domain of interest from a huge amount of source documents. This meta-ontology can be populated with ontological concepts, attributes and relations from documents, and then refined to form a better domain ontology, either through automatic ontology learning methods or some other relevant ontology building approach.

  20. Improving Students' Understanding of the Connections between the Concepts of Real-Gas Mixtures, Gas Ideal-Solutions, and Perfect-Gas Mixtures

    ERIC Educational Resources Information Center

    Privat, Romain; Jaubert, Jean-Noël; Moine, Edouard

    2016-01-01

    In many textbooks of chemical-engineering thermodynamics, a gas mixture obeying the fundamental law pV_m = RT is most often called an ideal-gas mixture (in some rare cases, the term perfect-gas mixture can be found). These textbooks also define the fundamental concept of ideal solution which in theory, can be applied indifferently to…

  1. Dynamic-template-directed multiscale assembly for large-area coating of highly-aligned conjugated polymer thin films

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohammadi, Erfan; Zhao, Chuankai; Meng, Yifei

    Solution processable semiconducting polymers have been under intense investigations due to their diverse applications from printed electronics to biomedical devices. However, controlling the macromolecular assembly across length scales during solution coating remains a key challenge, largely due to the disparity in timescales of polymer assembly and high-throughput printing/coating. Herein we propose the concept of dynamic templating to expedite polymer nucleation and the ensuing assembly process, inspired by biomineralization templates capable of surface reconfiguration. Molecular dynamic simulations reveal that surface reconfigurability is key to promoting template–polymer interactions, thereby lowering polymer nucleation barrier. Employing ionic-liquid-based dynamic template during meniscus-guided coating results in highly aligned, highly crystalline donor-acceptor polymer thin films over large area (>1 cm2) and promoted charge transport along both the polymer backbone and the π-π stacking direction in field-effect transistors. We further demonstrate that the charge transport anisotropy can be reversed by tuning the degree of polymer backbone alignment.

  2. Dynamic-template-directed multiscale assembly for large-area coating of highly-aligned conjugated polymer thin films

    PubMed Central

    Mohammadi, Erfan; Zhao, Chuankai; Meng, Yifei; Qu, Ge; Zhang, Fengjiao; Zhao, Xikang; Mei, Jianguo; Zuo, Jian-Min; Shukla, Diwakar; Diao, Ying

    2017-01-01

    Solution processable semiconducting polymers have been under intense investigations due to their diverse applications from printed electronics to biomedical devices. However, controlling the macromolecular assembly across length scales during solution coating remains a key challenge, largely due to the disparity in timescales of polymer assembly and high-throughput printing/coating. Herein we propose the concept of dynamic templating to expedite polymer nucleation and the ensuing assembly process, inspired by biomineralization templates capable of surface reconfiguration. Molecular dynamic simulations reveal that surface reconfigurability is key to promoting template–polymer interactions, thereby lowering polymer nucleation barrier. Employing ionic-liquid-based dynamic template during meniscus-guided coating results in highly aligned, highly crystalline donor–acceptor polymer thin films over large area (>1 cm2) and promoted charge transport along both the polymer backbone and the π–π stacking direction in field-effect transistors. We further demonstrate that the charge transport anisotropy can be reversed by tuning the degree of polymer backbone alignment. PMID:28703136

  3. Dynamic-template-directed multiscale assembly for large-area coating of highly-aligned conjugated polymer thin films

    DOE PAGES

    Mohammadi, Erfan; Zhao, Chuankai; Meng, Yifei; ...

    2017-07-13

    Solution processable semiconducting polymers have been under intense investigations due to their diverse applications from printed electronics to biomedical devices. However, controlling the macromolecular assembly across length scales during solution coating remains a key challenge, largely due to the disparity in timescales of polymer assembly and high-throughput printing/coating. Herein we propose the concept of dynamic templating to expedite polymer nucleation and the ensuing assembly process, inspired by biomineralization templates capable of surface reconfiguration. Molecular dynamic simulations reveal that surface reconfigurability is key to promoting template–polymer interactions, thereby lowering polymer nucleation barrier. Employing ionic-liquid-based dynamic template during meniscus-guided coating results in highly aligned, highly crystalline donor-acceptor polymer thin films over large area (>1 cm2) and promoted charge transport along both the polymer backbone and the π-π stacking direction in field-effect transistors. We further demonstrate that the charge transport anisotropy can be reversed by tuning the degree of polymer backbone alignment.

  4. Streaming support for data intensive cloud-based sequence analysis.

    PubMed

    Issa, Shadi A; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of "resources-on-demand" and "pay-as-you-go", scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
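
    As a rough illustration of the streaming idea (not the elastream package itself), reads can be handed to an analysis routine as soon as they arrive instead of after the full transfer completes; the sketch below assumes gzipped FASTQ input and a user-supplied callback:

    ```python
    # Hedged sketch: process sequence reads chunk-by-chunk while the data set is
    # still streaming in, rather than staging the entire file first.
    import gzip

    def stream_reads(handle):
        """Yield FASTQ records (groups of four lines) as they become available."""
        while True:
            record = [handle.readline() for _ in range(4)]
            if not record[0]:
                return
            yield record

    def process_stream(path, analyze):
        """Apply an analysis callback to each read in a gzipped FASTQ stream."""
        with gzip.open(path, "rt") as handle:
            for read in stream_reads(handle):
                analyze(read)

    # Usage sketch: process_stream("sample.fastq.gz", analyze=lambda read: None)
    ```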

  5. Surface kinetic roughening caused by dental erosion: An atomic force microscopy study

    NASA Astrophysics Data System (ADS)

    Quartarone, Eliana; Mustarelli, Piercarlo; Poggio, Claudio; Lombardini, Marco

    2008-05-01

    Surface kinetic roughening takes place both in case of growth and erosion processes. Teeth surfaces are eroded by contact with acid drinks, such as those used to supplement mineral salts during sporting activities. Calcium-phosphate based (CPP-ACP) pastes are known to reduce the erosion process, and to favour the enamel remineralization. In this study we used atomic force microscopy (AFM) to investigate the surface roughening during dental erosion, and the mechanisms at the basis of the protection role exerted by a commercial CPP-ACP paste. We found a statistically significant difference (p<0.01) in the roughness of surfaces exposed and not exposed to the acid solutions. The treatment with the CPP-ACP paste determined a statistically significant reduction of the roughness values. By interpreting the AFM results in terms of fractal scaling concepts and continuum stochastic equations, we showed that the protection mechanism of the paste depends on the chemical properties of the acid solution.

  6. Football for life versus antidoping for the masses: ethical antidoping issues and solutions based on the extenuating experiences of an elite footballer competing while undergoing treatment for metastatic testicular cancer.

    PubMed

    Weiler, Richard; Tombides, Dylan; Urwin, Jon; Clarke, Jane; Verroken, Michele

    2014-05-01

    It is thankfully rare for extenuating circumstances to fully test the processes and procedures enshrined in national and world antidoping authorities' rules and laws. It is also thankfully very rare that a failed drugs test can have some positive implications. Antidoping laws are undoubtedly focused on ensuring fair competition, however, there are occasions when honest athletes discover medical diagnoses through failed antidoping tests. The purpose of this paper is to broadly discuss antidoping considerations encountered, based on the four principles of medical ethics and to propose simple solutions to these problems. Unfortunately, extreme medical circumstances will often test the limits of antidoping and medical processes and with open channels for feedback, these systems can improve. Performance enhancement seems an illogical concept if an athlete's medical treatment and disease are more inherently performance harming than unintended potential doping, but needs to be carefully managed to maintain fair sport.

  7. Modeling of porosity loss during compaction and cementation of sandstones

    NASA Astrophysics Data System (ADS)

    Lemée, Claire; Guéguen, Yves

    1996-10-01

    Irreversible inelastic processes are responsible for mechanical and chemical compaction of sedimentary rocks during burial. Our purpose is to describe the inelastic response of the rock at large time scales. In order to do this, we build a model that describes how porosity progressively decreases at depth. We use a previous geometrical model for the compaction process of a sandstone by grain interpenetration that is restricted to the case of mass conservation. In addition, we introduce a compaction equilibrium concept. Solid grains can support stresses up to a critical effective stress, σc, before plastic flow occurs. This critical stress depends on temperature and is derived from the pressure-solution deformation law. Pressure solution is the plastic deformation mechanism implemented during compaction. Our model predicts porosity destruction at a depth of about 3 km. The model has the property of defining a range of compaction curves. We investigate the sensitivity of the model to the main input parameters: liquid film thickness, grain size, temperature gradient, and activation energy.

  8. Football for life versus antidoping for the masses: ethical antidoping issues and solutions based on the extenuating experiences of an elite footballer competing while undergoing treatment for metastatic testicular cancer

    PubMed Central

    Weiler, Richard; Tombides, Dylan; Urwin, Jon; Clarke, Jane; Verroken, Michele

    2014-01-01

    It is thankfully rare for extenuating circumstances to fully test the processes and procedures enshrined in national and world antidoping authorities’ rules and laws. It is also thankfully very rare that a failed drugs test can have some positive implications. Antidoping laws are undoubtedly focused on ensuring fair competition, however, there are occasions when honest athletes discover medical diagnoses through failed antidoping tests. The purpose of this paper is to broadly discuss antidoping considerations encountered, based on the four principles of medical ethics and to propose simple solutions to these problems. Unfortunately, extreme medical circumstances will often test the limits of antidoping and medical processes and with open channels for feedback, these systems can improve. Performance enhancement seems an illogical concept if an athlete’s medical treatment and disease are more inherently performance harming than unintended potential doping, but needs to be carefully managed to maintain fair sport. PMID:24668050

  9. Action Being Character: A Promising Perspective on the Solution Concept of Game Theory

    PubMed Central

    Deng, Kuiying; Chu, Tianguang

    2011-01-01

    The inconsistency of predictions from solution concepts of conventional game theory with experimental observations is an enduring question. These solution concepts are based on the canonical rationality assumption that people are exclusively self-regarding utility maximizers. In this article, we think this assumption is problematic and, instead, assume that rational economic agents act as if they were maximizing their implicit utilities, which turns out to be a natural extension of the canonical rationality assumption. Implicit utility is defined by a player's character to reflect his personal weighting between cooperative, individualistic, and competitive social value orientations. The player who actually faces an implicit game chooses his strategy based on the common belief about the character distribution for a general player and the self-estimation of his own character, and he is not concerned about which strategies other players will choose and will never feel regret about his decision. It is shown by solving five paradigmatic games, the Dictator game, the Ultimatum game, the Prisoner's Dilemma game, the Public Goods game, and the Battle of the Sexes game, that the framework of implicit game and its corresponding solution concept, implicit equilibrium, based on this alternative assumption have potential for better explaining people's actual behaviors in social decision making situations. PMID:21573055
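
    The abstract does not state the functional form of implicit utility; purely as an illustrative sketch of a character-dependent weighting between self-regarding and other-regarding payoffs (notation chosen here, not taken from the paper), one could write

    $$ u_i^{\mathrm{implicit}}(s) \;=\; \pi_i(s) \;+\; \theta_i\,\pi_j(s), $$

    where $\pi_i$ and $\pi_j$ are the material payoffs of player $i$ and the other player, and a character parameter $\theta_i > 0$, $\theta_i = 0$, or $\theta_i < 0$ would correspond to a cooperative, individualistic, or competitive orientation, respectively.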

  10. Action being character: a promising perspective on the solution concept of game theory.

    PubMed

    Deng, Kuiying; Chu, Tianguang

    2011-05-09

    The inconsistency of predictions from solution concepts of conventional game theory with experimental observations is an enduring question. These solution concepts are based on the canonical rationality assumption that people are exclusively self-regarding utility maximizers. In this article, we think this assumption is problematic and, instead, assume that rational economic agents act as if they were maximizing their implicit utilities, which turns out to be a natural extension of the canonical rationality assumption. Implicit utility is defined by a player's character to reflect his personal weighting between cooperative, individualistic, and competitive social value orientations. The player who actually faces an implicit game chooses his strategy based on the common belief about the character distribution for a general player and the self-estimation of his own character, and he is not concerned about which strategies other players will choose and will never feel regret about his decision. It is shown by solving five paradigmatic games, the Dictator game, the Ultimatum game, the Prisoner's Dilemma game, the Public Goods game, and the Battle of the Sexes game, that the framework of implicit game and its corresponding solution concept, implicit equilibrium, based on this alternative assumption have potential for better explaining people's actual behaviors in social decision making situations.

  11. Raster Data Partitioning for Supporting Distributed GIS Processing

    NASA Astrophysics Data System (ADS)

    Nguyen Thai, B.; Olasz, A.

    2015-08-01

    In the geospatial sector, the big data concept has already had an impact. Several studies apply techniques originally developed in computer science to the GIS processing of huge amounts of geospatial data; in other studies, geospatial data is treated as if it had always been big data (Lee and Kang, 2015). Nevertheless, data acquisition methods have improved substantially, increasing not only the amount of raw data but also its spectral, spatial and temporal resolution. A significant portion of big data is geospatial data, and the size of such data is growing rapidly, by at least 20% every year (Dasgupta, 2013). Raw data is produced in increasing volumes and in different formats and representations for different purposes, yet only the information derived from these data sets constitutes a valuable result. However, computing capability and processing speed face limitations, even when semi-automatic or automatic procedures are applied to complex geospatial data (Kristóf et al., 2014). Lately, distributed computing has reached many interdisciplinary areas of computer science, including remote sensing and geographic information processing. Cloud computing additionally requires appropriate processing algorithms to be distributed in order to handle geospatial big data. The Map-Reduce programming model and distributed file systems have proven their capability to process non-GIS big data, but it is sometimes inconvenient or inefficient to rewrite existing algorithms for the Map-Reduce programming model, and GIS data cannot be partitioned like text-based data by lines or bytes. Hence, we look for an alternative solution for data partitioning, data distribution and execution of existing algorithms without rewriting them, or with only minor modifications. This paper gives a technical overview of currently available distributed computing environments, as well as of GIS (raster) data partitioning, distribution and distributed processing of GIS algorithms. A proof-of-concept implementation has been made for raster data partitioning, distribution and processing, and the first performance results have been compared against the commercial software ERDAS IMAGINE 2011 and 2014. Partitioning methods depend heavily on the application area, so data partitioning can be considered a preprocessing step before applying processing services to the data. As a proof of concept we have implemented a simple tile-based partitioning method that splits an image into smaller grids (N x M tiles), and we compare the processing time to existing methods using an NDVI calculation. The concept is demonstrated using our own open source processing framework.
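
    The paper's own framework is not reproduced here; a toy sketch of the tile-based partitioning and per-tile NDVI step it describes (array layout, band order and grid size are assumptions) is:

    ```python
    # Toy sketch of N x M tile partitioning with an independent per-tile NDVI
    # computation; band order and image size are illustrative assumptions.
    import numpy as np

    def partition(raster, n, m):
        """Yield (row, col, tile) views over a (bands, height, width) array."""
        _, height, width = raster.shape
        th, tw = height // n, width // m
        for i in range(n):
            for j in range(m):
                yield i, j, raster[:, i * th:(i + 1) * th, j * tw:(j + 1) * tw]

    def ndvi(tile, red_band=0, nir_band=1):
        """Per-pixel NDVI = (NIR - RED) / (NIR + RED) for one tile."""
        red = tile[red_band].astype(float)
        nir = tile[nir_band].astype(float)
        return (nir - red) / np.maximum(nir + red, 1e-9)

    raster = np.random.randint(0, 255, size=(2, 400, 600))   # fake 2-band image
    results = {(i, j): ndvi(tile) for i, j, tile in partition(raster, 4, 6)}
    ```

    Each tile could then be dispatched to a separate worker, which is the independence property the partitioning step is meant to enable.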

  12. Multiobjective GAs, quantitative indices, and pattern classification.

    PubMed

    Bandyopadhyay, Sanghamitra; Pal, Sankar K; Aruna, B

    2004-10-01

    The concept of multiobjective optimization (MOO) has been integrated with variable length chromosomes for the development of a nonparametric genetic classifier which can overcome problems, such as overfitting/overlearning and ignoring smaller classes, that are faced by single-objective classifiers. The classifier can efficiently approximate any kind of linear and/or nonlinear class boundaries of a data set using an appropriate number of hyperplanes. In designing the classifier, the aim is to simultaneously minimize the number of misclassified training points and the number of hyperplanes, and to maximize the product of class-wise recognition scores. The concepts of validation set (in addition to training and test sets) and validation functional are introduced in the multiobjective classifier for selecting a solution from the set of nondominated solutions provided by the MOO algorithm. This genetic classifier incorporates elitism and some domain-specific constraints in the search process, and is called the CEMOGA-Classifier (constrained elitist multiobjective genetic algorithm based classifier). Two new quantitative indices, namely purity and minimal spacing, are developed for evaluating the performance of different MOO techniques. These are used, along with classification accuracy, the required number of hyperplanes and the computation time, to compare the CEMOGA-Classifier with other related ones.
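
    The exact validation functional is not given in the record; the fragment below only sketches the general selection step it implies, namely scoring each nondominated classifier on a held-out validation set and keeping the best one (all names are placeholders):

    ```python
    # Hedged sketch of picking one solution from a nondominated front using a
    # held-out validation set; the scoring function is a placeholder.

    def select_from_front(front, validation_data, validation_score):
        """front: iterable of candidate classifiers (nondominated solutions).
        Return the candidate with the highest validation score."""
        return max(front, key=lambda clf: validation_score(clf, validation_data))
    ```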

  13. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Programs in use today generally have all of the function and information processing capabilities required to do their specified job. However, older programs usually use obsolete technology, are not integrated properly with other programs, and are difficult to maintain. Reengineering is becoming a prominent discipline as organizations try to move their systems to more modern and maintainable technologies. The Johnson Space Center (JSC) Software Technology Branch (STB) is researching and developing a system to support reengineering older FORTRAN programs into more maintainable forms that can also be more readily translated to modern languages such as FORTRAN 8x, Ada, or C. This activity has led to the development of maintenance strategies for design recovery and reengineering. These strategies include a set of standards, methodologies, and the concepts for a software environment to support design recovery and reengineering. A brief description of the problem being addressed and the approach that is being taken by the STB toward providing an economic solution to the problem is provided. A statement of the maintenance problems, the benefits and drawbacks of three alternative solutions, and a brief history of the STB experience in software reengineering are followed by the STB's new FORTRAN standards, methodology, and the concepts for a software environment.

  14. A Framework for Multi-Stakeholder Decision-Making and ...

    EPA Pesticide Factsheets

    This contribution describes the implementation of the conditional-value-at-risk (CVaR) metric to create a general multi-stakeholder decision-making framework. It is observed that stakeholder dissatisfactions (distance to their individual ideal solutions) can be interpreted as random variables. We thus shape the dissatisfaction distribution and find an optimal compromise solution by solving a CVaR minimization problem parameterized in the probability level. This enables us to generalize multi-stakeholder settings previously proposed in the literature that minimizes average and worst-case dissatisfactions. We use the concept of the CVaR norm to give a geometric interpretation to this problem and use the properties of this norm to prove that the CVaR minimization problem yields Pareto optimal solutions for any choice of the probability level. We discuss a broad range of potential applications of the framework. We demonstrate the framework in a bio-waste processing facility location case study, where we seek compromise solutions (facility locations) that balance stakeholder priorities on transportation, safety, water quality, and capital costs. This conference presentation abstract explains a new decision-making framework that computes compromise solution alternatives (reach consensus) by mitigating dissatisfactions among stakeholders as needed for SHC Decision Science and Support Tools project.
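
    The abstract states the formulation only verbally; in the standard Rockafellar–Uryasev form that the description suggests (written here as a sketch, with notation chosen for illustration), the compromise solution $x$ and auxiliary level $\eta$ minimize the CVaR of the stakeholder dissatisfactions $d_i(x)$ at probability level $\alpha$:

    $$ \min_{x,\,\eta}\;\; \eta \;+\; \frac{1}{1-\alpha}\sum_{i=1}^{n} p_i\,\max\{\,d_i(x) - \eta,\; 0\,\}, $$

    where $p_i$ is the weight (probability) assigned to stakeholder $i$. Taking $\alpha \to 0$ recovers the average-dissatisfaction compromise and $\alpha \to 1$ the worst-case one, which matches the generalization claimed above.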

  15. Information System Engineering Supporting Observation, Orientation, Decision, and Compliant Action

    NASA Astrophysics Data System (ADS)

    Georgakopoulos, Dimitrios

    The majority of today's software systems and organizational/business structures have been built on the foundation of solving problems via long-term data collection, analysis, and solution design. This traditional approach of solving problems and building corresponding software systems and business processes, falls short in providing the necessary solutions needed to deal with many problems that require agility as the main ingredient of their solution. For example, such agility is needed in responding to an emergency, in military command control, physical security, price-based competition in business, investing in the stock market, video gaming, network monitoring and self-healing, diagnosis in emergency health care, and many other areas that are too numerous to list here. The concept of Observe, Orient, Decide, and Act (OODA) loops is a guiding principal that captures the fundamental issues and approach for engineering information systems that deal with many of these problem areas. However, there are currently few software systems that are capable of supporting OODA. In this talk, we provide a tour of the research issues and state of the art solutions for supporting OODA. In addition, we provide specific examples of OODA solutions we have developed for the video surveillance and emergency response domains.

  16. Effects of NaCl, pH, and Potential on the Static Creep Behavior of AA1100

    NASA Astrophysics Data System (ADS)

    Wan, Quanhe; Quesnel, David J.

    2013-03-01

    The creep rates of AA1100 are measured during exposure to a variety of aggressive environments. NaCl solutions of various concentrations have no influence on the steady-state creep behavior, producing creep rates comparable to those measured in lab air at room temperature. However, after an initial incubation period of steady strain rate, a dramatic increase of strain rate is observed on exposure to HCl solutions and NaOH solutions, as well as during cathodic polarization of specimens in NaCl solutions. Creep strain produces a continuous deformation and elongation of the sample surface that is comparable to slow strain rates at crack tips thought to control the kinetics of crack growth during stress corrosion cracking (SCC). In this experiment, we separate the strain and surface deformation from the complex geometry of the crack tip to better understand the processes at work. Based on this concept, two possible explanations for the environmental influences on creep strain rates are discussed relating to the anodic dissolution of the free surface and hydrogen influences on deformation mechanisms. Consistencies of pH dependence between corrosion creep and SCC at low pH prove a creep-involved SCC mechanism, while the discrepancies between corrosion creep behavior and previous SCC results at high pH indicate a rate-limit step change in the crack propagation of the SCC process.

  17. Predicted reliability of aerospace electronics: Application of two advanced probabilistic concepts

    NASA Astrophysics Data System (ADS)

    Suhir, E.

    Two advanced probabilistic design-for-reliability (PDfR) concepts are addressed and discussed in application to the prediction, quantification and assurance of the aerospace electronics reliability: 1) Boltzmann-Arrhenius-Zhurkov (BAZ) model, which is an extension of the currently widely used Arrhenius model and, in combination with the exponential law of reliability, enables one to obtain a simple, easy-to-use and physically meaningful formula for the evaluation of the probability of failure (PoF) of a material or a device after the given time in operation at the given temperature and under the given stress (not necessarily mechanical), and 2) Extreme Value Distribution (EVD) technique that can be used to assess the number of repetitive loadings that result in the material/device degradation and eventually lead to its failure by closing, in a step-wise fashion, the gap between the bearing capacity (stress-free activation energy) of the material or the device and the demand (loading). It is shown that the material degradation (aging, damage accumulation, flaw propagation, etc.) can be viewed, when the BAZ model is considered, as a Markovian process, and that the BAZ model can be obtained as the ultimate steady-state solution to the well-known Fokker-Planck equation in the theory of Markovian processes. It is shown also that the BAZ model addresses the worst, but a reasonably conservative, situation. It is suggested therefore that the transient period preceding the condition addressed by the steady-state BAZ model need not be accounted for in engineering evaluations. However, when there is an interest in understanding the transient degradation process, the obtained solution to the Fokker-Planck equation can be used for this purpose. As to the EVD concept, it attributes the degradation process to the accumulation of damages caused by a train of repetitive high-level loadings, while loadings of levels that are considerably lower than their extreme values do not contribute appreciably to the finite lifetime of a material or a device. In our probabilistic risk management (PRM) based analysis we treat the stress-free activation energy (capacity) as a normally distributed random variable, and choose, for the sake of simplicity, the (single-parametric) Rayleigh law as the basic distribution underlying the EVD. The general concepts addressed and discussed are illustrated by numerical examples. It is concluded that the application of the PDfR approach and particularly the above two advanced models should be considered as a natural, physically meaningful, informative, comprehensive, and insightful technique that reflects well the physics underlying the degradation processes in materials, devices and systems. It is the author's belief that they will be widely used in engineering practice, when high reliability is imperative, and the ability to quantify it is highly desirable.
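
    For orientation only (the exact expressions in the cited work may differ), combining a Zhurkov-type mean time to failure with the exponential law of reliability mentioned above gives a probability-of-failure formula of the form

    $$ \tau \;=\; \tau_0 \exp\!\left(\frac{U_0 - \gamma\sigma}{kT}\right), \qquad P_f(t) \;=\; 1 - \exp\!\left(-\frac{t}{\tau}\right), $$

    where $U_0$ is the stress-free activation energy, $\sigma$ the applied (not necessarily mechanical) stress, $\gamma$ a stress-sensitivity factor, $k$ Boltzmann's constant, and $T$ the absolute temperature.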

  18. Groundwater flow and transport modeling

    USGS Publications Warehouse

    Konikow, Leonard F.; Mercer, J.W.

    1988-01-01

    Deterministic, distributed-parameter, numerical simulation models for analyzing groundwater flow and transport problems have come to be used almost routinely during the past decade. A review of the theoretical basis and practical use of groundwater flow and solute transport models is used to illustrate the state-of-the-art. Because of errors and uncertainty in defining model parameters, models must be calibrated to obtain a best estimate of the parameters. For flow modeling, data generally are sufficient to allow calibration. For solute-transport modeling, lack of data not only limits calibration, but also causes uncertainty in process description. Where data are available, model reliability should be assessed on the basis of sensitivity tests and measures of goodness-of-fit. Some of these concepts are demonstrated by using two case histories. © 1988.

  19. ISPAN (Interactive Stiffened Panel Analysis): A tool for quick concept evaluation and design trade studies

    NASA Technical Reports Server (NTRS)

    Hairr, John W.; Dorris, William J.; Ingram, J. Edward; Shah, Bharat M.

    1993-01-01

    Interactive Stiffened Panel Analysis (ISPAN) modules, written in FORTRAN, were developed to provide an easy to use tool for creating finite element models of composite material stiffened panels. The modules allow the user to interactively construct, solve and post-process finite element models of four general types of structural panel configurations using only the panel dimensions and properties as input data. Linear, buckling and post-buckling solution capability is provided. This interactive input allows rapid model generation and solution by non finite element users. The results of a parametric study of a blade stiffened panel are presented to demonstrate the usefulness of the ISPAN modules. Also, a non-linear analysis of a test panel was conducted and the results compared to measured data and previous correlation analysis.

  20. Instrument design for a high-resolution, wide-field-of-view observation mission

    NASA Astrophysics Data System (ADS)

    Fayret, Jean-Philippe; Gaudin-Delrieu, Catherine; Lamard, Jean-Luc; Devilliers, Christophe; Costes, Vincent

    2017-11-01

    The future Earth observation missions aim at delivering images with a high resolution and a large field of view. The PLEIADES mission, coming after the SPOT satellites, aims to enhance the resolution to submetric values with a swath of over 20 km. Panchromatic and multispectral images will be provided. Starting with the mission requirements elaborated by the CNES, Alcatel Space Industries has conducted a study to identify the instrument concepts best suited to meeting these performance requirements. In addition, to minimise development costs, a minisatellite approach has been selected, leading to a compact concept for the instrument design. During the study, various detection techniques and the associated detectors were investigated, from classical pushbroom to supermode acquisition modes. For each of these options, different optical layouts were proposed and evaluated with respect to performance as well as interface requirements. Optical performance, mechanical design constraints and manufacturing processes were taken into account to assess the performance of the various solutions. Eventually the most promising concept was selected and a preliminary design study performed. This concept, based on a Korsch optical scheme associated with TDI detectors, complies with the mission requirements and allows for a wide range of accommodation possibilities with a minisatellite-class platform.

  1. Environmental concepts in rural Honduras: A case study of their range and application within environmental education design

    NASA Astrophysics Data System (ADS)

    Bradford, Robert Sanders

    1998-12-01

    The rate of environmental degradation in the Third World continues to present residents of countries like Honduras with conditions that threaten the quality of life and ecological systems. How people conceptualize their environment could be a point of entry into a greater understanding of environmental problems. Through individual interviews and focus group discussions, this study comprises a qualitative examination of the environmental concepts of a sample of 75 rural Hondurans. Analysis of their concepts was used to construct a tentative interpretation of the rural Honduran worldview characteristics of Self, Other, Relationship, Classification, Causality, Time, and Space. The findings of this investigation indicated that rural Hondurans conceptualize their environment through the worldview lenses of survival and poverty, leading to a sense of fatalism when confronting the complex and multifaceted problems associated with quality of life and environmental quality. Analysis of concepts and worldview also indicated that rural Hondurans generally do not believe their environmental problems are solvable, nor do they appear to understand that these problems are also cultural problems whose solutions will most likely require some revision of their current worldview. An educational approach that fosters the integration of compatible environmental concepts into the rural Honduran worldview is recommended through the application of design strategies for a prospective environmental education process.

  2. Constructive Development of the Solutions of Linear Equations in Introductory Ordinary Differential Equations

    ERIC Educational Resources Information Center

    Mallet, D. G.; McCue, S. W.

    2009-01-01

    The solution of linear ordinary differential equations (ODEs) is commonly taught in first-year undergraduate mathematics classrooms, but the understanding of the concept of a solution is not always grasped by students until much later. Recognizing what it is to be a solution of a linear ODE and how to postulate such solutions, without resorting to…

  3. Trusted Fabrication through 3D Integration

    DTIC Science & Technology

    2017-03-01

    contiguous and thus identifiable. The concept of a “smart partitioner” is introduced for a second experiment. Keywords: Trusted Fab; VLSI; 3DIC ... to the fabrication facility. One solution is the split-fab concept in which the design is split into two separate fabs early in the metal stack, and ... possible solution is proposed herein whereby a three-chip stack is formed, two built in normal semiconductor fabs and one in an interposer fab. This

  4. Summary of Plutonium-238 Production Alternatives Analysis Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James Werner; Wade E. Bickford; David B. Lord

    The Team implemented a two-phase evaluation process. During the first phase, a wide variety of past and new candidate facilities and processing methods were assessed against the criteria established by DOE for this assessment. Any system or system element selected for consideration as an alternative within the project to reestablish domestic production of Pu-238 must meet the following minimum criteria: Any required source material must be readily available in the United States, without requiring the development of reprocessing technologies or investments in systems to separate material from identified sources. It must be cost, schedule, and risk competitive with existing baseline technology. Any identified facilities required to support the concept must be available to the program for the entire project life cycle (notionally 35 years, unless the concept is so novel as to require a shorter duration). It must present a solution that can generate at least 1.5 kg of Pu-238 oxide per year, for at least 35 years. It must present a low-risk, near-term solution to the National Aeronautics and Space Administration’s urgent mission need. DOE has implemented this requirement by eliminating from project consideration any alternative with key technologies at less than Technology Readiness Level 5. The Team evaluated the options meeting these criteria using a more detailed assessment of the reasonable facility variations and compared them to the preferred option, which consists of target irradiation at the Advanced Test Reactor (ATR) and the High Flux Isotope Reactor (HFIR), target fabrication and chemical separations processing at the ORNL Radiochemical Engineering Development Center, and neptunium-237 storage at the Materials and Fuels Complex at INL. This preferred option is consistent with the Records of Decision from the earlier National Environmental Policy Act (NEPA) documentation.

  5. Wastewater treatment to enhance the economic viability of microalgae culture.

    PubMed

    Pires, J C M; Alvim-Ferraz, M C M; Martins, F G; Simões, M

    2013-08-01

    Microalgae culture is still not economically viable, and it presents some negative environmental impacts concerning water, nutrient and energy requirements. In this context, this study reviews recent advances in microalgal cultures in wastewaters to enhance their economic viability. We focused on three different culture concepts: (1) suspended cell systems, (2) cell immobilization, and (3) microalgae consortia. Cultures with suspended cells are the most studied. The nutrient removal efficiencies are usually high for wastewaters from different sources. However, biomass harvesting is a difficult and costly process due to the small cell size and low culture density. On the other hand, cell immobilization systems have been shown to solve this problem, their main limitation being nutrient diffusion from the bulk to the cells, which results in reduced nutrient removal efficiency. The consortium between microalgae and bacteria enhances the growth of both microorganisms. This culture concept has been shown to be a promising technology to improve wastewater treatment, regarding not only nutrient removal but also biomass harvesting by bioflocculation. The aggregation mechanism must be studied in depth to find the process parameters that would lead to an effective and cheap harvesting process.

  6. How individual traces and interactive timelines could support outage execution - Toward an outage historian concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parfouru, S.; De-Beler, N.

    2012-07-01

    In the context of a project that is designing innovative ICT-based solutions for the organizational concept of outage management, we focus on the informational processes of the OCR (Outage Control Room) underlying the execution of outages. Informational processes are based on structured and unstructured documents that have a key role in the collaborative processes and management of the outage. We especially track the structured and unstructured documents, electronic or not, from creation to sharing. Our analysis allows us to consider that the individual traces produced by an individual participant with a specific role could be multi-purpose and support sharing between participants without creating duplication of work. The ultimate goal is to be able to generate an outage historian, one that is not just focused on highly structured information, which could be useful to improve the continuity of information between participants. We study the implementation of this approach through web technologies and social media tools to address this issue. We also investigate the issue of data access through interactive visualization timelines coupled with other modalities to assist users in the navigation and exploration of the proposed historian. (authors)

  7. Health literacy and health communication

    PubMed Central

    2010-01-01

    Health communication consists of interpersonal or mass communication activities focused on improving the health of individuals and populations. Skills in understanding and applying information about health issues are critical to this process and may have a substantial impact on health behaviors and health outcomes. These skills have recently been conceptualized in terms of health literacy (HL). This article introduces current concepts and measurements of HL, and discusses the role of HL in health communication, as well as future research directions in this domain. Studies of HL have increased dramatically during the past few years, but a gap between the conceptual definition of HL and its application remains. None of the existing instruments appears to completely measure the concept of HL. In particular, studies on communication/interaction and HL remain limited. Furthermore, HL should be considered not only in terms of the characteristics of individuals, but also in terms of the interactional processes between individuals and their health and social environments. Improved HL may enhance the ability and motivation of individuals to find solutions to both personal and public health problems, and these skills could be used to address various health problems throughout life. The process underpinning HL involves empowerment, one of the major goals of health communication. PMID:21054840

  8. Solving Solutions: Exploring Unknowns through Chemistry.

    ERIC Educational Resources Information Center

    Burns, John; Yoshina, Granville; Goodding, Debbie; Streitberger, Eric

    2000-01-01

    Presents a chemistry activity that introduces students to one type of chemical bond by developing the integer operation concept of zero pairs. Leads to an activity of combining drops of 0.3 molar solutions to form six different colored precipitates from five solutions. (ASK)

  9. Origins Space Telescope Concept 2: Trades, Decisions, and Study Status

    NASA Astrophysics Data System (ADS)

    Leisawitz, David; DiPirro, Michael; Carter, Ruth; Origins Space Telescope Decadal Mission Concept Study Team

    2018-01-01

    The Origins Space Telescope (OST) will trace the history of our cosmic origins from the time dust and heavy elements began to alter the astrophysical processes that shaped galaxies and enabled planets to form, culminating at least once in the development of a life-bearing planet. But how did the universe evolve in response to its changing ingredients, and how common are planets that support life? The OST, an advancing concept for the Far-Infrared Surveyor mission described in the NASA Astrophysics roadmap, is being designed to answer these questions. As envisaged in the Roadmap, Enduring Quests/Daring Visions, OST will offer sensitivity and spectroscopic capabilities that vastly exceed those found in any preceding far-IR observatory. The spectral range of OST was extended down to 6 microns to allow measurements of key biomarkers in transiting exoplanet spectra. Thus, OST is a mid- and far-IR mission. OST Concept 2 will inform the Science and Technology Definition Team’s understanding of the “solution space,” enabling a recommendation to the 2020 Decadal Survey which, while not fully optimized, will be scientifically compelling, executable, and intended to maximize the science return per dollar. OST Concept 1, described in a companion paper, would satisfy virtually all of the STDT’s science objectives in under 5 years. Concept 2 is intentionally less ambitious than Concept 1, but it still includes a 4 K telescope, enabling exquisitely sensitive far-IR measurements. This paper will summarize the architecture options considered for OST Concept 2 and describe the factors that led to the chosen design concept. Lessons from the Concept 1 study influenced our choices. We report progress on the Concept 2 study to date.

  10. Applications of the CAM Based on a New Decoupling Procedure of Correlation Functions in the One-Dimensional Contact Process

    NASA Astrophysics Data System (ADS)

    Konno, Norio; Katori, Makoto

    The one-dimensional contact process (CP) is studied by a systematic series of approximations. A new decoupling procedure of correlation functions is proposed by combining the idea of Suzuki's correlation-identity-decoupling (CID) with a concept of window. Liggett's approximations are also considered. Applying Suzuki's coherent-anomaly method (CAM) to the mean-field-type solutions, the values of the critical point and the critical exponents are estimated as λc = 1.6490(±0.0008), β = 0.280(±0.013), Δ(=β/δ) = 1.734(±0.001), β̂ = 0.627(±0.005). Finally a comparison with other estimates is shown.

  11. Applications of the CAM Based on a New Decoupling Procedure of Correlation Functions in the One-Dimensional Contact Process

    NASA Astrophysics Data System (ADS)

    Konno, Norio; Katori, Makoto

    1990-05-01

    The one-dimensional contact process (CP) is studied by a systematic series of approximations. A new decoupling procedure of correlation functions is proposed by combining the idea of Suzuki’s correlation-identity-decoupling (CID) with a concept of window. Liggett’s approximations are also considered. Applying Suzuki’s coherent-anomaly method (CAM) to the mean-field-type solutions, the values of the critical point and the critical exponents are estimated as λc = 1.6490(±0.0008), β = 0.280(±0.013), Δ(=β/δ) = 1.734(±0.001), β̂ = 0.627(±0.005). Finally a comparison with other estimates is shown.
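
    For orientation, the crudest mean-field-type approximation from which the CAM extrapolates (a textbook-level sketch, not the windowed decoupling used in the paper) treats the density $\rho$ of occupied sites as

    $$ \frac{d\rho}{dt} \;=\; \lambda\,\rho\,(1-\rho) \;-\; \rho, \qquad \rho_{\infty} \;=\; \max\!\left(0,\; 1 - \tfrac{1}{\lambda}\right), $$

    which gives the naive estimate $\lambda_c = 1$, well below the CAM value $\lambda_c \approx 1.649$ quoted above.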

  12. Power generation by thermally assisted electroluminescence: like optical cooling, but different

    NASA Astrophysics Data System (ADS)

    Buckner, Benjamin D.; Heeg, Bauke

    2008-02-01

    Thermally assisted electro-luminescence may provide a means to convert heat into electricity. In this process, radiation from a hot light-emitting diode (LED) is converted to electricity by a photovoltaic (PV) cell, which is termed thermophotonics. Novel analytical solutions to the equations governing such a system show that this system combines physical characteristics of thermophotovoltaics (TPV) and the inverse process of laser cooling. The flexibility of having both adjustable bias and load parameters may allow an optimized power generation system based on this concept to exceed the power throughput and efficiency of TPV systems. Such devices could function as efficient solar thermal, waste heat, and fuel-based generators.

  13. Six Sigma and Lean concepts, a case study: patient centered care model for a mammography center.

    PubMed

    Viau, Mark; Southern, Becky

    2007-01-01

    Boca Raton Community Hospital in South Florida decided to increase return while enhancing patient experience and increasing staff morale. They implemented a program to pursue "enterprise excellence" through Six Sigma methodologies. In order to ensure that the root causes of delays and rework were addressed, a multigenerational project plan with 3 major components was developed. Step 1: Stabilize; Step 2: Optimize; Step 3: Innovate. By including staff and process owners in the process, they are empowered to think differently about what they do and how they do it. A team that works collaboratively to identify problems and develop solutions can only be a positive for any organization.

  14. Pump-Probe Fragmentation Action Spectroscopy: A Powerful Tool to Unravel Light-Induced Processes in Molecular Photocatalysts.

    PubMed

    Imanbaew, Dimitri; Lang, Johannes; Gelin, Maxim F; Kaufhold, Simon; Pfeffer, Michael G; Rau, Sven; Riehn, Christoph

    2017-05-08

    We present a proof of concept that ultrafast dynamics combined with photochemical stability information of molecular photocatalysts can be acquired by electrospray ionization mass spectrometry combined with time-resolved femtosecond laser spectroscopy in an ion trap. This pump-probe "fragmentation action spectroscopy" gives straightforward access to information that usually requires high purity compounds and great experimental efforts. Results of gas-phase studies on the electronic dynamics of two supramolecular photocatalysts compare well to previous findings in solution and give further evidence for a directed electron transfer, a key process for photocatalytic hydrogen generation. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Integration of Off-Track Sonic Boom Analysis in Conceptual Design of Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Li, Wu

    2011-01-01

    A highly desired capability for the conceptual design of aircraft is the ability to rapidly and accurately evaluate new concepts to avoid adverse trade decisions that may hinder the development process in the later stages of design. Evaluating the robustness of new low-boom concepts is important for the conceptual design of supersonic aircraft. Here, robustness means that the aircraft configuration has a low-boom ground signature at both under- and off-track locations. An integrated process for off-track boom analysis is developed to facilitate the design of robust low-boom supersonic aircraft. The integrated off-track analysis can also be used to study the sonic boom impact and to plan future flight trajectories where flight conditions and ground elevation might have a significant effect on ground signatures. The key enabler for off-track sonic boom analysis is accurate computational fluid dynamics (CFD) solutions for off-body pressure distributions. To ensure the numerical accuracy of the off-body pressure distributions, a mesh study is performed with Cart3D to determine the mesh requirements for off-body CFD analysis, and comparisons are made between the Cart3D and USM3D results. The variations in ground signatures that result from changes in the initial location of the near-field waveform are also examined. Finally, a complete under- and off-track sonic boom analysis is presented for two distinct supersonic concepts to demonstrate the capability of the integrated analysis process.

  16. Conceptual design and multidisciplinary optimization of in-plane morphing wing structures

    NASA Astrophysics Data System (ADS)

    Inoyama, Daisaku; Sanders, Brian P.; Joo, James J.

    2006-03-01

    In this paper, the topology optimization methodology for the synthesis of a distributed actuation system, with specific application to the morphing air vehicle, is discussed. The main emphasis is placed on the topology optimization problem formulations and the development of computational modeling concepts. For demonstration purposes, the in-plane morphing wing model is presented. The analysis model is developed to meet several important criteria: it must allow large rigid-body displacements, as well as variation in planform area, with minimum strain on structural members while retaining acceptable numerical stability for finite element analysis. Preliminary work has indicated that the proposed modeling concept meets these criteria and may be suitable for the purpose. Topology optimization is performed on the ground structure based on this modeling concept with design variables that control the system configuration. In other words, the states of each element in the model are design variables, and they are to be determined through the optimization process. In effect, the optimization process assigns morphing members as 'soft' elements, non-morphing load-bearing members as 'stiff' elements, and non-existent members as 'voids.' In addition, the optimization process determines the location and relative force intensities of distributed actuators, which are represented computationally as equal and opposite nodal forces with soft axial stiffness. Several different optimization problem formulations are investigated to understand their potential benefits in solution quality, as well as the meaningfulness of the formulations themselves. Sample in-plane morphing problems are solved to demonstrate the potential capability of the methodology introduced in this paper.

  17. Enriched biodiversity data as a resource and service.

    PubMed

    Vos, Rutger Aldo; Biserkov, Jordan Valkov; Balech, Bachir; Beard, Niall; Blissett, Matthew; Brenninkmeijer, Christian; van Dooren, Tom; Eades, David; Gosline, George; Groom, Quentin John; Hamann, Thomas D; Hettling, Hannes; Hoehndorf, Robert; Holleman, Ayco; Hovenkamp, Peter; Kelbert, Patricia; King, David; Kirkup, Don; Lammers, Youri; DeMeulemeester, Thibaut; Mietchen, Daniel; Miller, Jeremy A; Mounce, Ross; Nicolson, Nicola; Page, Rod; Pawlik, Aleksandra; Pereira, Serrano; Penev, Lyubomir; Richards, Kevin; Sautter, Guido; Shorthouse, David Peter; Tähtinen, Marko; Weiland, Claus; Williams, Alan R; Sierra, Soraya

    2014-01-01

    Recent years have seen a surge in projects that produce large volumes of structured, machine-readable biodiversity data. To make these data amenable to processing by generic, open source "data enrichment" workflows, they are increasingly being represented in a variety of standards-compliant interchange formats. Here, we report on an initiative in which software developers and taxonomists came together to address the challenges and highlight the opportunities in the enrichment of such biodiversity data by engaging in intensive, collaborative software development: The Biodiversity Data Enrichment Hackathon. The hackathon brought together 37 participants (including developers and taxonomists, i.e. scientific professionals who gather, identify, name and classify species) from 10 countries: Belgium, Bulgaria, Canada, Finland, Germany, Italy, the Netherlands, New Zealand, the UK, and the US. The participants brought expertise in processing structured data, text mining, development of ontologies, digital identification keys, geographic information systems, niche modeling, natural language processing, provenance annotation, semantic integration, taxonomic name resolution, web service interfaces, workflow tools and visualisation. Most use cases and exemplar data were provided by taxonomists. One goal of the meeting was to facilitate re-use and enhancement of biodiversity knowledge by a broad range of stakeholders, such as taxonomists, systematists, ecologists, niche modelers, informaticians and ontologists. The suggested use cases resulted in nine breakout groups addressing three main themes: i) mobilising heritage biodiversity knowledge; ii) formalising and linking concepts; and iii) addressing interoperability between service platforms. Another goal was to further foster a community of experts in biodiversity informatics and to build human links between research projects and institutions, in response to recent calls to further such integration in this research domain. Beyond deriving prototype solutions for each use case, areas of inadequacy were discussed and are being pursued further. It was striking how many possible applications for biodiversity data there were and how quickly solutions could be put together when the normal constraints on collaboration were broken down for a week. Conversely, mobilising biodiversity knowledge from its silos in heritage literature and natural history collections will continue to require formalisation of the concepts (and the links between them) that define the research domain, as well as increased interoperability between the software platforms that operate on these concepts.

  18. A Biogeography-Based Optimization Algorithm Hybridized with Tabu Search for the Quadratic Assignment Problem

    PubMed Central

    Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah

    2016-01-01

    The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of the migration strategy of species to derive an algorithm for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process will often ruin the quality of solutions in QAP. In this paper, we propose a hybrid technique to overcome the weakness of the classical BBO algorithm in solving QAP, by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them. PMID:26819585

  19. A Biogeography-Based Optimization Algorithm Hybridized with Tabu Search for the Quadratic Assignment Problem.

    PubMed

    Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah

    2016-01-01

    The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of the migration strategy of species to derive an algorithm for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process will often ruin the quality of solutions in QAP. In this paper, we propose a hybrid technique to overcome the weakness of the classical BBO algorithm in solving QAP, by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them.
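
    To make the hybridization concrete, the sketch below replaces the mutation step of a basic BBO loop over QAP permutations with a short tabu-search refinement. This is a minimal, hypothetical Python illustration: the migration scheme, parameter values, and helper names are our own assumptions and not the authors' implementation (delta cost evaluation and other standard speed-ups are omitted for brevity).

        import random

        def qap_cost(perm, dist, flow):
            """Cost of assigning facility i to location perm[i]."""
            n = len(perm)
            return sum(flow[i][j] * dist[perm[i]][perm[j]]
                       for i in range(n) for j in range(n))

        def tabu_refine(perm, dist, flow, iters=50, tenure=7):
            """Short tabu search over pairwise swaps (stands in for BBO mutation)."""
            best, cur = list(perm), list(perm)
            best_cost = qap_cost(best, dist, flow)
            tabu = {}
            for t in range(iters):
                move, move_cost = None, float('inf')
                for i in range(len(cur)):
                    for j in range(i + 1, len(cur)):
                        cur[i], cur[j] = cur[j], cur[i]
                        c = qap_cost(cur, dist, flow)
                        cur[i], cur[j] = cur[j], cur[i]
                        # take the best non-tabu swap; aspiration overrides tabu
                        if (tabu.get((i, j), -1) < t or c < best_cost) and c < move_cost:
                            move, move_cost = (i, j), c
                if move is None:
                    break
                i, j = move
                cur[i], cur[j] = cur[j], cur[i]
                tabu[(i, j)] = t + tenure
                if move_cost < best_cost:
                    best, best_cost = list(cur), move_cost
            return best

        def bbo_tabu_qap(dist, flow, pop_size=20, generations=100):
            """BBO habitats are permutations; migration copies assignments from
            better-ranked habitats, then tabu search refines each child."""
            n = len(dist)
            pop = [random.sample(range(n), n) for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=lambda p: qap_cost(p, dist, flow))
                for k in range(1, pop_size):
                    lam = k / pop_size                 # immigration rate grows with rank
                    donor = pop[random.randrange(k)]   # donor drawn from better habitats
                    child = list(pop[k])
                    for i in range(n):
                        if random.random() < lam:
                            # migrate donor's assignment, swapping to stay a permutation
                            j = child.index(donor[i])
                            child[i], child[j] = child[j], child[i]
                    pop[k] = tabu_refine(child, dist, flow)
            return min(pop, key=lambda p: qap_cost(p, dist, flow))

    For a real run, dist and flow would be the distance and flow matrices of a QAPLIB instance; the best-ranked habitat is left unchanged each generation as a simple form of elitism.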

  20. How the World Changes By Going from One- to Two-Dimensional Polymers in Solution.

    PubMed

    Schlüter, A Dieter; Payamyar, Payam; Öttinger, Hans Christian

    2016-10-01

    Scaling behavior of one-dimensional (1D) and two-dimensional (2D) polymers in dilute solution is discussed with the goal of stimulating experimental work by chemists, physicists, and material scientists in the emerging field of 2D polymers. The arguments are based on renormalization-group theory, which is explained for a general audience. Many ideas and methods successfully applied to 1D polymers are found not to work if one goes to 2D polymers. The role of the various states exhibiting universal behavior is turned upside down. It is expected that solubility will be a serious challenge for 2D polymers. Therefore, given the crucial importance of solutions in characterization and processing, synthetic concepts are proposed that allow the local bending rigidity and the molar mass to be tuned and the long-range interactions to be engineered, all with the goal of preventing the polymer from falling into flat or compact states. © 2016 The Authors. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Thin-layer voltammetry of soluble species on screen-printed electrodes: proof of concept.

    PubMed

    Botasini, S; Martí, A C; Méndez, E

    2016-10-17

    Thin-layer diffusion conditions were accomplished on screen-printed electrodes by placing a controlled weight onto the cast solution and allowing for its natural spreading. The restricted diffusive conditions were assessed by cyclic voltammetry at low voltage scan rates and electrochemical impedance spectroscopy. The relationship between the weight exerted over the drop and the thin-layer thickness achieved was determined, in such a way that the simple experimental set-up designed for this work could be developed into a commercial device with variable control of the thin-layer conditions. The experimental results obtained resemble those reported for the voltammetric features of electroactive soluble species employing electrodes modified with carbon nanotubes or graphene layers, suggesting that the benefits reported for these nanomaterials could be attained simply by forcing the solution to spread over the screen-printed electrode system to form a thin solution layer. The advantages of thin-layer voltammetry in the kinetic characterization of quasi-reversible and irreversible processes are highlighted.
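
    For context, the classical thin-layer relation for a reversible couple (a textbook result, not derived in the record above) shows why confining the solution to a thin layer changes the voltammetric response: the peak current becomes proportional to the scan rate rather than to its square root, as it is under semi-infinite diffusion,

        i_p = \frac{n^{2} F^{2} \nu V C^{*}}{4 R T},

    where n is the number of electrons transferred, F the Faraday constant, ν the scan rate, V the trapped solution volume, C* the bulk concentration, R the gas constant and T the temperature.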

  2. Use of the Collaborative Optimization Architecture for Launch Vehicle Design

    NASA Technical Reports Server (NTRS)

    Braun, R. D.; Moore, A. A.; Kroo, I. M.

    1996-01-01

    Collaborative optimization is a new design architecture specifically created for large-scale distributed-analysis applications. In this approach, the problem is decomposed into a user-defined number of subspace optimization problems that are driven towards interdisciplinary compatibility and the appropriate solution by a system-level coordination process. This decentralized design strategy allows domain-specific issues to be accommodated by disciplinary analysts, while requiring interdisciplinary decisions to be reached by consensus. The present investigation focuses on application of the collaborative optimization architecture to the multidisciplinary design of a single-stage-to-orbit launch vehicle. Vehicle design, trajectory, and cost issues are directly modeled. Posed to suit the collaborative architecture, the design problem is characterized by 5 design variables and 16 constraints. Numerous collaborative solutions are obtained. Comparison of these solutions demonstrates the influence that an a priori ascent-abort criterion has on development cost. Similarly, objective-function selection is discussed, demonstrating the difference between minimum-weight and minimum-cost concepts. The operational advantages of the collaborative optimization architecture are also discussed.

  3. ["Baltic Declaration"--telemedicine and mHealth as support for clinical processes in cardiology. The opinion of the Committee of Informatics and Telemedicine of the Polish Society of Cardiology and Telemedicine Clinical Sciences Committee of the PAS].

    PubMed

    Piotrowicz, Ryszard; Grabowski, Marcin; Balsam, Paweł; Kołtowski, Łukasz; Kozierkiewicz, Adam; Zajdel, Justyna; Piotrowicz, Ewa; Kowalski, Oskar; Mitkowski, Przemysław; Kaźmierczak, Jarosław; Kalarus, Zbigniew; Opolski, Grzegorz

    2015-01-01

    For several decades we have observed the development of data transmission technology on an unprecedented scale. With the development of such technology, concepts have also appeared for the use of these solutions in health care systems. Over the last decade telemedicine has been joined by the concept of mHealth, which relies on mobile devices mainly to monitor selected biomedical parameters. On 10 October 2014, during the conference Baltic Electrocardiology Autumn - Telemedicine and Arrhythmia (BEATA), a debate was held with the participation of physicians, politicians, businessmen, and representatives of the Government (Ministry of Health, National Health Fund, Social Insurance Institution) concerning the use of telecardiology services in daily practice. During the meeting, issues were discussed such as: telemedicine solutions available throughout the world, analysis of their effectiveness based on clinical trials, funding opportunities, their legal status, and the development perspectives of telecardiology in Poland. The result of the meeting was a document called the "Baltic Declaration". The declaration is a call for proven and profitable technologies to be introduced into clinical practice. The declaration also indicates that the available technological solutions are merely tools, and the utility of such tools stems not only from their modernity, but primarily from matching their functionality to the features of the health interventions that are to be improved.

  4. Portable long trace profiler: Concept and solution

    NASA Astrophysics Data System (ADS)

    Qian, Shinan; Takacs, Peter; Sostero, Giovanni; Cocco, Daniele

    2001-08-01

    Since the early development of the penta-prism long trace profiler (LTP) and the in situ LTP, and following the completion of the first in situ distortion profile measurements at Sincrotrone Trieste (ELETTRA) in Italy in 1995, a concept was developed for a compact, portable LTP with the following characteristics: easily installed on synchrotron radiation beam lines, easily carried to different laboratories around the world for measurements and calibration, convenient for evaluating the LTP as an in-process tool in the optical workshop, and convenient for temporary installation as required by other special applications. The initial design of a compact LTP optical head was made at ELETTRA in 1995. Since 1997, further efforts to reduce the optical head size and weight, and to improve measurement stability, have been made at Brookhaven National Laboratory. This article introduces the following solutions and accomplishments for the portable LTP: (1) a new design for a compact and very stable optical head, (2) the use of a small detector connected directly to a laptop computer via an enhanced parallel port, eliminating the extra frame grabber interface and control box, (3) a customized small mechanical slide that uses a compact motor with a connector-sized motor controller, and (4) the use of a laptop computer system. These solutions make the portable LTP able to be packed into two laptop-size cases: one for the computer and one for the rest of the system.

  5. Follow up on the crystal growth experiments of the LDEF

    NASA Technical Reports Server (NTRS)

    Nielsen, K. F.; Lind, M. D.

    1993-01-01

    The results of the four solution growth experiments on the LDEF have been published elsewhere. Both the crystals of CaCO3, which were large and well shaped, and the much smaller TTF-TCNQ crystals showed unusual morphological behavior. The follow-up on these experiments began in 1981, when ESA initiated a 'Concept Definition Study' on a large, 150 kg, Solution Growth Facility (SGF) to be included in the payload of EURECA-1, the European Retrievable Carrier. This carrier was a continuation of the European Spacelab and at that time was planned for launch in 1987. The long delay of the LDEF retrieval and of subsequent missions prompted reflections both on the concept of crystal growth in space and on the choice of crystallization materials that had been made for the LDEF. Already before the LDEF retrieval, research on TTF-TCNQ had been stopped, and a planned growth experiment with TTF-TCNQ on the SGF/EURECA had been cancelled. The target of the SGF investigation is now more fundamental in nature. None of the crystals to be grown here are, like TTF-TCNQ, in particular demand by science or industry; they serve only as model crystals. The real purpose of the investigation is to study the growth behavior. One of the experiments, the Soret Coefficient Measurement experiment, does not grow crystals at all; its sole purpose is to obtain accurate information on thermal diffusion, a process of importance in crystal growth from solution.

  6. Genetic algorithms for protein threading.

    PubMed

    Yadgari, J; Amir, A; Unger, R

    1998-01-01

    Despite many years of efforts, a direct prediction of protein structure from sequence is still not possible. As a result, in the last few years researchers have started to address the "inverse folding problem": identifying and aligning a sequence to the fold with which it is most compatible, a process known as "threading". In two meetings in which protein folding predictions were objectively evaluated, it became clear that threading as a concept promises a real breakthrough, but that much improvement is still needed in the technique itself. Threading is an NP-hard problem, and thus no general polynomial solution can be expected. Still, a practical approach with demonstrated ability to find optimal solutions in many cases, and acceptable solutions in other cases, is needed. We applied the technique of Genetic Algorithms in order to significantly improve the ability of threading algorithms to find the optimal alignment of a sequence to a structure, i.e. the alignment with the minimum free energy. A major advance reported here is the design of a representation of the threading alignment as a string of fixed length. With this representation, validation of alignments and genetic operators are effectively implemented. Appropriate data structures and parameters have been selected. It is shown that Genetic Algorithm threading is effective and is able to find the optimal alignment in a few test cases. Furthermore, the described algorithm is shown to perform well even without pre-definition of core elements. Existing threading methods are dependent on such constraints to make their calculations feasible, but the concept of core elements is inherently arbitrary and should be avoided if possible. While a rigorous proof is hard to offer yet, we present indications that Genetic Algorithm threading is indeed capable of consistently finding good solutions for full alignments in search spaces of size up to 10^70.
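
    As a hypothetical illustration of the fixed-length representation described above, the sketch below encodes an alignment as a list whose k-th gene is the sequence position threaded onto structure position k, with validity meaning strictly increasing positions; the energy function, operators and parameters are placeholders of our own, not the authors' code.

        import random

        def random_alignment(n_struct, n_seq):
            # Fixed-length chromosome: gene k = sequence position threaded onto
            # structure position k; a valid alignment is strictly increasing.
            return sorted(random.sample(range(n_seq), n_struct))

        def repair(chrom, n_seq):
            # Enforce validity: clamp genes to range, drop duplicates, refill, sort.
            genes = {min(max(g, 0), n_seq - 1) for g in chrom}
            pool = [i for i in range(n_seq) if i not in genes]
            while len(genes) < len(chrom):
                genes.add(pool.pop(random.randrange(len(pool))))
            return sorted(genes)

        def crossover(a, b, n_seq):
            cut = random.randrange(1, len(a))
            return repair(a[:cut] + b[cut:], n_seq)

        def mutate(chrom, n_seq, rate=0.05):
            shifted = [g + random.choice((-1, 1)) if random.random() < rate else g
                       for g in chrom]
            return repair(shifted, n_seq)

        def energy(chrom, contacts, pair_score):
            # Placeholder pseudo-energy: sum a pair score over structural contacts
            # (i, j), evaluated on the sequence positions threaded onto them.
            return sum(pair_score(chrom[i], chrom[j]) for i, j in contacts)

        def ga_thread(n_struct, n_seq, contacts, pair_score,
                      pop_size=100, generations=200):
            pop = [random_alignment(n_struct, n_seq) for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=lambda c: energy(c, contacts, pair_score))
                elite = pop[:pop_size // 2]            # keep the lower-energy half
                children = []
                for _ in range(pop_size - len(elite)):
                    a, b = random.sample(elite, 2)
                    children.append(mutate(crossover(a, b, n_seq), n_seq))
                pop = elite + children
            return min(pop, key=lambda c: energy(c, contacts, pair_score))

    Because every operator is followed by the repair step, all chromosomes remain valid alignments of fixed length, which is the property the abstract highlights.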

  7. Architectural concepts of Martian bases built: of domes, around greenhouses and into slopes -the human aspect and the technology

    NASA Astrophysics Data System (ADS)

    Kozicki, Janek; Kozicka, Joanna

    Human missions to Mars are a special kind of space mission due to their long duration. The human aspect of such missions becomes as important as the technological one, and the need for a human-friendly and comfortable habitat arises. Studies of human behavior in isolated and confined environments (ICEs) have shown that larger groups of people mean a lower occurrence of conflicts. However, for a larger crew a larger habitat has to be designed - a Martian base. The research deals with psychological, sociological and technological aspects influencing the architectural design of a Martian base. The extreme conditions present on Mars demand a particular approach to technological and architectural design. To reduce the cost of building a bigger habitat, low-cost solutions have been investigated. A series of analyses has been performed to identify the best architectural solutions for a Martian base. A review of existing technologies and extreme-condition habitats (both terrestrial and extraterrestrial) has revealed the most reliable and efficient solutions. Additionally, innovative technologies have been analyzed in search of the best candidates for actual base construction, with low-cost solutions prioritized in the process. An in-depth study of architectural problems inherent in the design of a Martian base has resulted in a number of guidelines for the architect, the main ones of which are introduced in this review. Based on them, several concepts have been drafted as examples of user-friendly and aesthetically pleasing habitats. They are discussed in the following order: habitats made of domes, those built around greenhouses and those situated in sloping terrain. One of them is presented in detail, including interior design.

  8. Comparing the acidities of aqueous, frozen, and freeze-dried phosphate buffers: Is there a "pH memory" effect?

    PubMed

    Vetráková, Ľubica; Vykoukal, Vít; Heger, Dominik

    2017-09-15

    The concept of "pH memory" has been established in the literature for the correlation between the pH of a pre-lyophilization solution and the ionization state of the freeze-dried powder (lyophile). In this paper, the concept of "pH memory" is explored for the system of an aqueous solution, a frozen solution, and a lyophile. Sodium and potassium phosphate buffers in the pH range of 5-9 were frozen and lyophilized with sulfonephthalein indicators as acidity probes, and their Hammett acidity functions were compared to the initial pH of the aqueous solution. The results show that the acidities of the lyophiles are somewhat changed compared to the initial pHs, but the acidities in the frozen state differ more substantially. The Hammett acidity functions of the frozen buffers were found to be markedly dissimilar from the initial pH, especially in the sodium phosphate buffer frozen at 233 K, where an increase in the initial pH led to a decrease in the Hammett acidity function of the frozen state over a certain pH range. The large acidification observed after freezing the sodium phosphate buffer was not detected in the lyophiles after the sample had been dried; the phenomenon is explained by considering the crystals formed, which were analyzed by X-ray powder diffraction. The results suggest that monitoring the final acidity of a lyophile is not sufficient to predict all the acidity changes throughout the whole lyophilization process. The importance of well-controlled freezing and lyophilization conditions follows from these results. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Case-based medical informatics

    PubMed Central

    Pantazi, Stefan V; Arocha, José F; Moehr, Jochen R

    2004-01-01

    Background The "applied" nature distinguishes applied sciences from theoretical sciences. To emphasize this distinction, we begin with a general, meta-level overview of the scientific endeavor. We introduce the knowledge spectrum and four interconnected modalities of knowledge. In addition to the traditional differentiation between implicit and explicit knowledge, we outline the concepts of general and individual knowledge. We connect general knowledge with the "frame problem," a fundamental issue of artificial intelligence, and individual knowledge with another important paradigm of artificial intelligence, case-based reasoning, a method of individual knowledge processing that aims at solving new problems based on the solutions to similar past problems. We outline the fundamental differences between Medical Informatics and theoretical sciences and propose that Medical Informatics research should advance individual knowledge processing (case-based reasoning) and that natural language processing research is an important step towards this goal that may have ethical implications for patient-centered medicine. Discussion We focus on fundamental aspects of decision-making, which connect human expertise with individual knowledge processing. We continue with a knowledge spectrum perspective on biomedical knowledge and conclude that case-based reasoning is the paradigm that can advance towards personalized healthcare and that can enable the education of patients and providers. We center the discussion on formal methods of knowledge representation around the frame problem. We propose a context-dependent view on the notion of "meaning" and advocate the need for case-based reasoning research and natural language processing. In the context of memory-based knowledge processing, pattern recognition, comparison and analogy-making, we conclude that while humans seem to naturally support the case-based reasoning paradigm (memory of past experiences of problem-solving and powerful case matching mechanisms), technical solutions are challenging. Finally, we discuss the major challenges for a technical solution: case record comprehensiveness, organization of information on similarity principles, development of pattern recognition, and resolution of ethical issues. Summary Medical Informatics is an applied science that should be committed to advancing patient-centered medicine through individual knowledge processing. Case-based reasoning is the technical solution that enables continuous individual knowledge processing and can be applied provided that the challenges and ethical issues that arise are addressed appropriately. PMID:15533257
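
    As a minimal illustration of the retrieval step of the case-based reasoning cycle discussed above, the hypothetical sketch below stores past cases as feature/solution pairs and returns the most similar ones for reuse; the similarity measure and case structure are illustrative assumptions only.

        import math

        def similarity(a, b):
            # Inverse Euclidean distance over the numeric features shared by two cases.
            d = math.sqrt(sum((a[k] - b[k]) ** 2 for k in a if k in b))
            return 1.0 / (1.0 + d)

        def retrieve(case_base, problem_features, k=3):
            # Rank stored cases by similarity to the new problem and return the top k;
            # their attached solutions are the starting point for reuse and revision.
            ranked = sorted(case_base,
                            key=lambda c: similarity(c["features"], problem_features),
                            reverse=True)
            return ranked[:k]

        # Example: case_base = [{"features": {"age": 63, "hr": 88}, "solution": "..."}]
        # retrieve(case_base, {"age": 60, "hr": 92}) yields the closest past cases.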

  10. Fast production of methane by anaerobic digestion. Annual progress report, May 24, 1976--May 23, 1977

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finney, C.D.; Evans II, R.S.; Finney, K.A.

    1977-06-01

    Since the production cost of methane generated by anaerobic digestion of cellulose is on the economic borderline and the cost could be reduced by increasing the rate of the digestion process, a research program was undertaken to delineate the most promising areas of development. The concept that the step involving transfer of products from solution is rate-limiting and inhibiting in anaerobic digestion was supported by all evidence available. The most significant design implication of this concept is that faster gas production can be achieved in a two-stage digestion system in which unreactive solids are eliminated after the hydrolysis step so that the effluent to the gas-producing stage possesses a low viscosity. The advantages and disadvantages of three hydrolysis methods (enzymatic, anaerobic, and acid) are reviewed.

  11. Electrodynamic pressure modulation of protein stability in cosolvents.

    PubMed

    Damodaran, Srinivasan

    2013-11-19

    Cosolvents affect structural stability of proteins in aqueous solutions. A clear understanding of the mechanism by which cosolvents impact protein stability is critical to understanding protein folding in a biological milieu. In this study, we investigated the Lifshitz-van der Waals dispersion interaction of seven different solutes with nine globular proteins and report that in an aqueous medium the structure-stabilizing solutes exert a positive electrodynamic pressure, whereas the structure-destabilizing solutes exert a negative electrodynamic pressure on the proteins. The net increase in the thermal denaturation temperature (ΔTd) of a protein in 1 M solution of various solutes was linearly related to the electrodynamic pressure (PvdW) between the solutes and the protein. The slope of the PvdW versus ΔTd plots was protein-dependent. However, we find a positive linear relationship (r² = 0.79) between the slope (i.e., d(ΔTd)/dPvdW) and the adiabatic compressibility (βs) of the proteins. Together, these results clearly indicate that Lifshitz dispersion forces are inextricably involved in solute-induced stabilization/destabilization of globular proteins. The positive and/or negative electrodynamic pressure generated by the solute-protein interaction across the water medium seems to be the fundamental mechanism by which solutes affect protein stability. This is at variance with the existing preferential hydration concept. The implication of these results is significant in the sense that, in addition to the hydrophobic effect that drives protein folding, the electrodynamic forces between the proteins and solutes in the biological milieu also might play a role in the folding process as well as in the stability of the folded state.
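
    Written out in equation form (our notation; a and b are empirical constants introduced here only for illustration), the reported relationships read

        \Delta T_d \approx k_P \, P_{\mathrm{vdW}}, \qquad
        k_P = \frac{d(\Delta T_d)}{dP_{\mathrm{vdW}}} \approx a\,\beta_s + b \quad (r^{2} = 0.79),

    i.e. solutes exerting a positive electrodynamic pressure raise the denaturation temperature and those exerting a negative pressure lower it, with the protein-dependent sensitivity k_P set largely by the adiabatic compressibility.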

  12. Computer Simulation in Predicting Biochemical Processes and Energy Balance at WWTPs

    NASA Astrophysics Data System (ADS)

    Drewnowski, Jakub; Zaborowska, Ewa; Hernandez De Vega, Carmen

    2018-02-01

    Nowadays, the use of mathematical models and computer simulation allows many different technological solutions to be analysed and various scenarios to be tested in a short time and on a low budget, in order to simulate typical operating conditions of the real system and help find the best solution in the design or operation process. The aim of the study was to evaluate different concepts of biochemical process and energy balance modelling using the simulation platform GPS-x and the comprehensive model Mantis2. The paper presents an example of the calibration and validation processes in the biological reactor, as well as scenarios showing the influence of operational parameters on the WWTP energy balance. The results of batch tests and a full-scale campaign obtained in earlier work were used to predict biochemical and operational parameters in a newly developed plant model. The model was extended with sludge treatment devices, including an anaerobic digester. Primary sludge removal efficiency was found to be a significant factor determining biogas production and, in turn, renewable energy production in cogeneration. Water and wastewater utilities, which run and control WWTPs, are interested in optimizing the process in order to protect the environment, save money, and decrease pollutant emissions to water and air. In this context, computer simulation can be the easiest and a very useful tool to improve efficiency without interfering with the actual process performance.

  13. Synthetic Foveal Imaging Technology

    NASA Technical Reports Server (NTRS)

    Hoenk, Michael; Monacos, Steve; Nikzad, Shouleh

    2009-01-01

    Synthetic Foveal imaging Technology (SyFT) is an emerging discipline of image capture and image-data processing that offers the prospect of greatly increased capabilities for real-time processing of large, high-resolution images (including mosaic images) for such purposes as automated recognition and tracking of moving objects of interest. SyFT offers a solution to the image-data processing problem arising from the proposed development of gigapixel mosaic focal-plane image-detector assemblies for very wide field-of-view imaging with high resolution for detecting and tracking sparse objects or events within narrow subfields of view. In order to identify and track the objects or events without the means of dynamic adaptation to be afforded by SyFT, it would be necessary to post-process data from an image-data space consisting of terabytes of data. Such post-processing would be time-consuming and, as a consequence, could result in missing significant events that could not be observed at all due to the time evolution of such events or could not be observed at required levels of fidelity without such real-time adaptations as adjusting focal-plane operating conditions or aiming of the focal plane in different directions to track such events. The basic concept of foveal imaging is straightforward: In imitation of a natural eye, a foveal-vision image sensor is designed to offer higher resolution in a small region of interest (ROI) within its field of view. Foveal vision reduces the amount of unwanted information that must be transferred from the image sensor to external image-data-processing circuitry. The aforementioned basic concept is not new in itself: indeed, image sensors based on these concepts have been described in several previous NASA Tech Briefs articles. Active-pixel integrated-circuit image sensors that can be programmed in real time to effect foveal artificial vision on demand are one such example. What is new in SyFT is a synergistic combination of recent advances in foveal imaging, computing, and related fields, along with a generalization of the basic foveal-vision concept to admit a synthetic fovea that is not restricted to one contiguous region of an image.

  14. Characterization and Detection of ϵ-Berge-Zhukovskii Equilibria

    PubMed Central

    Lung, Rodica Ioana; Suciu, Mihai; Gaskó, Noémi; Dumitrescu, D.

    2015-01-01

    The Berge-Zhukovskii equilibrium is an alternate solution concept in non-cooperative game theory that formalizes cooperation in a noncooperative setting. In this paper, the ϵ-Berge-Zhukovskii equilibrium is introduced and characterized by using a generative relation. The generative relation also provides a solution to the problem of computing the ϵ-Berge-Zhukovskii equilibrium for large games, by using evolutionary algorithms. Numerical examples illustrate the approach and provide a possible application for this equilibrium concept. PMID:26177217
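
    For readers unfamiliar with the solution concept, the underlying definition is usually stated as follows (a hedged sketch; the exact ϵ-relaxation used by the authors may differ). A profile s* is a Berge-Zhukovskii equilibrium of a game with players N, strategy sets S_i and payoffs u_i if, for every player i and every strategy choice s_{-i} of the remaining players,

        u_i(s_i^{*}, s_{-i}) \le u_i(s^{*}),

    i.e. no deviation by the other players can improve player i's payoff; the ϵ-version relaxes this to u_i(s_i^{*}, s_{-i}) \le u_i(s^{*}) + \epsilon for a fixed ϵ ≥ 0.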

  15. Design Rules and Analysis of a Capture Mechanism for Rendezvous between a Space Tether and Payload

    NASA Technical Reports Server (NTRS)

    Sorensen, Kirk F.; Canfield, Stephen L.; Norris, Marshall A.

    2006-01-01

    Momentum-exchange/electrodynamic reboost (MXER) tether systems have been proposed to serve as an "upper stage in space". A MXER tether station would boost spacecraft from low Earth orbit to a high-energy orbit quickly, like a high-thrust rocket. Then, it would slowly rebuild its orbital momentum through electrodynamic thrust, minimizing the use of propellant. One of the primary challenges in developing a momentum-exchange/electrodynamic reboost tether system, as identified by the 2003 MXER Technology Assessment Group, is the development of a mechanism that will enable the processes of capture, carry and release of a payload by the rotating tether as required by the MXER tether approach. This paper will present a concept that achieves the desired goals of the capture system. This solution is presented as a multi-DOF (degree-of-freedom) capture mechanism with nearly passive operation that features matching of the capture space to the expected window of capture error, efficient use of mass and nearly passive actuation during the capture process. This paper will describe the proposed capture mechanism concept and provide an evaluation of the concept through a dynamic model and experimental tests performed on a prototype article of the mechanism in a dynamically similar environment. This paper will also develop a set of rules to guide the design of such a capture mechanism based on analytical and experimental analyses. The primary contributions of this paper will be a description of the proposed capture mechanism concept, a collection of rules to guide its design, and empirical and model information that can be used to evaluate the capability of the concept.

  16. Advanced evaporator technology progress report FY 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamberlain, D.; Hutter, J.C.; Leonard, R.A.

    1995-01-01

    This report summarizes the work that was completed in FY 1992 on the program "Technology Development for Concentrating Process Streams." The purpose of this program is to evaluate and develop evaporator technology for concentrating radioactive waste and product streams such as those generated by the TRUEX process. Concentrating these streams and minimizing the volume of waste generated can significantly reduce disposal costs; however, equipment to concentrate the streams and recycle the decontaminated condensates must be installed. LICON, Inc., is developing an evaporator that shows a great deal of potential for this application. In this report, concepts that need to be incorporated into the design of an evaporator operated in a radioactive environment are discussed. These concepts include criticality safety, remote operation and maintenance, and materials of construction. Both solubility and vapor-liquid equilibrium data are needed to design an effective process for concentrating process streams. Therefore, literature surveys were completed and are summarized in this report. A model that is being developed to predict vapor phase compositions is described. A laboratory-scale evaporator was purchased and installed to study the evaporation process and to collect additional data. This unit is described in detail. Two new LICON evaporators are being designed for installation at Argonne-East in FY 1993 to process low-level radioactive waste generated throughout the laboratory. They will also provide operating data from a full-sized evaporator processing radioactive solutions. Details on these evaporators are included in this report.

  17. Utopian Kinetic Structures and Their Impact on the Contemporary Architecture

    NASA Astrophysics Data System (ADS)

    Cudzik, Jan; Nyka, Lucyna

    2017-10-01

    This paper delves into the relationships between twentieth-century utopian concepts of movable structures and the kinematic solutions implemented in contemporary architectural projects. The reason for conducting this study is to determine the impact of early architectural conceptions on today's solutions. The paper points out close links between the imagination of artists and architects working in the 1960s and 70s and the solutions implemented by contemporary architects. The research method is based on comparative analyses of architectural forms and the kinematic solutions they adopt, drawing on studies of archive drawings and the examination of theoretical concepts. The research pertains to the different forms of such mobility that evolved in the 1960s and 70s. Many of them, usually based on simple forms of movement, were realized. The more complicated ones remained in the sphere of utopian visionary architecture; in these cases the projects often exceeded the technical limitations and the capabilities of the design tools of their time. Finally, after some decades, with the development of innovative architectural design tools and new building technologies, many early visions materialized into architectural forms. In conclusion, this research indicates that modern kinematic design solutions are often based on conceptual designs formed from the beginning of the second half of the twentieth century.

  18. A multipurpose model of Hermes-Columbus docking mechanism

    NASA Technical Reports Server (NTRS)

    Gonzalez-Vallejo, J. J.; Fehse, W.; Tobias, A.

    1992-01-01

    One of the foreseen missions of the HERMES space vehicle is the servicing of the Columbus Free Flying Laboratory (MTFF). Docking between the two spacecraft is a critical operation in which the Docking Mechanism (DM) has a major role. In order to analyze and assess the robustness of the initially selected concepts and to identify suitable implementation solutions through the investigation of the main parameters involved in the docking functions, a multipurpose model of the DM was developed and tested. This paper describes its main design features as well as the calibration and testing process.

  19. Consistent Correlations for Parameterised Boolean Equation Systems with Applications in Correctness Proofs for Manipulations

    NASA Astrophysics Data System (ADS)

    Willemse, Tim A. C.

    We introduce the concept of consistent correlations for parameterised Boolean equation systems (PBESs), motivated largely by the laborious proofs of correctness required for most manipulations in this setting. Consistent correlations focus on relating the equations that occur in PBESs, rather than their solutions. For a fragment of PBESs, consistent correlations are shown to coincide with a recently introduced form of bisimulation. Finally, we show that bisimilarity on processes induces consistent correlations on PBESs encoding model checking problems. We apply our theory to two example manipulations from the literature.

  20. The role of partitioning of reagents in grafting and curing reactions initiated by ionizing radiation and UV

    NASA Astrophysics Data System (ADS)

    Chaplin, R. P.; Dworjanyn, P. A.; Gamage, N. J. W.; Garnett, J. L.; Jankiewicz, S. V.; Khan, M. A.; Sangster, D. F.

    1996-03-01

    Experimental evidence involving monomer absorption studies using tritiated styrene is shown to support the proposal that additives such as mineral acids and certain inorganic salts, when dissolved in the monomer solution, enhance radiation grafting yields by a mechanism involving partitioning of reagents. Photoinitiators such as benzoin ethyl ether and its methyl analogue are reported as new additives for the grafting of styrene in methanol to cellulose and polypropylene initiated by ionizing radiation. The partitioning concept is shown to be relevant in analogous UV grafting and curing processes.

  1. On-line upgrade of program modules using AdaPT

    NASA Technical Reports Server (NTRS)

    Waldrop, Raymond S.; Volz, Richard A.; Smith, Gary W.; Goldsack, Stephen J.; Holzbach-Valero, A. A.

    1993-01-01

    One purpose of our research is the investigation of the effectiveness and expressiveness of AdaPT, a set of language extensions to Ada 83, for distributed systems. As a part of that effort, we are now investigating the subject of replacing, e.g. upgrading, software modules while the software system remains in operation. The AdaPT language extensions provide a good basis for this investigation for several reasons: they include the concept of specific, self-contained program modules which can be manipulated; support for program configuration is included in the language; and although the discussion will be in terms of the AdaPT language, the AdaPT to Ada 83 conversion methodology being developed as another part of this project will provide a basis for the application of our findings to Ada 83 and Ada 9X systems. The purpose of this investigation is to explore the basic mechanisms of the replacement process. With this purpose in mind, we will avoid including issues whose presence would obscure these basic mechanisms by introducing additional, unrelated concerns. Thus, while replacement in the presence of real-time deadlines, heterogeneous systems, and unreliable networks is certainly a topic of interest, we will first gain an understanding of the basic processes in the absence of such concerns. The extension of the replacement process to more complex situations can be made later. A previous report established an overview of the module replacement problem, a taxonomy of the various aspects of the replacement process, and a solution to one case in the replacement taxonomy. This report provides solutions to additional cases in the replacement process taxonomy: replacement of partitions with state and replacement of nodes. The solutions presented here establish the basic principles for module replacement. Extension of these solutions to other more complicated cases in the replacement taxonomy is direct, though requiring substantial work beyond the available funding.

  2. Characteristic time scales for diffusion processes through layers and across interfaces

    NASA Astrophysics Data System (ADS)

    Carr, Elliot J.

    2018-04-01

    This paper presents a simple tool for characterizing the time scale for continuum diffusion processes through layered heterogeneous media. This mathematical problem is motivated by several practical applications such as heat transport in composite materials, flow in layered aquifers, and drug diffusion through the layers of the skin. In such processes, the physical properties of the medium vary across layers and internal boundary conditions apply at the interfaces between adjacent layers. To characterize the time scale, we use the concept of mean action time, which provides the mean time scale at each position in the medium by utilizing the fact that the transition of the transient solution of the underlying partial differential equation model, from initial state to steady state, can be represented as a cumulative distribution function of time. Using this concept, we define the characteristic time scale for a multilayer diffusion process as the maximum value of the mean action time across the layered medium. For given initial conditions and internal and external boundary conditions, this approach leads to simple algebraic expressions for characterizing the time scale that depend on the physical and geometrical properties of the medium, such as the diffusivities and lengths of the layers. Numerical examples demonstrate that these expressions provide useful insight into explaining how the parameters in the model affect the time it takes for a multilayer diffusion process to reach steady state.

  3. Characteristic time scales for diffusion processes through layers and across interfaces.

    PubMed

    Carr, Elliot J

    2018-04-01

    This paper presents a simple tool for characterizing the time scale for continuum diffusion processes through layered heterogeneous media. This mathematical problem is motivated by several practical applications such as heat transport in composite materials, flow in layered aquifers, and drug diffusion through the layers of the skin. In such processes, the physical properties of the medium vary across layers and internal boundary conditions apply at the interfaces between adjacent layers. To characterize the time scale, we use the concept of mean action time, which provides the mean time scale at each position in the medium by utilizing the fact that the transition of the transient solution of the underlying partial differential equation model, from initial state to steady state, can be represented as a cumulative distribution function of time. Using this concept, we define the characteristic time scale for a multilayer diffusion process as the maximum value of the mean action time across the layered medium. For given initial conditions and internal and external boundary conditions, this approach leads to simple algebraic expressions for characterizing the time scale that depend on the physical and geometrical properties of the medium, such as the diffusivities and lengths of the layers. Numerical examples demonstrate that these expressions provide useful insight into explaining how the parameters in the model affect the time it takes for a multilayer diffusion process to reach steady state.
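
    To make the construction concrete, the mean-action-time definition is commonly written as follows (notation is ours; c_0 and c_∞ denote the initial and steady-state solutions of the diffusion model):

        F(x,t) = 1 - \frac{c(x,t) - c_\infty(x)}{c_0(x) - c_\infty(x)}, \qquad
        \bar{t}(x) = \int_0^\infty t\,\frac{\partial F(x,t)}{\partial t}\,dt
                   = \int_0^\infty \bigl[1 - F(x,t)\bigr]\,dt,

    and the characteristic time scale of the layered medium is then taken as the maximum of \bar{t}(x) across the layers, consistent with the definition given in the abstract above.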

  4. Impact of molecular solvophobicity vs. solvophilicity on device performances of dimeric perylene diimide based solution-processed non-fullerene organic solar cells.

    PubMed

    Lu, Zhenhuan; Zhang, Xin; Zhan, Chuanlang; Jiang, Bo; Zhang, Xinliang; Chen, Lili; Yao, Jiannian

    2013-07-21

    Because of their outstanding molecular optoelectronic properties, perylene diimides (PDIs) are promising alternatives to the commonly used PCBM. However, the overly strong aggregation ability, poor solution-processability and compatibility of PDIs severely limit their photovoltaic applications. We therefore borrowed the amphiphile concept to improve these supramolecular properties. Practically, we fine-tuned the molecular solvophobicity with respect to the molecular solvophilicity, i.e. F(solvophob/solvophil), by changing the number of the weakly solvophobic 2-methoxyethoxyl (EG) groups in the bay-region of the thienyl-bridged dimeric PDI backbone, forming three PDI dimers: Bis-PDI-T (0 EG), Bis-PDI-T-EG (2 EG) and Bis-PDI-T-di-EG (4 EG) (Scheme 1). The photovoltaic properties using these dimers as the solution-processed non-fullerene electron-acceptor and P3HT as the electron-donor were investigated via the device configuration of ITO/PEDOT:PSS/P3HT:PDI dimer/Ca/Al. Bis-PDI-T exhibited overly strong aggregation ability and very poor solution-processability, which severely limited compatibility, giving a very poor power conversion efficiency (PCE) of 0.007%. When two EG groups were attached at the 1,1'-positions, the resulting Bis-PDI-T-EG showed dramatically reduced aggregation ability, improved solution-processability, compatibility and proper phase separation. Small-sized phases (∼20 nm) dominated the active layer, and the best PCE increased to 0.39%. When four EG groups were introduced, the resulting Bis-PDI-T-di-EG exhibited excellent supramolecular properties, in particular improved phase separation with an increased phase size of 24 nm and electron and hole mobilities enhanced by 2-4 times with respect to those of Bis-PDI-T-EG. The best PCE was further enhanced to 0.88%. After using 1-chloronaphthalene as the co-solvent of 1,2-dichlorobenzene to further improve the compatibility, the PCE was improved further, up to 0.41% for Bis-PDI-T, 0.76% for Bis-PDI-T-EG and 1.54% for Bis-PDI-T-di-EG.

  5. Pharmaceutical Perspective on Opalescence and Liquid-Liquid Phase Separation in Protein Solutions.

    PubMed

    Raut, Ashlesha S; Kalonia, Devendra S

    2016-05-02

    Opalescence in protein solutions reduces the aesthetic appeal of a formulation and can be an indicator of the presence of aggregates or a precursor to phase separation in solution, signifying reduced product stability. Liquid-liquid phase separation of a protein solution into a protein-rich and a protein-poor phase has been well documented for globular proteins and recently observed for monoclonal antibody solutions, resulting in physical instability of the formulation. The present review discusses opalescence and liquid-liquid phase separation (LLPS) for therapeutic protein formulations. A brief discussion of theoretical concepts based on thermodynamics, kinetics, and light scattering is presented. The review also discusses the theoretical concepts behind the intense light scattering in the vicinity of the critical point, termed "critical opalescence". Both opalescence and LLPS are affected by formulation factors including pH, ionic strength, protein concentration, temperature, and excipients. Literature reports on the effect of these formulation factors on attractive protein-protein interactions in solution, as assessed by second virial coefficient (B2) and cloud-point temperature (Tcloud) measurements, are also presented. The review also highlights the pharmaceutical implications of LLPS in protein solutions.

  6. Restoration of Secondary Containment in Double Shell Tank (DST) Pits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SHEN, E.J.

    2000-10-05

    Cracks found in many of the double-shell tank (DST) pump and valve pits bring into question the ability of the pits to provide secondary containment and remain in compliance with State and Federal regulations. This study was commissioned to identify viable options for maintaining/restoring secondary containment capability in these pits. The basis for this study is the decision analysis process, which identifies the requirements to be met and the desired goals (decision criteria) that each option will be weighed against. A facilitated workshop was convened with individuals knowledgeable of Tank Farms Operations, engineering practices, and safety/environmental requirements. The outcome of this workshop was the validation or identification of the critical requirements, definition of the current problem, identification and weighting of the desired goals, baselining of the current repair methods, and identification of potential alternate solutions. The workshop was followed up with further investigations into the potential solutions that were identified in the workshop and through other efforts. These solutions are identified in the body of this report. Each of the potential solutions was screened against the list of requirements, and only those meeting the requirements were considered viable options. To expand the field of viable options, hybrid concepts that combine the strongest features of different individual approaches were also examined, and several were identified. The decision analysis process then ranked each of the viable options against the weighted decision criteria, which resulted in a recommended solution. The recommended approach is based upon installing a sprayed-on coating system.

  7. Robust media processing on programmable power-constrained systems

    NASA Astrophysics Data System (ADS)

    McVeigh, Jeff

    2005-03-01

    To achieve consumer-level quality, media systems must process continuous streams of audio and video data while maintaining exacting tolerances on sampling rate, jitter, synchronization, and latency. While it is relatively straightforward to design fixed-function hardware implementations to satisfy worst-case conditions, there is a growing trend to utilize programmable multi-tasking solutions for media applications. The flexibility of these systems enables support for multiple current and future media formats, which can reduce design costs and time-to-market. This paper provides practical engineering solutions to achieve robust media processing on such systems, with specific attention given to power-constrained platforms. The techniques covered in this article utilize the fundamental concepts of algorithm and software optimization, software/hardware partitioning, stream buffering, hierarchical prioritization, and system resource and power management. A novel enhancement to dynamically adjust processor voltage and frequency based on buffer fullness to reduce system power consumption is examined in detail. The application of these techniques is provided in a case study of a portable video player implementation based on a general-purpose processor running a non real-time operating system that achieves robust playback of synchronized H.264 video and MP3 audio from local storage and streaming over 802.11.
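
    To illustrate the buffer-fullness idea mentioned in the last two sentences, the hypothetical sketch below steps the operating point up when the decoded-frame buffer drains and down when it fills; the frequency levels, thresholds and operating-system interface are placeholder assumptions, not the paper's implementation.

        import time

        FREQ_LEVELS_MHZ = [300, 600, 900, 1200]   # assumed available operating points

        def pick_frequency(buffer_fill, current_idx, low=0.25, high=0.75):
            """Return an index into FREQ_LEVELS_MHZ given fractional buffer fill."""
            if buffer_fill < low and current_idx < len(FREQ_LEVELS_MHZ) - 1:
                return current_idx + 1      # falling behind real time: speed up
            if buffer_fill > high and current_idx > 0:
                return current_idx - 1      # comfortably ahead: slow down, save power
            return current_idx              # inside the hysteresis band: hold

        def playback_loop(decode_frame, frames_buffered, buffer_capacity, set_freq_mhz):
            """decode_frame, frames_buffered and set_freq_mhz are caller-supplied
            callables standing in for the decoder, buffer and OS frequency control."""
            idx = len(FREQ_LEVELS_MHZ) - 1  # start at full speed to build the buffer
            while True:
                set_freq_mhz(FREQ_LEVELS_MHZ[idx])
                decode_frame()
                fill = frames_buffered() / buffer_capacity
                idx = pick_frequency(fill, idx)
                time.sleep(0)               # yield; real pacing comes from the renderer

    The hysteresis band between the low and high thresholds keeps the governor from oscillating between operating points on every frame.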

  8. The Experience Factory: Strategy and Practice

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Caldiera, Gianluigi

    1995-01-01

    The quality movement, which has had a dramatic impact on all industrial sectors in recent years, has recently reached the systems and software industry. Although some concepts of quality management, originally developed for other product types, can be applied to software, its specificity as a product which is developed and not produced requires a special approach. This paper introduces a quality paradigm specifically tailored to the problems of the systems and software industry. Reuse of products, processes and experiences originating from the system life cycle is seen today as a feasible solution to the problem of developing higher quality systems at a lower cost. In fact, quality improvement is very often achieved by defining and developing an appropriate set of strategic capabilities and core competencies to support them. A strategic capability is, in this context, a corporate goal defined by the business position of the organization and implemented by key business processes. Strategic capabilities are supported by core competencies, which are aggregate technologies tailored to the specific needs of the organization in performing the needed business processes. Core competencies are non-transitional, have a consistent evolution, and are typically fueled by multiple technologies. Their selection and development requires commitment, investment and leadership. The paradigm introduced in this paper for developing core competencies is the Quality Improvement Paradigm, which consists of six steps: (1) Characterize the environment, (2) Set the goals, (3) Choose the process, (4) Execute the process, (5) Analyze the process data, and (6) Package experience. The process must be supported by a goal-oriented approach to measurement and control, and an organizational infrastructure called the Experience Factory. The Experience Factory is a logical and physical organization distinct from the project organizations it supports. Its goal is the development and support of core competencies through capitalization and reuse of life cycle experience and products. The paper introduces the major concepts of the proposed approach, discusses their relationship with other approaches used in the industry, and presents a case in which those concepts have been successfully applied.

  9. Developing Pre-Service Teachers' Noticing of Students' Understanding of the Derivative Concept

    ERIC Educational Resources Information Center

    Sánchez-Matamoros, Gloria; Fernández, Ceneida; Llinares, Salvador

    2015-01-01

    This research study examines the development of the ability of pre-service teachers to notice signs of students' understanding of the derivative concept. It analyses preservice teachers' interpretations of written solutions to problems involving the derivative concept before and after participating in a teacher training module. The results…

  10. Using CFD as a Rocket Injector Design Tool: Recent Progress at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Tucker, Kevin; West, Jeff; Williams, Robert; Lin, Jeff; Canabal, Francisco; Rocker, Marvin; Robles, Bryan; Garcia, Robert; Chenoweth, James

    2005-01-01

    New programs are forcing American propulsion system designers into unfamiliar territory. For instance, industry's answer to the cost and reliability goals set out by the Next Generation Launch Technology Program is engine concepts based on the Oxygen-Rich Staged Combustion Cycle. Historical injector design tools are not well suited for this new task. The empirical correlations do not apply directly to the injector concepts associated with the ORSC cycle. These legacy tools focus primarily on performance, with environment evaluation a secondary objective. Additionally, the environmental capability of these tools is usually one-dimensional while the actual environments are at least two- and often three-dimensional. CFD has the potential to calculate performance and multi-dimensional environments, but its use in the injector design process has been retarded by long solution turnaround times and insufficient demonstrated accuracy. This paper has documented the parallel paths of program support and technology development currently employed at Marshall Space Flight Center in an effort to move CFD to the forefront of injector design. MSFC has established a long-term goal for use of CFD for combustion devices design. The work on injector design is the heart of that vision and of the Combustion Devices CFD Simulation Capability Roadmap that focuses the vision. The SRL concept, combining solution fidelity, robustness and accuracy, has been established as a quantitative gauge of current and desired capability. Three examples of current injector analysis for program support have been presented and discussed. These examples are used to establish the current capability at MSFC for these problems. Shortcomings identified from this experience are being used as inputs to the Roadmap process. The SRL evaluation identified lack of demonstrated solution accuracy as a major issue. Accordingly, the MSFC view of code validation and current MSFC-funded validation efforts were discussed in some detail. The objectives of each effort were noted. Issues relative to code validation for injector design were discussed in some detail. The requirement for CFD support during the design of the experiment was noted and discussed in terms of instrumentation placement and experimental rig uncertainty. In conclusion, MSFC has made significant progress in the last two years in advancing CFD toward the goal of application to injector design. The parallel efforts focused on program support and technology development via the SCIT Task have enabled this progress.

  11. Axicons, prisms and integrators: searching for simple laser beam shaping solutions

    NASA Astrophysics Data System (ADS)

    Lizotte, Todd

    2010-08-01

    Over the last thirty-five years, many papers have been presented at numerous conferences and published in a host of optical journals. What is presented is, in many cases, either too exotic or too technically challenging for practical application, though both are testaments to the imagination of engineers and researchers. For many brute-force laser processing applications, such as paint stripping, large-area ablation or general skiving of flex circuits, an inexpensive beam shaper is a welcome tool. Shaping the laser beam for less demanding applications provides a more uniform removal rate and increases the overall quality of the part being processed. It is a well-known fact that customers like their parts to look good. Many times, complex optical beam shaping techniques are considered because no one is aware of the historical solutions that have been lost to the ages. These complex solutions can range in price from 10,000 to 60,000 and require many months to design and fabricate. This paper will provide an overview of various beam shaping techniques that are both elegant and simple in concept and design. Optical techniques using axicons, prisms and reflective integrators will be discussed in an overview format.

  12. Implementation of the NANoREG Safe-by-Design approach for different nanomaterial applications

    NASA Astrophysics Data System (ADS)

    Micheletti, C.; Roman, M.; Tedesco, E.; Olivato, I.; Benetti, F.

    2017-06-01

    The Safe-by-Design (SbD) concept is already in use in different industrial sectors as an integral part of the innovation process management. However, the adopted approach is often limited to design solutions aiming at hazard reduction. Safety is not always considered during the innovation process, mainly due to the lack of knowledge (e.g. in small and medium companies, SMEs) and the lack of dialogue between actors along the innovation chain. The net result is that safety is considered only at the end of the innovation process, at the market authorization phase, with potential loss of time and money. This is especially valid for manufactured nanomaterials (MNM), for which the regulatory context is not completely developed and the safety knowledge is not readily available. In order to contribute to a sustainable innovation process in the nanotechnology field by maximising both benefits and safety, the NANoREG project developed a Safe Innovation approach based on two elements: the Safe-by-Design approach, which aims at including risk assessment in all innovation stages; and Regulatory Preparedness, focused on the dialogue with stakeholders along the innovation chain. In this work we present some examples of the implementation of this approach in our laboratory for different MNM applications, covering different steps of the innovation chain. The case studies include: the feasibility study of a medical device including substances, for topical application; the testing of two potential nanotech solutions for the consolidation of cultural heritage artifacts; and the testing of coatings already on the market for other uses, which were evaluated as food contact materials (FCM) to assess their conformity to food applications. These three examples represent a good opportunity to show the importance of the NANoREG SbD and Safe Innovation approach in general for developing new nanotechnology-based products, also highlighting the crucial role of the EU ProSafe project in promoting this concept to industries and interested stakeholders.

  13. Critical time scales for advection-diffusion-reaction processes

    NASA Astrophysics Data System (ADS)

    Ellery, Adam J.; Simpson, Matthew J.; McCue, Scott W.; Baker, Ruth E.

    2012-04-01

    The concept of local accumulation time (LAT) was introduced by Berezhkovskii and co-workers to give a finite measure of the time required for the transient solution of a reaction-diffusion equation to approach the steady-state solution [A. M. Berezhkovskii, C. Sample, and S. Y. Shvartsman, Biophys. J. 99, L59 (2010); A. M. Berezhkovskii, C. Sample, and S. Y. Shvartsman, Phys. Rev. E 83, 051906 (2011)]. Such a measure is referred to as a critical time. Here, we show that LAT is, in fact, identical to the concept of mean action time (MAT) that was first introduced by McNabb [A. McNabb and G. C. Wake, IMA J. Appl. Math. 47, 193 (1991)]. Although McNabb's initial argument was motivated by considering the mean particle lifetime (MPLT) for a linear death process, he applied the ideas to study diffusion. We extend the work of these authors by deriving expressions for the MAT for a general one-dimensional linear advection-diffusion-reaction problem. Using a combination of continuum and discrete approaches, we show that MAT and MPLT are equivalent for certain uniform-to-uniform transitions; these results provide a practical interpretation for MAT by directly linking the stochastic microscopic processes to a meaningful macroscopic time scale. We find that for more general transitions, the equivalence between MAT and MPLT does not hold. Unlike other critical time definitions, we show that it is possible to evaluate the MAT without solving the underlying partial differential equation (pde). This makes MAT a simple and attractive quantity for practical situations. Finally, our work explores the accuracy of certain approximations derived using MAT, showing that useful approximations for nonlinear kinetic processes can be obtained, again without treating the governing pde directly.
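    For readers unfamiliar with the mean action time, one common form of the definition (stated here for a transition from an initial profile c(x,0) to a steady state c_s(x); the general advection-diffusion-reaction expressions derived in the paper are not reproduced) is:

```latex
% F(x,t) rises monotonically from 0 to 1 and is treated as a cumulative
% distribution function in t; the mean action time T(x) is its mean.
\[
  F(x,t) \;=\; 1 \;-\; \frac{c(x,t) - c_s(x)}{c(x,0) - c_s(x)},
  \qquad
  T(x) \;=\; \int_0^{\infty} t\,\frac{\partial F}{\partial t}\,\mathrm{d}t
        \;=\; \int_0^{\infty} \bigl[\,1 - F(x,t)\,\bigr]\,\mathrm{d}t .
\]
```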

  14. Distinguishing values from science in decision making: Setting harvest quotas for mountain lions in Montana

    USGS Publications Warehouse

    Mitchell, Michael S.; Cooley, Hilary; Gude, Justin A.; Kolbe, Jay; Nowak, J. Joshua; Proffitt, Kelly M.; Sells, Sarah N.; Thompson, Mike

    2018-01-01

    The relative roles of science and human values can be difficult to distinguish when informal processes are used to make complex and contentious decisions in wildlife management. Structured Decision Making (SDM) offers a formal process for making such decisions, where scientific results and concepts can be disentangled from the values of differing stakeholders. We used SDM to formally integrate science and human values for a citizen working group of ungulate hunting advocates, lion hunting advocates, and outfitters convened to address the contentious allocation of harvest quotas for mountain lions (Puma concolor) in west‐central Montana, USA, during 2014. A science team consisting of mountain lion biologists and population ecologists convened to support the working group. The science team used integrated population models that incorporated 4 estimates of mountain lion density to estimate population trajectories for 5 alternative harvest quotas developed by the working group. Results of the modeling predicted that effects of each harvest quota were consistent across the 4 density estimates; harvest quotas affected predicted population trajectories for 5 years after implementation but differences were not strong. Based on these results, the focus of the working group changed to differences in values among stakeholders that were the true impediment to allocating harvest quotas. By distinguishing roles of science and human values in this process, the working group was able to collaboratively recommend a compromise solution. This solution differed little from the status quo that had been the focus of debate, but the SDM process produced understanding and buy‐in among stakeholders involved, reducing disagreements, misunderstanding, and unproductive arguments founded on informal application of scientific data and concepts. Whereas investments involved in conducting SDM may be unnecessary for many decisions in wildlife management, the investment may be beneficial for complex, contentious, and multiobjective decisions that integrate science and human values.

  15. Industrial noise control: Some case histories, volume 1

    NASA Technical Reports Server (NTRS)

    Hart, F. D.; Neal, C. L.; Smetana, F. O.

    1974-01-01

    A collection of solutions to industrial noise problems is presented. Each problem is described in simple terms, with noise measurements where available, and the solution is given, often with explanatory figures. Where the solution rationale is not obvious, an explanatory paragraph is usually appended. As a preface to these solutions, a short exposition is provided of some of the guiding concepts used by noise control engineers in devising their solutions.

  16. Illustrating Chemical Concepts through Food Systems: Introductory Chemistry Experiments.

    ERIC Educational Resources Information Center

    Chambers, E., IV; Setser, C. S.

    1980-01-01

    Demonstrations involving foods that illustrate chemical concepts are described, including vaporization of liquids and Graham's law of diffusion, chemical reaction rates, adsorption, properties of solutions, colloidal dispersions, suspensions, and hydrogen ion concentration. (CS)

  17. Applications of aerospace technology to petroleum extraction and reservoir engineering

    NASA Technical Reports Server (NTRS)

    Jaffe, L. D.; Back, L. H.; Berdahl, C. M.; Collins, E. E., Jr.; Gordon, P. G.; Houseman, J.; Humphrey, M. F.; Hsu, G. C.; Ham, J. D.; Marte, J. E.

    1977-01-01

    Through contacts with the petroleum industry, the petroleum service industry, universities and government agencies, important petroleum extraction problems were identified. For each problem, areas of aerospace technology that might aid in its solution were also identified, where possible. Some of the problems were selected for further consideration. Work on these problems led to the formulation of specific concepts as candidates for development. Each concept addresses the solution of specific extraction problems and makes use of specific areas of aerospace technology.

  18. Accurate Micro-Tool Manufacturing by Iterative Pulsed-Laser Ablation

    NASA Astrophysics Data System (ADS)

    Warhanek, Maximilian; Mayr, Josef; Dörig, Christian; Wegener, Konrad

    2017-12-01

    Iterative processing solutions, comprising multiple cycles of material removal and measurement, are capable of achieving higher geometric accuracy by compensating for most deviations manifesting directly on the workpiece. The remaining error sources are the measurement uncertainty and the repeatability of the material-removal process, including clamping errors. Owing to the absence of processing forces, process fluids and wear, pulsed-laser ablation has proven highly repeatable and can be realized directly on a measuring machine. This work takes advantage of this possibility by implementing an iterative, laser-based correction process for profile deviations registered directly on an optical measurement machine. This enables efficient iterative processing that is precise, applicable to all tool materials including diamond, and free of clamping errors. The concept is proven by a prototypical implementation on an industrial tool measurement machine and a nanosecond fibre laser. A number of measurements are performed on both the machine and the processed workpieces. Results show production deviations within a 2 μm diameter tolerance.

  19. Spectroscopic methods of process monitoring for safeguards of used nuclear fuel separations

    NASA Astrophysics Data System (ADS)

    Warburton, Jamie Lee

    To support the demonstration of a more proliferation-resistant nuclear fuel processing plant, techniques and instrumentation to allow the real-time, online determination of special nuclear material concentrations in-process must be developed. An ideal materials accountability technique for proliferation resistance should provide nondestructive, realtime, on-line information of metal and ligand concentrations in separations streams without perturbing the process. UV-Visible spectroscopy can be adapted for this precise purpose in solvent extraction-based separations. The primary goal of this project is to understand fundamental URanium EXtraction (UREX) and Plutonium-URanium EXtraction (PUREX) reprocessing chemistry and corresponding UV-Visible spectroscopy for application in process monitoring for safeguards. By evaluating the impact of process conditions, such as acid concentration, metal concentration and flow rate, on the sensitivity of the UV-Visible detection system, the process-monitoring concept is developed from an advanced application of fundamental spectroscopy. Systematic benchtop-scale studies investigated the system relevant to UREX or PUREX type reprocessing systems, encompassing 0.01-1.26 M U and 0.01-8 M HNO3. A laboratory-scale TRansUranic Extraction (TRUEX) demonstration was performed and used both to analyze for potential online monitoring opportunities in the TRUEX process, and to provide the foundation for building and demonstrating a laboratory-scale UREX demonstration. The secondary goal of the project is to simulate a diversion scenario in UREX and successfully detect changes in metal concentration and solution chemistry in a counter current contactor system with a UV-Visible spectroscopic process monitor. UREX uses the same basic solvent extraction flowsheet as PUREX, but has a lower acid concentration throughout and adds acetohydroxamic acid (AHA) as a complexant/reductant to the feed solution to prevent the extraction of Pu. By examining UV-Visible spectra gathered in real time, the objective is to detect the conversion from the UREX process, which does not separate Pu, to the PUREX process, which yields a purified Pu product. The change in process chemistry can be detected in the feed solution, aqueous product or in the raffinate stream by identifying the acid concentration, metal distribution and the presence or absence of AHA. A fiber optic dip probe for UV-Visible spectroscopy was integrated into a bank of three counter-current centrifugal contactors to demonstrate the online process monitoring concept. Nd, Fe and Zr were added to the uranyl nitrate system to explore spectroscopic interferences and identify additional species as candidates for online monitoring. This milestone is a demonstration of the potential of this technique, which lies in the ability to simultaneously and directly monitor the chemical process conditions in a reprocessing plant, providing inspectors with another tool to detect nuclear material diversion attempts. Lastly, dry processing of used nuclear fuel is often used as a head-end step before solvent extraction-based separations such as UREX or TRUEX. A non-aqueous process, used fuel treatment by dry processing generally includes chopping of used fuel rods followed by repeated oxidation-reduction cycles and physical separation of the used fuel from the cladding. Thus, dry processing techniques are investigated and opportunities for online monitoring are proposed for continuation of this work in future studies.
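    A minimal sketch of how measured UV-Visible absorbance spectra can be turned into concentration estimates by linear least squares (Beer-Lambert law) is shown below; the reference spectra and measurement are synthetic placeholders, not data from the UREX/PUREX study.

```python
# Beer-Lambert mixture analysis: A(lambda) = sum_i eps_i(lambda) * l * c_i.
# Reference spectra below are hypothetical Gaussian absorption bands.

import numpy as np

wavelengths = np.linspace(350, 500, 151)          # nm
eps_u  = np.exp(-((wavelengths - 415) / 12.0) ** 2)   # assumed species 1 profile
eps_nd = np.exp(-((wavelengths - 470) / 18.0) ** 2)   # assumed species 2 profile
E = np.column_stack([eps_u, eps_nd])              # pathlength folded into E

true_conc = np.array([0.8, 0.2])                  # arbitrary units
measured = E @ true_conc + 0.01 * np.random.randn(len(wavelengths))

# Non-negative least squares would be more robust; plain lstsq shown for brevity.
est_conc, *_ = np.linalg.lstsq(E, measured, rcond=None)
print("estimated concentrations:", est_conc)
```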

  20. A new scheme of the time-domain fluorescence tomography for a semi-infinite turbid medium

    NASA Astrophysics Data System (ADS)

    Prieto, Kernel; Nishimura, Goro

    2017-04-01

    A new scheme for reconstruction of a fluorophore target embedded in a semi-infinite medium was proposed and evaluated. In this scheme, we neglected the presence of the fluorophore target for the excitation light and used an analytical solution of the time-dependent radiative transfer equation (RTE) for the excitation light in a homogeneous semi-infinite medium instead of solving the RTE numerically in the forward calculation. The inverse problem for imaging the fluorophore target was solved using the Landweber-Kaczmarz method with the concept of the adjoint fields. Numerical experiments show that the proposed scheme provides acceptable results for the reconstructed shape and location of the target. The computation times of the solution of the forward problem and the whole reconstruction process were reduced by about 40% and 15%, respectively.
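    As a reference for the iterative scheme named above, a schematic Landweber iteration for a linear problem A x = y is sketched below; the actual fluorescence reconstruction is nonlinear and uses adjoint RTE fields, so this is illustrative only.

```python
# Schematic Landweber iteration on a synthetic linear problem.

import numpy as np

def landweber(A, y, step, n_iter=200):
    """x_{k+1} = x_k + step * A^T (y - A x_k); converges for
    0 < step < 2 / ||A||_2^2."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + step * A.T @ (y - A @ x)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 30))
x_true = rng.standard_normal(30)
y = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2            # spectral norm squared
x_est = landweber(A, y, step)
print("relative error:", np.linalg.norm(x_est - x_true) / np.linalg.norm(x_true))
```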

  1. Analytical applications of emulsions and microemulsions.

    PubMed

    Burguera, José Luis; Burguera, Marcela

    2012-07-15

    Dispersion systems like emulsions and microemulsions are able to solubilize both polar and non-polar substances due to the special arrangement of the oil and aqueous phases. The main advantages of using emulsions or microemulsions in analytical chemistry are that they do not require the previous destruction of the sample matrix or the use of organic solvents as diluents, and behave similarly to aqueous solutions, frequently allowing the use of aqueous standard solutions for calibration. However, it appears that there are many contradictory concepts and misunderstandings often related to terms definition when referring to such systems. The main aim of this review is to outline the differences between these two aggregates and to give an overview of the most recent advances on their analytical applications with emphasis on the potentiality of the on-line emulsification processes. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Managing Communication among Geographically Distributed Teams: A Brazilian Case

    NASA Astrophysics Data System (ADS)

    Almeida, Ana Carina M.; de Farias Junior, Ivaldir H.; de S. Carneiro, Pedro Jorge

    The growing demand for qualified professionals is leading software companies to opt for distributed software development (DSD). At project conception, communication and synchronization of information are critical factors for success. However, problems such as time-zone differences between teams, culture, language and different development processes among sites can hinder communication among teams. Accordingly, the main goal of this paper is to describe the solution adopted by a Brazilian team to improve communication in a multisite project environment. The proposed solution was based on the best practices described in the literature, and the communication plan was created based on the infrastructure needed by the project. The outcome of this work is to minimize the impact of communication issues in multisite projects, increasing productivity, improving understanding and avoiding rework in code and document writing.

  3. Distributed heterogeneous inspecting system and its middleware-based solution.

    PubMed

    Huang, Li-can; Wu, Zhao-hui; Pan, Yun-he

    2003-01-01

    There are many cases when an organization needs to monitor the data and operations of its supervised departments, especially those departments which are not owned by this organization and are managed by their own information systems. A Distributed Heterogeneous Inspecting System (DHIS) is the system an organization uses to monitor its supervised departments by inspecting their information systems. In DHIS, the inspected systems are generally distributed, heterogeneous, and constructed by different companies. DHIS has three key processes: abstracting core data sets and core operation sets, collecting these sets, and inspecting these collected sets. In this paper, we present the concept and mathematical definition of DHIS, a metadata method for solving the interoperability problem, a security strategy for data transfer, and a middleware-based solution for DHIS. We also describe an example of the inspecting system at the Wenzhou Customs.

  4. The generalized formula for angular velocity vector of the moving coordinate system

    NASA Astrophysics Data System (ADS)

    Ermolin, Vladislav S.; Vlasova, Tatyana V.

    2018-05-01

    There are various ways of introducing the concept of the instantaneous angular velocity vector. In this paper we propose a method that introduces this concept by constructing the solution of the system of kinematic equations. These equations connect the vector functions defining the motion of the basis with their derivatives. Necessary and sufficient conditions for the existence and uniqueness of the solution of this system are established. The instantaneous angular velocity vector is the solution of an algebraic system of equations, and it is constructed explicitly. The derived formulas for the angular velocity vector generalize earlier results, both for a basis of an affine oblique coordinate system and for an orthonormal basis.
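    For the orthonormal case, the construction reduces to a well-known formula, shown below for orientation; the paper's generalization to oblique affine bases is not reproduced here.

```latex
% Poisson kinematic equations for a moving orthonormal basis e_i(t) and the
% explicit solution of the resulting algebraic system for omega.
\[
  \dot{\mathbf{e}}_i \;=\; \boldsymbol{\omega} \times \mathbf{e}_i \quad (i = 1,2,3),
  \qquad
  \boldsymbol{\omega} \;=\; \tfrac{1}{2} \sum_{i=1}^{3} \mathbf{e}_i \times \dot{\mathbf{e}}_i .
\]
```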

  5. Employment of Gibbs-Donnan-based concepts for interpretation of the properties of linear polyelectrolyte solutions

    USGS Publications Warehouse

    Marinsky, J.A.; Reddy, M.M.

    1991-01-01

    Earlier research has shown that the acid dissociation and metal ion complexation equilibria of linear, weak-acid polyelectrolytes and their cross-linked gel analogues are similarly sensitive to the counterion concentration levels of their solutions. Gibbs-Donnan-based concepts, applicable to the gel, are equally applicable to the linear polyelectrolyte for the accommodation of this sensitivity to ionic strength. This result is presumed to indicate that the linear polyelectrolyte in solution develops counterion-concentrating regions that closely resemble the gel phase of their analogues. Advantage has been taken of this description of linear polyelectrolytes to estimate the solvent uptake by these regions. © 1991 American Chemical Society.

  6. ASP archiving solution of regional HUSpacs.

    PubMed

    Pohjonen, Hanna; Kauppinen, Tomi; Ahovuo, Juhani

    2004-09-01

    The application service provider (ASP) model is not novel, but widely used in several non-health care-related business areas. In this article, ASP is described as a potential solution for long-term and back-up archiving of the picture archiving and communication system (PACS) of the Hospital District of Helsinki and Uusimaa (HUS). HUSpacs is a regional PACS for 21 HUS hospitals serving altogether 1.4 million citizens. The ultimate goal of this study was to define the specifications for the ASP archiving service and to compare different commercial options for archiving solutions (costs derived by unofficial requests for proposal): in-house PACS components, the regional ASP concept and the hospital-based ASP concept. In conclusion, the large scale of the HUS installation enables a cost-effective regional ASP archiving, resulting in a four to five times more economical solution than hospital-based ASP.

  7. EFFECTIVE POROSITY IMPLIES EFFECTIVE BULK DENSITY IN SORBING SOLUTE TRANSPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, G.

    2012-02-27

    The concept of an effective porosity is widely used in solute transport modeling to account for the presence of a fraction of the medium that effectively does not influence solute migration, apart from taking up space. This non-participating volume or ineffective porosity plays the same role as the gas phase in single-phase liquid unsaturated transport: it increases pore velocity, which is useful towards reproducing observed solute travel times. The prevalent use of the effective porosity concept is reflected by its prominent inclusion in popular texts, e.g., de Marsily (1986), Fetter (1988, 1993) and Zheng and Bennett (2002). The purpose of this commentary is to point out that proper application of the concept for sorbing solutes requires more than simply reducing porosity while leaving other material properties unchanged. More specifically, effective porosity implies the corresponding need for an effective bulk density in a conventional single-porosity model. The reason is that the designated non-participating volume is composed of both solid and fluid phases, both of which must be neglected for consistency. Said another way, if solute does not enter the ineffective porosity then it also cannot contact the adjoining solid. Conceptually neglecting the fluid portion of the non-participating volume leads to a lower (effective) porosity. Likewise, discarding the solid portion of the non-participating volume inherently leads to a lower or effective bulk density. In the author's experience, practitioners virtually never adjust bulk density when adopting the effective porosity approach.
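    The point can be illustrated with the standard single-porosity retardation factor; the scaling below is a sketch that assumes, for simplicity, that the non-participating volume has the same solid-to-fluid proportions as the medium as a whole.

```latex
% R is the retardation factor, rho_b the bulk density, K_d the sorption
% distribution coefficient, theta the porosity. If only a fraction f of the
% medium participates, consistency requires scaling both quantities
% (theta_e = f*theta and rho_{b,e} = f*rho_b); reducing porosity alone while
% keeping the full bulk density would overstate retardation.
\[
  R \;=\; 1 + \frac{\rho_{b,e} K_d}{\theta_e}
    \;=\; 1 + \frac{\rho_b K_d}{\theta},
  \qquad \theta_e = f\,\theta,\;\; \rho_{b,e} = f\,\rho_b .
\]
```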

  8. Application of the suggestion system in the improvement of the production process and product quality control

    NASA Astrophysics Data System (ADS)

    Gołaś, H.; Mazur, A.; Gruszka, J.; Szafer, P.

    2016-08-01

    This paper is a case study; the research was carried out in the company Alco-Mot Ltd., which employs 120 people. The company specializes in the production of lead poles for industrial and traction batteries using gravity casting. The elements embedded in the cast are manufactured on a machining centre, which provides stability of the process and of the dimensions of the product as well as a very short production time. As a result of observation and analysis, the authors have developed a concept for the implementation of a dynamic suggestion system in ALCO-MOT, including, among other things, a standard for actions in the implementation of the suggestion system, as well as clear guidelines for the processing and presentation of the activities undertaken between the establishment of the concept (suggestions) and the benefits analysis after the proposed solutions have been implemented. The authors also present how suggestions proposed by ALCO-MOT staff contributed to the improvement of the production and quality control processes. Employees offered more than 30 suggestions, of which more than half are now being implemented, and further actions are being prepared for implementation. The authors present the results of improvements in, for example, tool replacement time and scrap reduction, show how kaizen can improve the production and quality control processes, and compare how these processes looked before and after the implementation of employee suggestions.

  9. Elements of an algorithm for optimizing a parameter-structural neural network

    NASA Astrophysics Data System (ADS)

    Mrówczyńska, Maria

    2016-06-01

    The processing of information provided by measurement results is one of the most important components of geodetic technologies. The dynamic development of this field improves classic numerical algorithms in cases where analytical solutions are difficult to obtain. Algorithms based on artificial intelligence in the form of artificial neural networks, including the topology of connections between neurons, have become an important instrument for processing and modelling. This concept results from the integration of neural networks and parameter optimization methods and makes it possible to avoid the need to arbitrarily define the structure of a network. This kind of extension of the training process is exemplified by the Group Method of Data Handling (GMDH) algorithm, which belongs to the class of evolutionary algorithms. The article presents a GMDH-type network used for modelling deformations of the geometrical axis of a steel chimney during its operation.
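    For reference, GMDH networks are commonly built from pairwise "partial descriptions" of the Ivakhnenko polynomial form shown below, with coefficients fitted by least squares and candidate neurons kept or discarded by an external (validation) criterion; the specific variant used for the chimney-deformation model may differ.

```latex
% Quadratic partial description of two inputs x_i, x_j used at each GMDH layer.
\[
  \hat{y} \;=\; a_0 + a_1 x_i + a_2 x_j + a_3 x_i x_j + a_4 x_i^2 + a_5 x_j^2
\]
```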

  10. Teaching Workflow Analysis and Lean Thinking via Simulation: A Formative Evaluation

    PubMed Central

    Campbell, Robert James; Gantt, Laura; Congdon, Tamara

    2009-01-01

    This article presents the rationale for the design and development of a video simulation used to teach lean thinking and workflow analysis to health services and health information management students enrolled in a course on the management of health information. The discussion includes a description of the design process, a brief history of the use of simulation in healthcare, and an explanation of how video simulation can be used to generate experiential learning environments. Based on the results of a survey given to 75 students as part of a formative evaluation, the video simulation was judged effective because it allowed students to visualize a real-world process (concrete experience), contemplate the scenes depicted in the video along with the concepts presented in class in a risk-free environment (reflection), develop hypotheses about why problems occurred in the workflow process (abstract conceptualization), and develop solutions to redesign a selected process (active experimentation). PMID:19412533

  11. Numerical stability in problems of linear algebra.

    NASA Technical Reports Server (NTRS)

    Babuska, I.

    1972-01-01

    Mathematical problems are introduced as mappings from the space of input data to that of the desired output information. A numerical process is then defined as a prescribed recurrence of elementary operations realizing the mapping of the underlying mathematical problem. The ratio of the error committed by executing the operations of the numerical process (the roundoff errors) to the error introduced by perturbations of the input data (initial error) gives rise to the concept of lambda-stability. As examples, several processes are analyzed from this point of view, including, especially, old and new processes for solving systems of linear algebraic equations with tridiagonal matrices. In particular, it is shown how a priori information, for instance knowledge of the row sums of the matrix, can be utilized. Information of this type is frequently available where the system arises in connection with the numerical solution of differential equations.
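    As a concrete instance of the tridiagonal solvers discussed above, a compact Thomas-algorithm sketch is given below; it is illustrative only and omits pivoting and the use of a priori information (such as known row sums) analyzed in the paper.

```python
# Thomas algorithm (forward elimination + back substitution) for a
# tridiagonal system. a: sub-diagonal (len n-1), b: diagonal (len n),
# c: super-diagonal (len n-1), d: right-hand side (len n).

def thomas(a, b, c, d):
    n = len(b)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        denom = b[i] - a[i - 1] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Small symmetric test system: diag 4, off-diagonals 1, rhs all 6.
print(thomas([1.0, 1.0], [4.0, 4.0, 4.0], [1.0, 1.0], [6.0, 6.0, 6.0]))
```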

  12. Wikipedia Entries as a Source of CAR Navigation Landmarks

    NASA Astrophysics Data System (ADS)

    Binski, N.; Zhang, L.; Dalyot, S.

    2016-06-01

    Car navigation systems today provide an easy and simple solution to the basic task of reaching a destination. Although these systems usually achieve this goal, they still deliver a limited and poor sequence of instructions that does not consider the human tendency to use landmarks during wayfinding. This research paper addresses the concept of enriching navigation route instructions by adding supplementary route information in the form of landmarks. We aim at using a contributed source of landmark information that is easy to access, readily available, frequently updated, and large in scale. For this, Wikipedia was chosen, since it represents the world's largest free encyclopaedia and includes information about many spatial entities. A survey and classification of available landmarks is implemented, coupled with ranking algorithms based on the entries' categories and attributes. These are aimed at retrieving the landmark information most relevant for the enrichment of a specific navigation route. The paper presents this methodology, together with examples and results, showing the feasibility of this concept and its potential for enriching navigation processes.

  13. Considerations in the Integration of Small Aircraft Transportation System Higher Volume Operations (SATSHVO) in the National Airspace System (NAS)

    NASA Technical Reports Server (NTRS)

    Lohr, Gary W.; Williams, Dan; Abbott, Terence; Baxley, Brian; Greco, Adam; Ridgway, Richard

    2005-01-01

    The Small Aircraft Transportation System Higher Volume Operations (SATS HVO) concept holds promise for increased efficiency and throughput at many of the nation's under-used airports. This concept allows for concurrent operations at uncontrolled airports that, under today's procedures, are restricted to one arrival or one departure operation at a time when current-day IFR separation standards are applied. To allow for concurrent operations, SATS HVO proposes several fundamental changes to today's system. These changes include the creation of dedicated airspace, the development of new procedures and communications (phraseologies), and the assignment of roles and responsibilities for pilots and controllers, among others. These changes would affect operations on the airborne side (pilot) as well as the ground side (controller and air traffic flow process). The focus of this paper is to discuss some of the issues and potential problems that have been considered in the development of the SATS HVO concept, in particular from the ground side perspective. Reasonable solutions to the issues raised here have been proposed by the SATS HVO team and are discussed in this paper.

  14. A Concept Analysis of Systems Thinking.

    PubMed

    Stalter, Ann M; Phillips, Janet M; Ruggiero, Jeanne S; Scardaville, Debra L; Merriam, Deborah; Dolansky, Mary A; Goldschmidt, Karen A; Wiggs, Carol M; Winegardner, Sherri

    2017-10-01

    This concept analysis, written by the National Quality and Safety Education for Nurses (QSEN) RN-BSN Task Force, defines systems thinking in relation to healthcare delivery. A review of the literature was conducted using five databases with the keywords "systems thinking" as well as "nursing education," "nursing curriculum," "online," "capstone," "practicum," "RN-BSN/RN to BSN," "healthcare organizations," "hospitals," and "clinical agencies." Only articles that focused on systems thinking in health care were used. The authors identified defining attributes, antecedents, consequences, and empirical referents of systems thinking. Systems thinking was defined as a process applied to individuals, teams, and organizations to impact cause and effect where solutions to complex problems are accomplished through collaborative effort according to personal ability with respect to improving components and the greater whole. Four primary attributes characterized systems thinking: dynamic system, holistic perspective, pattern identification, and transformation. Using the platform provided in this concept analysis, interprofessional practice has the ability to embrace planned efforts to improve critically needed quality and safety initiatives across patients' lifespans and all healthcare settings. © 2016 Wiley Periodicals, Inc.

  15. Confusion of recovery: one solution.

    PubMed

    Collier, Elizabeth

    2010-02-01

    This paper questions the current mental health discourse that offers new definitions of the concept of 'recovery' and offers a different perspective that aims to clarify its meaning. Confusion is caused when medical language continues to be used in discussions that aim to challenge traditional medical understanding of the term 'recovery' (meaning cure). Medical and non-medical concepts of recovery are referred to interchangeably in many narratives, and the common references to and acceptance of the Harding et al. papers and similar papers that report on how people can 'get better' from schizophrenia perpetuate this confusion. In this paper, it is suggested that 'recovery' should not be viewed as having new meaning, but that two different concepts have been confused, with the same word having been used to describe two completely different things altogether. This means that what is referred to in this paper as 'medical' recovery (traditional definitions of recovery that aim for cure) becomes subordinate to 'life' recovery (personal development and change), in which psychiatric classification might have no part in a person's understanding of their experience and where improving 'symptoms' could be irrelevant in the personal process of growth and discovery.

  16. KaBOB: ontology-based semantic integration of biomedical databases.

    PubMed

    Livingston, Kevin M; Bada, Michael; Baumgartner, William A; Hunter, Lawrence E

    2015-04-23

    The ability to query many independent biological databases using a common ontology-based semantic model would facilitate deeper integration and more effective utilization of these diverse and rapidly growing resources. Despite ongoing work moving toward shared data formats and linked identifiers, significant problems persist in semantic data integration in order to establish shared identity and shared meaning across heterogeneous biomedical data sources. We present five processes for semantic data integration that, when applied collectively, solve seven key problems. These processes include making explicit the differences between biomedical concepts and database records, aggregating sets of identifiers denoting the same biomedical concepts across data sources, and using declaratively represented forward-chaining rules to take information that is variably represented in source databases and integrating it into a consistent biomedical representation. We demonstrate these processes and solutions by presenting KaBOB (the Knowledge Base Of Biomedicine), a knowledge base of semantically integrated data from 18 prominent biomedical databases using common representations grounded in Open Biomedical Ontologies. An instance of KaBOB with data about humans and seven major model organisms can be built using on the order of 500 million RDF triples. All source code for building KaBOB is available under an open-source license. KaBOB is an integrated knowledge base of biomedical data representationally based in prominent, actively maintained Open Biomedical Ontologies, thus enabling queries of the underlying data in terms of biomedical concepts (e.g., genes and gene products, interactions and processes) rather than features of source-specific data schemas or file formats. KaBOB resolves many of the issues that routinely plague biomedical researchers intending to work with data from multiple data sources and provides a platform for ongoing data integration and development and for formal reasoning over a wealth of integrated biomedical data.

  17. Continuous API-crystal coating via coacervation in a tubular reactor.

    PubMed

    Besenhard, M O; Thurnberger, A; Hohl, R; Faulhammer, E; Rattenberger, J; Khinast, J G

    2014-11-20

    We present a proof-of-concept study of a continuous coating process of single API crystals in a tubular reactor using coacervation as a microencapsulation technique. Continuous API crystal coating can have several advantages, as in a single step (following crystallization) individual crystals can be prepared with a functional coating, either to change the release behavior, to protect the API from gastric juice or to modify the surface energetics of the API (i.e., to tailor the hydrophobic/hydrophilic characteristics, flowability or agglomeration tendency, etc.). The coating process was developed for the microencapsulation of a lipophilic core material (ibuprofen crystals of 20 μm- to 100 μm-size), with either hypromellose phthalate (HPMCP) or Eudragit L100-55. The core material was suspended in an aqueous solution containing one of these enteric polymers, fed into the tubing and mixed continuously with a sodium sulfate solution as an antisolvent to induce coacervation. A subsequent temperature treatment was applied to optimize the microencapsulation of crystals via the polymer-rich coacervate phase. Cross-linking of the coating shell was achieved by mixing the processed material with an acidic solution (pH<3). Flow rates, temperature profiles and polymer-to-antisolvent ratios had to be tightly controlled to avoid excessive aggregation, leading to pipe plugging. This work demonstrates the potential of a tubular reactor design for continuous coating applications and is the basis for future work, combining continuous crystallization and coating. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Radiolysis aspects of the aqueous self-cooled blanket concept and the problem of tritium extraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruggeman, A.; Snykers, M.; DeRegge, P.

    1988-09-01

    In the Aqueous Self-Cooled Blanket (ASCB) concept, an aqueous ⁶Li solution in a metallic structure is used as a fusion reactor shielding-breeding blanket. Radiolysis effects could be very important for the design and the use of an ASCB. Although many aspects of the radiation chemistry of water and dilute aqueous solutions are now reasonably well understood, it is not possible to predict the radiochemical behaviour of the concentrated candidate ASCB solutions quantitatively. However, by means of a worst-case calculation for a possible ASCB for the Next European Torus (NET) it is shown that even with an important rate of water decomposition the ASCB concept is still workable. Gas bubbles and explosive mixtures can be avoided by increasing the pressure in the neutron-irradiated zone and by extracting and/or recombining the radiolytically produced hydrogen and oxygen. This could require an additional inert gas loop, which could also be used as part of the tritium extraction installation.

  19. A New Framework and Prototype Solution for Clinical Decision Support and Research in Genomics and Other Data-intensive Fields of Medicine.

    PubMed

    Evans, James P; Wilhelmsen, Kirk C; Berg, Jonathan; Schmitt, Charles P; Krishnamurthy, Ashok; Fecho, Karamarie; Ahalt, Stanley C

    2016-01-01

    In genomics and other fields, it is now possible to capture and store large amounts of data in electronic medical records (EMRs). However, it is not clear if the routine accumulation of massive amounts of (largely uninterpretable) data will yield any health benefits to patients. Nevertheless, the use of large-scale medical data is likely to grow. To meet emerging challenges and facilitate optimal use of genomic data, our institution initiated a comprehensive planning process that addresses the needs of all stakeholders (e.g., patients, families, healthcare providers, researchers, technical staff, administrators). Our experience with this process and a key genomics research project contributed to the proposed framework. We propose a two-pronged Genomic Clinical Decision Support System (CDSS) that encompasses the concept of the "Clinical Mendeliome" as a patient-centric list of genomic variants that are clinically actionable and introduces the concept of the "Archival Value Criterion" as a decision-making formalism that approximates the cost-effectiveness of capturing, storing, and curating genome-scale sequencing data. We describe a prototype Genomic CDSS that we developed as a first step toward implementation of the framework. The proposed framework and prototype solution are designed to address the perspectives of stakeholders, stimulate effective clinical use of genomic data, drive genomic research, and meet current and future needs. The framework also can be broadly applied to additional fields, including other '-omics' fields. We advocate for the creation of a Task Force on the Clinical Mendeliome, charged with defining Clinical Mendeliomes and drafting clinical guidelines for their use.

  20. Opportunities and challenges in biological lignin valorization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckham, Gregg T.; Johnson, Christopher W.; Karp, Eric M.

    Lignin is a primary component of lignocellulosic biomass that is an underutilized feedstock in the growing biofuels industry. Despite the fact that lignin depolymerization has long been studied, the intrinsic heterogeneity of lignin typically leads to heterogeneous streams of aromatic compounds, which in turn present significant technical challenges when attempting to produce lignin-derived chemicals where purity is often a concern. In Nature, microorganisms often encounter this same problem during biomass turnover wherein powerful oxidative enzymes produce heterogeneous slates of aromatic compounds. Some microbes have evolved metabolic pathways to convert these aromatic species via 'upper pathways' into central intermediates, which can then be funneled through 'lower pathways' into central carbon metabolism in a process we dubbed 'biological funneling'. This funneling approach offers a direct, biological solution to overcome heterogeneity problems in lignin valorization for the modern biorefinery. Coupled to targeted separations and downstream chemical catalysis, this concept offers the ability to produce a wide range of molecules from lignin. This perspective describes research opportunities and challenges ahead for this new field of research, which holds significant promise towards a biorefinery concept wherein polysaccharides and lignin are treated as equally valuable feedstocks. In particular, we discuss tailoring the lignin substrate for microbial utilization, host selection for biological funneling, ligninolytic enzyme-microbe synergy, metabolic engineering, expanding substrate specificity for biological funneling, and process integration, each of which presents key challenges. Ultimately, for biological solutions to lignin valorization to be viable, multiple questions in each of these areas will need to be addressed, making biological lignin valorization a multidisciplinary, co-design problem.

  1. Using the Tower of Hanoi Puzzle to Infuse Your Mathematics Classroom with Computer Science Concepts

    ERIC Educational Resources Information Center

    Marzocchi, Alison S.

    2016-01-01

    This article suggests that logic puzzles, such as the well-known Tower of Hanoi puzzle, can be used to introduce computer science concepts to mathematics students of all ages. Mathematics teachers introduce their students to computer science concepts that are enacted spontaneously and subconsciously throughout the solution to the Tower of Hanoi…
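    A minimal recursive solution of the puzzle, of the kind such an activity can use to surface computer-science ideas (recursion and the exponential 2^n - 1 move count), is sketched below.

```python
# Recursive Tower of Hanoi: print the moves that transfer n disks
# from the source peg to the target peg using one spare peg.

def hanoi(n, source="A", target="C", spare="B"):
    if n == 0:
        return
    hanoi(n - 1, source, spare, target)   # move n-1 disks out of the way
    print(f"move disk {n}: {source} -> {target}")
    hanoi(n - 1, spare, target, source)   # stack them onto the largest disk

hanoi(3)   # prints 2**3 - 1 = 7 moves
```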

  2. Ecosystem services in sustainable groundwater management.

    PubMed

    Tuinstra, Jaap; van Wensem, Joke

    2014-07-01

    The ecosystem services concept seems to get foothold in environmental policy and management in Europe and, for instance, The Netherlands. With respect to groundwater management there is a challenge to incorporate this concept in such a way that it contributes to the sustainability of decisions. Groundwater is of vital importance to societies, which is reflected in the presented overview of groundwater related ecosystem services. Classifications of these services vary depending on the purpose of the listing (valuation, protection, mapping et cetera). Though the scientific basis is developing, the knowledge-availability still can be a critical factor in decision making based upon ecosystem services. The examples in this article illustrate that awareness of the value of groundwater can result in balanced decisions with respect to the use of ecosystem services. The ecosystem services concept contributes to this awareness and enhances the visibility of the groundwater functions in the decision making process. The success of the ecosystem services concept and its contribution to sustainable groundwater management will, however, largely depend on other aspects than the concept itself. Local and actual circumstances, policy ambitions and knowledge availability will play an important role. Solutions can be considered more sustainable when more of the key elements for sustainable groundwater management, as defined in this article, are fully used and the presented guidelines for long term use of ecosystem services are respected. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Dialysis Cannot be Dosed

    PubMed Central

    Meyer, Timothy W.; Sirich, Tammy L.; Hostetter, Thomas H.

    2014-01-01

    Adequate dialysis is difficult to define because we have not identified the toxic solutes that contribute most to uremic illness. Dialysis prescriptions therefore cannot be adjusted to control the levels of these solutes. The current solution to this problem is to define an adequate dose of dialysis on the basis of fraction of urea removed from the body. This has provided a practical guide to treatment as the dialysis population has grown over the past 25 years. Indeed, a lower limit to Kt/Vurea (or the related urea reduction ratio) is now established as a quality indicator by the Centers for Medicare and Medicaid for chronic hemodialysis patients in the United States. For the present, this urea-based standard provides a useful tool to avoid grossly inadequate dialysis. Dialysis dosing, however, based on measurement of a single, relatively nontoxic solute can provide only a very limited guide toward improved treatment. Prescriptions which have similar effects on the index solute can have widely different effects on other solutes. The dose concept discourages attempts to increase the removal of such solutes independent of the index solute. The dose concept further assumes that important solutes are produced at a constant rate relative to body size, and discourages attempts to augment dialysis treatment by reducing solute production. Identification of toxic solutes would provide a more rational basis for the prescription of dialysis and ultimately for improved treatment of patients with renal failure. PMID:21929590
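    For reference, the two urea-based indices mentioned above are related as follows; the logarithmic form is the simplified single-pool estimate that ignores urea generation and fluid removal during treatment, which clinical formulas correct for.

```latex
% C_pre and C_post are the pre- and post-dialysis blood urea concentrations;
% K is dialyzer urea clearance, t treatment time, V the urea distribution volume.
\[
  \mathrm{URR} \;=\; 1 - \frac{C_{\mathrm{post}}}{C_{\mathrm{pre}}},
  \qquad
  \frac{Kt}{V_{\mathrm{urea}}} \;\approx\; -\ln\!\left(\frac{C_{\mathrm{post}}}{C_{\mathrm{pre}}}\right).
\]
```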

  4. Improvement of Automated POST Case Success Rate Using Support Vector Machines

    NASA Technical Reports Server (NTRS)

    Zwack, Mathew R.; Dees, Patrick D.

    2017-01-01

    During early conceptual design of complex systems, concept down selection can have a large impact upon program life-cycle cost. Therefore, any concepts selected during early design will inherently commit program costs and affect the overall probability of program success. For this reason it is important to consider as large a design space as possible in order to better inform the down selection process. For conceptual design of launch vehicles, trajectory analysis and optimization often presents the largest obstacle to evaluating large trade spaces. This is due to the sensitivity of the trajectory discipline to changes in all other aspects of the vehicle design. Small deltas in the performance of other subsystems can result in relatively large fluctuations in the ascent trajectory because the solution space is non-linear and multi-modal. In order to help capture large design spaces for new launch vehicles, the authors have performed previous work seeking to automate the execution of the industry standard tool, Program to Optimize Simulated Trajectories (POST). This work initially focused on implementation of analyst heuristics to enable closure of cases in an automated fashion, with the goal of applying the concepts of design of experiments (DOE) and surrogate modeling to enable near-instantaneous throughput of vehicle cases [3]. As noted in [4], work was then completed to improve the DOE process by utilizing a graph theory based approach to connect similar design points.
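    A hedged sketch of how a support vector machine (named in the title) might be used in this setting is shown below: a classifier trained to predict whether an automated POST case will close, so unpromising DOE points can be filtered before expensive runs. The feature names and data are synthetic assumptions, not values from the authors' study.

```python
# Synthetic example: classify "case closes" vs. "case fails" from a few
# hypothetical vehicle features using an RBF-kernel SVM.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
# Hypothetical features: thrust-to-weight, propellant mass fraction, Isp (s).
X = rng.uniform([1.1, 0.80, 280], [1.6, 0.92, 460], size=(400, 3))
# Synthetic "case closed" label from a made-up rule.
y = ((X[:, 0] > 1.25) & (X[:, 1] > 0.84)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```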

  5. Review and analysis of dense linear system solver package for distributed memory machines

    NASA Technical Reports Server (NTRS)

    Narang, H. N.

    1993-01-01

    A dense linear system solver package recently developed at the University of Texas at Austin for distributed memory machines (e.g., the Intel Paragon) has been reviewed and analyzed. The package contains about 45 software routines, some written in FORTRAN and some in C, and forms the basis for parallel/distributed solutions of the systems of linear equations encountered in many scientific and engineering problems. The package, being studied by the Computer Applications Branch of the Analysis and Computation Division, may provide a significant computational resource for NASA scientists and engineers in parallel/distributed computing. Since the package is new and not well tested or documented, many of its underlying concepts and implementations were unclear; our task was to review, analyze, and critique the package as a step in the process that will enable scientists and engineers to apply it to the solution of their problems. All routines in the package were reviewed and analyzed. Underlying theory or concepts, which exist in the form of published papers, technical reports, or memos, were obtained either from the author or from the scientific literature, and general algorithms, explanations, examples, and critiques have been provided to explain the workings of these programs. Wherever things remained unclear, the developer (author) was contacted, either by telephone or by electronic mail, to clarify the workings of the routines. Whenever possible, tests were made to verify the concepts and logic employed in the implementations. A detailed report is being prepared separately to explain the workings of these routines.

  6. Connecting long distance: semantic distance in analogical reasoning modulates frontopolar cortex activity.

    PubMed

    Green, Adam E; Kraemer, David J M; Fugelsang, Jonathan A; Gray, Jeremy R; Dunbar, Kevin N

    2010-01-01

    Solving problems often requires seeing new connections between concepts or events that seemed unrelated at first. Innovative solutions of this kind depend on analogical reasoning, a relational reasoning process that involves mapping similarities between concepts. Brain-based evidence has implicated the frontal pole of the brain as important for analogical mapping. Separately, cognitive research has identified semantic distance as a key characteristic of the kind of analogical mapping that can support innovation (i.e., identifying similarities across greater semantic distance reveals connections that support more innovative solutions and models). However, the neural substrates of semantically distant analogical mapping are not well understood. Here, we used functional magnetic resonance imaging (fMRI) to measure brain activity during an analogical reasoning task, in which we parametrically varied the semantic distance between the items in the analogies. Semantic distance was derived quantitatively from latent semantic analysis. Across 23 participants, activity in an a priori region of interest (ROI) in left frontopolar cortex covaried parametrically with increasing semantic distance, even after removing effects of task difficulty. This ROI was centered on a functional peak that we previously associated with analogical mapping. To our knowledge, these data represent a first empirical characterization of how the brain mediates semantically distant analogical mapping.
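
    The paper's corpus and exact latent-semantic-analysis pipeline are not given in the abstract, so the following is only a generic sketch of how an LSA-style semantic distance between two words can be computed (the mini-corpus and component count are invented for illustration):

```python
# Generic LSA-style semantic distance: build a term-document matrix,
# reduce it with truncated SVD, and take 1 - cosine similarity of term vectors.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "the nurse works in the hospital with the doctor",
    "the teacher works in the school with the student",
    "the pilot flies the airplane from the airport",
    "the doctor treats the patient in the clinic",
]  # hypothetical mini-corpus; a real analysis would use a large corpus

vec = CountVectorizer()
X = vec.fit_transform(docs)                 # documents x terms
svd = TruncatedSVD(n_components=2, random_state=0)
term_vecs = svd.fit_transform(X.T)          # terms x latent dimensions

vocab = vec.vocabulary_

def semantic_distance(w1, w2):
    a, b = term_vecs[vocab[w1]], term_vecs[vocab[w2]]
    cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return 1.0 - cos

print(semantic_distance("doctor", "nurse"), semantic_distance("doctor", "airplane"))
```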

  7. Using modified fruit fly optimisation algorithm to perform the function test and case studies

    NASA Astrophysics Data System (ADS)

    Pan, Wen-Tsao

    2013-06-01

    Evolutionary computation is a computing paradigm established by simulating natural evolutionary processes based on Darwinian theory, and it is a common research method. The main contribution of this paper was to strengthen the ability of the fruit fly optimization algorithm (FOA) to search for the optimal solution, in order to avoid becoming trapped in local extrema. Evolutionary computation has grown to include concepts of animal foraging behaviour and group behaviour. This study discussed three common evolutionary computation methods and compared them with the modified fruit fly optimization algorithm (MFOA). It further investigated their ability to compute the extreme values of three mathematical functions, their execution speed, and the forecast ability of forecasting models built using the optimised general regression neural network (GRNN) parameters. The findings indicated that there was no obvious difference between particle swarm optimization and the MFOA with regard to the ability to compute extreme values; however, both were better than the artificial fish swarm algorithm and the FOA. In addition, the MFOA performed better than particle swarm optimization with regard to execution speed, and the forecast ability of the model built using the MFOA's GRNN parameters was better than that of the other three forecasting models.
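
    The abstract does not spell out the MFOA itself; as a rough illustration of the baseline idea that FOA-type methods build on (candidates scattered around a swarm location, with the swarm re-centred on the best candidate each generation), a simplified sketch that omits the original smell-concentration encoding is shown below:

```python
# Simplified fruit-fly-style random search sketch (not the paper's MFOA):
# candidates are scattered around the current swarm location, scored,
# and the swarm moves to the best candidate found so far.
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))   # simple test function to minimize

rng = np.random.default_rng(0)
dim, n_flies, n_iter, step = 2, 30, 100, 1.0
swarm = rng.uniform(-10, 10, size=dim)
best_x, best_f = swarm.copy(), sphere(swarm)

for _ in range(n_iter):
    candidates = swarm + rng.uniform(-step, step, size=(n_flies, dim))
    values = np.array([sphere(c) for c in candidates])
    i = int(np.argmin(values))
    if values[i] < best_f:
        best_f, best_x = values[i], candidates[i].copy()
        swarm = best_x.copy()      # re-centre the swarm on the best location

print("best value:", best_f, "at", best_x)
```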

  8. High speed all optical networks

    NASA Technical Reports Server (NTRS)

    Chlamtac, Imrich; Ganz, Aura

    1990-01-01

    An inherent problem of conventional point-to-point wide area network (WAN) architectures is that they cannot translate optical transmission bandwidth into comparable user-available throughput due to the limiting electronic processing speed of the switching nodes. The first solution to wavelength division multiplexing (WDM) based WAN networks that overcomes this limitation is presented. The proposed Lightnet architecture takes into account the idiosyncrasies of WDM switching/transmission leading to an efficient and pragmatic solution. The Lightnet architecture trades the ample WDM bandwidth for a reduction in the number of processing stages and a simplification of each switching stage, leading to drastically increased effective network throughputs. The principle of the Lightnet architecture is the construction and use of virtual topology networks, embedded in the original network in the wavelength domain. For this construction Lightnets utilize the new concept of lightpaths which constitute the links of the virtual topology. Lightpaths are all-optical, multihop paths in the network that allow data to be switched through intermediate nodes using high throughput passive optical switches. The use of the virtual topologies and the associated switching design introduce a number of new ideas, which are discussed in detail.

  9. Excitonic Materials for Hybrid Solar Cells and Energy Efficient Lighting

    NASA Astrophysics Data System (ADS)

    Kabra, Dinesh; Lu, Li Ping; Vaynzof, Yana; Song, Myounghoon; Snaith, Henry J.; Friend, Richard H.

    2011-07-01

    Conventional photovoltaic technology will certainly contribute this century, but to generate a significant fraction of our global power from solar energy, a radically new disruptive technology is required. Research must therefore focus on developing the physics and technologies behind low-cost photovoltaic concepts. Carbon-based, solution-processible organic semiconductors, with power conversion efficiencies as high as ~8.2%, have emerged over the last decade as promising alternatives to expensive silicon-based technologies. We aim at exploring the morphological and optoelectronic properties of blends of newly synthesized polymer semiconductors as a route to enhance the performance of organic semiconductor based optoelectronic devices, such as photovoltaic diodes (PV) and light-emitting diodes (LED). OLED efficiency has reached up to 150 lm/W, making OLEDs a candidate for the next generation of inexpensive, eco-friendly solid-state lighting. Hybrid electronics represent a valuable alternative for the production of easily processible, flexible and reliable optoelectronic thin film devices. I will present recent advances in my work in the area of hybrid photovoltaics and PLEDs, and the research path towards the realization of electrically injectable organic laser diodes.

  10. Streaming Support for Data Intensive Cloud-Based Sequence Analysis

    PubMed Central

    Issa, Shadi A.; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J.; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation. PMID:23710461
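
    The elastream package itself is not shown in the abstract; as a generic sketch of the core idea (process each chunk of reads as it arrives rather than waiting for the whole transfer), a chunked pipeline might look like the following, where simulate_transfer and count_gc are hypothetical stand-ins rather than elastream APIs:

```python
# Generic sketch of stream-while-transferring: process each chunk of reads
# as soon as it arrives instead of waiting for the complete NGS file.
import time
from typing import Iterator, List

def simulate_transfer(n_chunks: int, reads_per_chunk: int) -> Iterator[List[str]]:
    """Yield chunks of reads as they 'arrive' from the client site."""
    for _ in range(n_chunks):
        time.sleep(0.01)  # stand-in for network transfer latency
        yield ["ACGTACGTGC" * 10 for _ in range(reads_per_chunk)]

def count_gc(read: str) -> int:
    return read.count("G") + read.count("C")

total_reads, total_gc = 0, 0
for chunk in simulate_transfer(n_chunks=5, reads_per_chunk=1000):
    # each read is processed independently, so work overlaps with the transfer
    total_reads += len(chunk)
    total_gc += sum(count_gc(r) for r in chunk)

print(f"processed {total_reads} reads; total GC bases: {total_gc}")
```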

  11. Practical Solutions for Pesticide Safety: A Farm and Research Team Participatory Model

    PubMed Central

    Galvin, Kit; Krenz, Jen; Harrington, Marcy; Palmández, Pablo; Fenske, Richard A.

    2018-01-01

    Development of the Practical Solutions for Pesticide Safety guide used participatory research strategies to identify and evaluate solutions that reduce pesticide exposures for workers and their families and to disseminate these solutions. Project principles were (1) workplace chemicals belong in the workplace, and (2) pesticide handlers and farm managers are experts, with direct knowledge of production practices. The project’s participatory methods were grounded in self-determination theory. Practical solutions were identified and evaluated based on five criteria: practicality, adaptability, health and safety, novelty, and regulatory compliance. Research activities that had more personal contact provided better outcomes. The Expert Working Group, composed of farm managers and pesticide handlers, was key to the identification of solutions, as were farm site visits. Audience participation, hands-on testing, and orchard field trials were particularly effective in the evaluation of potential solutions. Small work groups in a Regional Advisory Committee provided the best direction and guidance for a “user-friendly” translational document that provided evidence-based practical solutions. The “farmer to farmer” format of the guide was endorsed by both the Expert Working Group and the Regional Advisory Committee. Managers and pesticide handlers wanted to share their solutions in order to “help others stay safe,” and they appreciated attribution in the guide. The guide is now being used in educational programs across the region. The fundamental concept that farmers and farmworkers are innovators and experts in agricultural production was affirmed by this study. The success of this process demonstrates the value of participatory industrial hygiene in agriculture. PMID:26488540

  12. Practical Solutions for Pesticide Safety: A Farm and Research Team Participatory Model.

    PubMed

    Galvin, Kit; Krenz, Jen; Harrington, Marcy; Palmández, Pablo; Fenske, Richard A

    2016-01-01

    Development of the Practical Solutions for Pesticide Safety guide used participatory research strategies to identify and evaluate solutions that reduce pesticide exposures for workers and their families and to disseminate these solutions. Project principles were (1) workplace chemicals belong in the workplace, and (2) pesticide handlers and farm managers are experts, with direct knowledge of production practices. The project's participatory methods were grounded in self-determination theory. Practical solutions were identified and evaluated based on five criteria: practicality, adaptability, health and safety, novelty, and regulatory compliance. Research activities that had more personal contact provided better outcomes. The Expert Working Group, composed of farm managers and pesticide handlers, was key to the identification of solutions, as were farm site visits. Audience participation, hands-on testing, and orchard field trials were particularly effective in the evaluation of potential solutions. Small work groups in a Regional Advisory Committee provided the best direction and guidance for a "user-friendly" translational document that provided evidence-based practical solutions. The "farmer to farmer" format of the guide was endorsed by both the Expert Working Group and the Regional Advisory Committee. Managers and pesticide handlers wanted to share their solutions in order to "help others stay safe," and they appreciated attribution in the guide. The guide is now being used in educational programs across the region. The fundamental concept that farmers and farmworkers are innovators and experts in agricultural production was affirmed by this study. The success of this process demonstrates the value of participatory industrial hygiene in agriculture.

  13. Telepresence and telerobotics

    NASA Technical Reports Server (NTRS)

    Garin, John; Matteo, Joseph; Jennings, Von Ayre

    1988-01-01

    The capability for a single operator to simultaneously control complex remote multi-degree-of-freedom robotic arms and associated dextrous end effectors is being developed. An optimal solution, within the realm of current technology, can be achieved by recognizing that: (1) machines/computer systems are more effective than humans when the task is routine and specified, and (2) humans process complex data sets and deal with the unpredictable better than machines. These observations lead naturally to a philosophy in which the human's role becomes a higher-level function associated with planning, teaching, initiating, monitoring, and intervening when the machine gets into trouble, while the machine performs the codifiable tasks with deliberate efficiency. This concept forms the basis for the integration of man and telerobotics, i.e., robotics with the operator in the control loop. The concept of integrating the human in the loop and maximizing the feed-forward and feed-back data flow is referred to as telepresence.

  14. A knowledge-based system for prototypical reasoning

    NASA Astrophysics Data System (ADS)

    Lieto, Antonio; Minieri, Andrea; Piana, Alberto; Radicioni, Daniele P.

    2015-04-01

    In this work we present a knowledge-based system equipped with a hybrid, cognitively inspired architecture for the representation of conceptual information. The proposed system aims at extending the classical representational and reasoning capabilities of ontology-based frameworks towards the realm of prototype theory. It is based on a hybrid knowledge base, composed of a classical symbolic component (grounded in a formal ontology) and a typicality-based one (grounded in the conceptual spaces framework). The resulting system attempts to reconcile the heterogeneous approach to concepts in Cognitive Science with the dual process theories of reasoning and rationality. The system has been experimentally assessed in a conceptual categorisation task where common sense linguistic descriptions were given as input, and the corresponding target concepts had to be identified. The results show that the proposed solution substantially extends the representational and reasoning 'conceptual' capabilities of standard ontology-based systems.

  15. The complex system of environmental monitoring (CSEM). An analysis of concept

    NASA Astrophysics Data System (ADS)

    Kazatsev, Yury I.; Mitchenkov, Igor G.

    2018-01-01

    Research on ecological processes in Russia points to a rather difficult and adverse situation. It is quite obvious that we need a reasonable concept for a way out of this situation, one which could at the same time become a strategic programme for the solution of ecological tasks. In this regard, it is necessary not just to develop new scientific and technological mechanisms for overcoming critical situations, but to offer a new research platform able to generate ideas for predicting tendencies of ecological development and for analysing the consequences of their realisation. Our proposal is “the complex system of environmental monitoring” (CSEM) of a territory or region. We use the method of conceptual analysis. We also try to show how the definition of the term influences the content of social practice, namely, the environmental monitoring of the region.

  16. Standardized data collection to build prediction models in oncology: a prototype for rectal cancer.

    PubMed

    Meldolesi, Elisa; van Soest, Johan; Damiani, Andrea; Dekker, Andre; Alitto, Anna Rita; Campitelli, Maura; Dinapoli, Nicola; Gatta, Roberto; Gambacorta, Maria Antonietta; Lanzotti, Vito; Lambin, Philippe; Valentini, Vincenzo

    2016-01-01

    The advances in diagnostic and treatment technology are responsible for a remarkable transformation of the internal medicine concept, with the establishment of a new idea of personalized medicine. Inter- and intra-patient tumor heterogeneity and the complexity of clinical outcomes and/or treatment toxicity justify the effort to develop predictive models for decision support systems. However, the number of evaluated variables coming from multiple disciplines (oncology, computer science, bioinformatics, statistics, genomics, imaging, among others) could be very large, thus making traditional statistical analysis difficult to exploit. Automated data-mining processes and machine learning approaches can be a solution to organize the massive amount of data, trying to unravel important interactions. The purpose of this paper is to describe the strategy to collect and analyze data properly for decision support and introduce the concept of an 'umbrella protocol' within the framework of 'rapid learning healthcare'.

  17. Patient-Centered Precision Health In A Learning Health Care System: Geisinger's Genomic Medicine Experience.

    PubMed

    Williams, Marc S; Buchanan, Adam H; Davis, F Daniel; Faucett, W Andrew; Hallquist, Miranda L G; Leader, Joseph B; Martin, Christa L; McCormick, Cara Z; Meyer, Michelle N; Murray, Michael F; Rahm, Alanna K; Schwartz, Marci L B; Sturm, Amy C; Wagner, Jennifer K; Williams, Janet L; Willard, Huntington F; Ledbetter, David H

    2018-05-01

    Health care delivery is increasingly influenced by the emerging concepts of precision health and the learning health care system. Although not synonymous with precision health, genomics is a key enabler of individualized care. Delivering patient-centered, genomics-informed care based on individual-level data in the current national landscape of health care delivery is a daunting challenge. Problems to overcome include data generation, analysis, storage, and transfer; knowledge management and representation for patients and providers at the point of care; process management; and outcomes definition, collection, and analysis. Development, testing, and implementation of a genomics-informed program requires multidisciplinary collaboration and building the concepts of precision health into a multilevel implementation framework. Using the principles of a learning health care system provides a promising solution. This article describes the implementation of population-based genomic medicine in an integrated learning health care system-a working example of a precision health program.

  18. Design of a cooperative problem-solving system for enroute flight planning: An empirical study of its use by airline dispatchers

    NASA Technical Reports Server (NTRS)

    Smith, Philip J.; Mccoy, C. Elaine; Layton, Charles; Orasanu, Judith; Chappel, Sherry; Palmer, EV; Corker, Kevin

    1993-01-01

    In a previous report, an empirical study of 30 pilots using the Flight Planning Testbed was reported. An identical experiment using the Flight Planning Testbed (FPT), except that 27 airline dispatchers were studied, is described. Five general questions were addressed in this study: (1) under what circumstances do the introduction of computer-generated suggestions (flight plans) influence the planning behavior of dispatchers (either in a beneficial or adverse manner); (2) what is the nature of such influences (i.e., how are the person's cognitive processes changed); (3) how beneficial are the general design concepts underlying FPT (use of a graphical interface, embedding graphics in a spreadsheet, etc.); (4) how effective are the specific implementation decisions made in realizing these general design concepts; and (5) how effectively do dispatchers evaluate situations requiring replanning, and how effectively do they identify appropriate solutions to these situations.

  19. MODFLOW-2000, The U.S. Geological Survey Modular Ground-Water Model - User Guide to Modularization Concepts and the Ground-Water Flow Process

    USGS Publications Warehouse

    Harbaugh, Arlen W.; Banta, Edward R.; Hill, Mary C.; McDonald, Michael G.

    2000-01-01

    MODFLOW is a computer program that numerically solves the three-dimensional ground-water flow equation for a porous medium by using a finite-difference method. Although MODFLOW was designed to be easily enhanced, the design was oriented toward additions to the ground-water flow equation. Frequently there is a need to solve additional equations; for example, transport equations and equations for estimating parameter values that produce the closest match between model-calculated heads and flows and measured values. This report documents a new version of MODFLOW, called MODFLOW-2000, which is designed to accommodate the solution of equations in addition to the ground-water flow equation. This report is a user's manual. It contains an overview of the old and added design concepts, documents one new package, and contains input instructions for using the model to solve the ground-water flow equation.
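
    MODFLOW's packages and input files are not reproduced here; as a generic illustration of the finite-difference idea the program is built on, a one-dimensional steady-state flow problem with fixed heads at both ends reduces to a small linear system:

```python
# Generic 1-D steady-state finite-difference groundwater flow sketch
# (not MODFLOW input): T d2h/dx2 + W = 0 with fixed heads at both ends.
import numpy as np

n, dx = 51, 10.0           # number of nodes, cell size [m]
T = 100.0                  # transmissivity [m^2/d], uniform for simplicity
W = 0.001                  # areal recharge [m/d]
h_left, h_right = 20.0, 15.0

A = np.zeros((n, n))
b = np.zeros(n)
A[0, 0] = A[-1, -1] = 1.0
b[0], b[-1] = h_left, h_right
for i in range(1, n - 1):
    A[i, i - 1] = A[i, i + 1] = T / dx**2
    A[i, i] = -2.0 * T / dx**2
    b[i] = -W                          # source term moved to the right-hand side

h = np.linalg.solve(A, b)
print("head at domain centre:", h[n // 2])
```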

  20. Single-Molecule FRET Spectroscopy and the Polymer Physics of Unfolded and Intrinsically Disordered Proteins.

    PubMed

    Schuler, Benjamin; Soranno, Andrea; Hofmann, Hagen; Nettels, Daniel

    2016-07-05

    The properties of unfolded proteins have long been of interest because of their importance to the protein folding process. Recently, the surprising prevalence of unstructured regions or entirely disordered proteins under physiological conditions has led to the realization that such intrinsically disordered proteins can be functional even in the absence of a folded structure. However, owing to their broad conformational distributions, many of the properties of unstructured proteins are difficult to describe with the established concepts of structural biology. We have thus seen a reemergence of polymer physics as a versatile framework for understanding their structure and dynamics. An important driving force for these developments has been single-molecule spectroscopy, as it allows structural heterogeneity, intramolecular distance distributions, and dynamics to be quantified over a wide range of timescales and solution conditions. Polymer concepts provide an important basis for relating the physical properties of unstructured proteins to folding and function.
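
    For readers unfamiliar with the single-molecule FRET observable behind such distance measurements, the standard transfer-efficiency relation (a textbook formula, not specific to this review) is:

```latex
% FRET efficiency E as a function of donor-acceptor distance r
% and the Foerster radius R_0 of the dye pair
E = \frac{1}{1 + \left( r / R_0 \right)^{6}}
```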

  1. Teaching concepts of clinical measurement variation to medical students.

    PubMed

    Hodder, R A; Longfield, J N; Cruess, D F; Horton, J A

    1982-09-01

    An exercise in clinical epidemiology was developed for medical students to demonstrate the process and limitations of scientific measurement using models that simulate common clinical experiences. All scales of measurement (nominal, ordinal and interval) were used to illustrate concepts of intra- and interobserver variation, systematic error, recording error, and procedural error. In a laboratory, students a) determined blood pressures on six videotaped subjects, b) graded sugar content of unknown solutions from 0 to 4+ using Clinitest tablets, c) measured papules that simulated PPD reactions, d) measured heart and kidney size on X-rays and, e) described a model skin lesion (melanoma). Traditionally, measurement variation is taught in biostatistics or epidemiology courses using previously collected data. Use of these models enables students to produce their own data using measurements commonly employed by the clinician. The exercise provided material for a meaningful discussion of the implications of measurement error in clinical decision-making.

  2. A Primer on Autonomous Aerial Vehicle Design

    PubMed Central

    Coppejans, Hugo H. G.; Myburgh, Herman C.

    2015-01-01

    There is a large amount of research currently being done on autonomous micro-aerial vehicles (MAV), such as quadrotor helicopters or quadcopters. The ability to create a working autonomous MAV depends mainly on integrating a simultaneous localization and mapping (SLAM) solution with the rest of the system. This paper provides an introduction for creating an autonomous MAV for enclosed environments, aimed at students and professionals alike. The standard autonomous system and MAV automation are discussed, while we focus on the core concepts of SLAM systems and trajectory planning algorithms. The advantages and disadvantages of using remote processing are evaluated, and recommendations are made regarding the viability of on-board processing. Recommendations are made regarding best practices to serve as a guideline for aspirant MAV designers. PMID:26633410

  3. A Primer on Autonomous Aerial Vehicle Design.

    PubMed

    Coppejans, Hugo H G; Myburgh, Herman C

    2015-12-02

    There is a large amount of research currently being done on autonomous micro-aerial vehicles (MAV), such as quadrotor helicopters or quadcopters. The ability to create a working autonomous MAV depends mainly on integrating a simultaneous localization and mapping (SLAM) solution with the rest of the system. This paper provides an introduction for creating an autonomous MAV for enclosed environments, aimed at students and professionals alike. The standard autonomous system and MAV automation are discussed, while we focus on the core concepts of SLAM systems and trajectory planning algorithms. The advantages and disadvantages of using remote processing are evaluated, and recommendations are made regarding the viability of on-board processing. Recommendations are made regarding best practices to serve as a guideline for aspirant MAV designers.

  4. Inertial vestibular coding of motion: concepts and evidence

    NASA Technical Reports Server (NTRS)

    Hess, B. J.; Angelaki, D. E.

    1997-01-01

    Central processing of inertial sensory information about head attitude and motion in space is crucial for motor control. Vestibular signals are coded relative to a non-inertial system, the head, that is virtually continuously in motion. Evidence for the transformation of vestibular signals from head-fixed sensory coordinates to gravity-centered coordinates has been provided by studies of the vestibulo-ocular reflex. The underlying central processing depends on otolith afferent information that needs to be resolved in terms of head-translation-related inertial forces and the head-attitude-dependent pull of gravity. Theoretical solutions have been suggested, but experimental evidence is still scarce. It appears, along these lines, that gaze control systems are intimately linked to the motor control of head attitude and posture.

  5. Precorrection concepts for mobile terminals with processing satellites

    NASA Astrophysics Data System (ADS)

    Nakamoto, F. S.; Oreilly, M. P.; Wolfson, C. R.

    It is pointed out that when the spacecraft must process a large number of users simultaneously, it becomes impractical for it to acquire and track each uplink signal. A solution is for the terminals to precorrect their uplink transmissions so that they reach the spacecraft in time and frequency synchronism with the spacecraft receiver. Two dimensions of precorrection, namely time and frequency, are addressed. Precorrection approaches are classified as open loop, pseudo-open loop, or pseudo-closed loop. Performance relationships are established, and the applicability, requirements, advantages, and disadvantages of each class are discussed. It is found that since time and frequency precorrection have opposite sensitivities to the frequency hopping rate, different classes will often be adopted for the two dimensions.

  6. High-temperature microelectromechanical pressure sensors based on a SOI heterostructure for an electronic automatic aircraft engine control system

    NASA Astrophysics Data System (ADS)

    Sokolov, Leonid V.

    2010-08-01

    There is a need to measure distributed pressure at the aircraft engine inlet with high precision, within a wide operating temperature range and in a severe environment, to improve the efficiency of aircraft engine control. The basic solutions and principles for designing high-temperature (up to 523 K) microelectromechanical pressure sensors based on a membrane-type SOI heterostructure with a monolithic integral tensoframe (MEMS-SOIMT) are proposed in accordance with the developed concept, which excludes the use of electric p-n junctions in semiconductor microelectromechanical sensors. The MEMS-SOIMT technology relies on the group processes of microelectronics and micromechanics for high-precision microprofiling of three-dimensional micromechanical structures, which excludes high-temperature silicon doping processes.

  7. Voronoi Tessellation for reducing the processing time of correlation functions

    NASA Astrophysics Data System (ADS)

    Cárdenas-Montes, Miguel; Sevilla-Noarbe, Ignacio

    2018-01-01

    The increase of data volume in Cosmology is motivating the search for new solutions to the difficulties associated with large processing times and the precision of calculations. This is especially true for several relevant statistics of the galaxy distribution of the Large Scale Structure of the Universe, namely the two- and three-point angular correlation functions. For these, the processing time has grown critically with the increase in the size of the data sample. Beyond parallel implementations to overcome the barrier of processing time, space partitioning algorithms are necessary to reduce the computational load. These can delimit the elements involved in the correlation function estimation to those that can potentially contribute to the final result. In this work, Voronoi Tessellation is used to reduce the processing time of the two-point and three-point angular correlation functions. The results of this proof-of-concept show a significant reduction of the processing time when preprocessing the galaxy positions with Voronoi Tessellation.
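
    The Voronoi-based pruning itself is not described in the abstract; as a generic illustration of why spatial partitioning reduces the cost of two-point estimation (only pairs within the largest separation of interest are enumerated), a k-d tree based pair count, using a different partitioning structure than the paper's, might look like this:

```python
# Generic illustration of spatial partitioning for two-point statistics
# (k-d tree here, rather than the paper's Voronoi tessellation):
# only pairs closer than the largest separation of interest are generated.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
points = rng.uniform(0.0, 1.0, size=(10000, 2))   # toy "galaxy" positions
r_max = 0.05                                      # largest separation of interest

tree = cKDTree(points)
pairs = tree.query_pairs(r_max, output_type="ndarray")   # candidate pairs only
separations = np.linalg.norm(points[pairs[:, 0]] - points[pairs[:, 1]], axis=1)
counts, edges = np.histogram(separations, bins=10, range=(0.0, r_max))
print("pairs examined:", len(pairs), "DD histogram:", counts)
```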

  8. Multi-disciplinary interoperability challenges (Ian McHarg Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Annoni, Alessandro

    2013-04-01

    Global sustainability research requires multi-disciplinary efforts to address the key research challenges to increase our understanding of the complex relationships between environment and society. For this reason dependence on ICT systems interoperability is rapidly growing but, although some relevant technological improvements have been observed, operational interoperable solutions are still lacking in practice. Among the causes is the absence of a generally accepted definition of "interoperability" in all its broader aspects. In fact the concept of interoperability is just a concept, and the more popular definitions do not address all the challenges of realizing operational interoperable solutions. The problem becomes even more complex when multi-disciplinary interoperability is required, because in that case solutions for the interoperability of different interoperable solutions should be envisaged. In this lecture the following definition will be used: "interoperability is the ability to exchange information and to use it". The main challenges in addressing multi-disciplinary interoperability will be presented and a set of proposed approaches/solutions briefly introduced.

  9. Who will save the tokamak - Harry Potter, Arnold Schwarzenegger, or Shaquille O'Neil?

    NASA Astrophysics Data System (ADS)

    Freidberg, J.; Mangiarotti, F.; Minervini, J.

    2014-10-01

    The tokamak is the current leading contender for a fusion power reactor. The reason for the preeminence of the tokamak is its high quality plasma physics performance relative to other concepts. Even so, it is well known that the tokamak must still overcome two basic physics challenges before becoming viable as a DEMO and ultimately a reactor: (1) the achievement of non-inductive steady state operation, and (2) the achievement of robust disruption free operation. These are in addition to the PMI problems faced by all concepts. The work presented here demonstrates by means of a simple but highly credible analytic calculation that a ``standard'' tokamak cannot lead to a reactor - it is just not possible to simultaneously satisfy all the plasma physics plus engineering constraints. Three possible solutions, some more well-known than others, to the problem are analyzed. These visual image generating solutions are defined as (1) the Harry Potter solution, (2) the Arnold Schwarzenegger solution, and (3) the Shaquille O'Neil solution. Each solution will be described both qualitatively and quantitatively at the meeting.

  10. Lax Integrability and the Peakon Problem for the Modified Camassa-Holm Equation

    NASA Astrophysics Data System (ADS)

    Chang, Xiangke; Szmigielski, Jacek

    2018-02-01

    Peakons are special weak solutions of a class of nonlinear partial differential equations modelling non-linear phenomena such as the breakdown of regularity and the onset of shocks. We show that the natural concept of weak solutions in the case of the modified Camassa-Holm equation studied in this paper is dictated by the distributional compatibility of its Lax pair and, as a result, it differs from the one proposed and used in the literature based on the concept of weak solutions used for equations of the Burgers type. Subsequently, we give a complete construction of peakon solutions satisfying the modified Camassa-Holm equation in the sense of distributions; our approach is based on solving certain inverse boundary value problem, the solution of which hinges on a combination of classical techniques of analysis involving Stieltjes' continued fractions and multi-point Padé approximations. We propose sufficient conditions needed to ensure the global existence of peakon solutions and analyze the large time asymptotic behaviour whose special features include a formation of pairs of peakons that share asymptotic speeds, as well as Toda-like sorting property.
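
    For orientation, peakon solutions of Camassa-Holm-type equations are usually written as superpositions of peaked exponential profiles; a representative N-peakon ansatz (standard in this literature, with the amplitudes and positions evolving according to the particular equation) is:

```latex
% N-peakon ansatz: amplitudes p_j(t) and positions x_j(t) evolve in time;
% the profile has a corner (peak) at each x_j, hence "peakon"
u(x,t) = \sum_{j=1}^{N} p_j(t)\, e^{-\lvert x - x_j(t) \rvert}
```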

  11. Inertial Fusion Power Plant Concept of Operations and Maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anklam, T.; Knutson, B.; Dunne, A. M.

    2015-01-15

    Parsons and LLNL scientists and engineers performed design and engineering work for power plant pre-conceptual designs based on the anticipated laser fusion demonstrations at the National Ignition Facility (NIF). Work included identifying concepts of operations and maintenance (O&M) and associated requirements relevant to fusion power plant systems analysis. A laser fusion power plant would incorporate a large process and power conversion facility with a laser system and fusion engine serving as the heat source, based in part on some of the systems and technologies advanced at NIF. Process operations would be similar in scope to those used in chemical, oil refinery, and nuclear waste processing facilities, while power conversion operations would be similar to those used in commercial thermal power plants. While some aspects of the tritium fuel cycle can be based on existing technologies, many aspects of a laser fusion power plant presents several important and unique O&M requirements that demand new solutions. For example, onsite recovery of tritium; unique remote material handling systems for use in areas with high radiation, radioactive materials, or high temperatures; a five-year fusion engine target chamber replacement cycle with other annual and multi-year cycles anticipated for major maintenance of other systems, structures, and components (SSC); and unique SSC for fusion target waste recycling streams. This paper describes fusion power plant O&M concepts and requirements, how O&M requirements could be met in design, and how basic organizational and planning issues can be addressed for a safe, reliable, economic, and feasible fusion power plant.

  12. Inertial fusion power plant concept of operations and maintenance

    NASA Astrophysics Data System (ADS)

    Knutson, Brad; Dunne, Mike; Kasper, Jack; Sheehan, Timothy; Lang, Dwight; Anklam, Tom; Roberts, Valerie; Mau, Derek

    2015-02-01

    Parsons and LLNL scientists and engineers performed design and engineering work for power plant pre-conceptual designs based on the anticipated laser fusion demonstrations at the National Ignition Facility (NIF). Work included identifying concepts of operations and maintenance (O&M) and associated requirements relevant to fusion power plant systems analysis. A laser fusion power plant would incorporate a large process and power conversion facility with a laser system and fusion engine serving as the heat source, based in part on some of the systems and technologies advanced at NIF. Process operations would be similar in scope to those used in chemical, oil refinery, and nuclear waste processing facilities, while power conversion operations would be similar to those used in commercial thermal power plants. While some aspects of the tritium fuel cycle can be based on existing technologies, many aspects of a laser fusion power plant presents several important and unique O&M requirements that demand new solutions. For example, onsite recovery of tritium; unique remote material handling systems for use in areas with high radiation, radioactive materials, or high temperatures; a five-year fusion engine target chamber replacement cycle with other annual and multi-year cycles anticipated for major maintenance of other systems, structures, and components (SSC); and unique SSC for fusion target waste recycling streams. This paper describes fusion power plant O&M concepts and requirements, how O&M requirements could be met in design, and how basic organizational and planning issues can be addressed for a safe, reliable, economic, and feasible fusion power plant.

  13. Upwind schemes and bifurcating solutions in real gas computations

    NASA Technical Reports Server (NTRS)

    Suresh, Ambady; Liou, Meng-Sing

    1992-01-01

    The area of high speed flow is seeing a renewed interest due to advanced propulsion concepts such as the National Aerospace Plane (NASP), Space Shuttle, and future civil transport concepts. Upwind schemes to solve such flows have become increasingly popular in the last decade due to their excellent shock capturing properties. In the first part of this paper the authors present the extension of the Osher scheme to equilibrium and non-equilibrium gases. For simplicity, the source terms are treated explicitly. Computations based on the above scheme are presented to demonstrate the feasibility, accuracy and efficiency of the proposed scheme. One of the test problems is a Chapman-Jouguet detonation problem for which numerical solutions have been known to bifurcate into spurious weak detonation solutions on coarse grids. Results indicate that the numerical solution obtained depends both on the upwinding scheme used and the limiter employed to obtain second order accuracy. For example, the Osher scheme gives the correct CJ solution when the super-bee limiter is used, but gives the spurious solution when the Van Leer limiter is used. With the Roe scheme the spurious solution is obtained for all limiters.
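
    Because the reported bifurcation into spurious weak-detonation solutions depends on the limiter, it may help to recall the two limiter functions named above; the textbook definitions, written in terms of the ratio r of consecutive solution gradients, are sketched here:

```python
# Standard flux-limiter functions referred to in the abstract,
# written as functions of the gradient ratio r (textbook definitions).
import numpy as np

def superbee(r):
    return np.maximum.reduce([np.zeros_like(r),
                              np.minimum(2.0 * r, 1.0),
                              np.minimum(r, 2.0)])

def van_leer(r):
    return (r + np.abs(r)) / (1.0 + np.abs(r))

r = np.linspace(-1.0, 3.0, 9)
print("r       :", np.round(r, 2))
print("superbee:", np.round(superbee(r), 2))
print("van Leer:", np.round(van_leer(r), 2))
```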

  14. Use of Solar Energy Hybrid Dryer with Techno-Ergonomic Application to Increase Productivity of Dodol Wokers in Buleleng, Bali

    NASA Astrophysics Data System (ADS)

    Santosa, I. G.; Sutarna, I. N.

    2018-01-01

    Penglatan Village is one of the dodol industrial centres in Buleleng Regency, Bali. Dodol is a Balinese traditional snack that is usually used as an offering ("sesajen") in religious ceremonies. Making dodol involves several stages: making the dough, stirring it, packaging, and drying the dodol. Because the drying process is done with traditional work tools, it causes several ergonomic problems. Based on preliminary research, the complaints reported by dodol workers are pain in the neck, shoulders, back, waist, head, and hands, and worker productivity is therefore decreasing. This problem was addressed by designing a hybrid solar drying tool with a techno-ergonomic application, combining the appropriate technology concept (TTG) with the SHIP concept from ergonomics (systemic, holistic, interdisciplinary, participatory). This research used 20 people as a sample. Their performance was observed while working traditionally and while using the techno-ergonomic hybrid solar dryer. The measurements were conducted in three periods, First Period (PI), Second Period (PII), and Third Period (PIII), interspersed with rest time; a washing-out period (WOP) was used to eliminate residual effects. The data were analyzed with the SPSS program at a significance level of 0.05. It is hoped that this solution will improve worker productivity.

  15. Is it possible to give scientific solutions to Grand Challenges? On the idea of grand challenges for life science research.

    PubMed

    Efstathiou, Sophia

    2016-04-01

    This paper argues that challenges that are grand in scope such as "lifelong health and wellbeing", "climate action", or "food security" cannot be addressed through scientific research only. Indeed scientific research could inhibit addressing such challenges if scientific analysis constrains the multiple possible understandings of these challenges into already available scientific categories and concepts without translating between these and everyday concerns. This argument builds on work in philosophy of science and race to postulate a process through which non-scientific notions become part of science. My aim is to make this process available to scrutiny: what I call founding everyday ideas in science is both culturally and epistemologically conditioned. Founding transforms a common idea into one or more scientifically relevant ones, which can be articulated into descriptively thicker and evaluatively deflated terms and enable operationalisation and measurement. The risk of founding however is that it can invisibilise or exclude from realms of scientific scrutiny interpretations that are deemed irrelevant, uninteresting or nonsensical in the domain in question-but which may remain salient for addressing grand-in-scope challenges. The paper considers concepts of "wellbeing" in development economics versus in gerontology to illustrate this process. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Advanced ensemble modelling of flexible macromolecules using X-ray solution scattering.

    PubMed

    Tria, Giancarlo; Mertens, Haydyn D T; Kachala, Michael; Svergun, Dmitri I

    2015-03-01

    Dynamic ensembles of macromolecules mediate essential processes in biology. Understanding the mechanisms driving the function and molecular interactions of 'unstructured' and flexible molecules requires alternative approaches to those traditionally employed in structural biology. Small-angle X-ray scattering (SAXS) is an established method for structural characterization of biological macromolecules in solution, and is directly applicable to the study of flexible systems such as intrinsically disordered proteins and multi-domain proteins with unstructured regions. The Ensemble Optimization Method (EOM) [Bernadó et al. (2007). J. Am. Chem. Soc. 129, 5656-5664] was the first approach introducing the concept of ensemble fitting of the SAXS data from flexible systems. In this approach, a large pool of macromolecules covering the available conformational space is generated and a sub-ensemble of conformers coexisting in solution is selected guided by the fit to the experimental SAXS data. This paper presents a series of new developments and advancements to the method, including significantly enhanced functionality and also quantitative metrics for the characterization of the results. Building on the original concept of ensemble optimization, the algorithms for pool generation have been redesigned to allow for the construction of partially or completely symmetric oligomeric models, and the selection procedure was improved to refine the size of the ensemble. Quantitative measures of the flexibility of the system studied, based on the characteristic integral parameters of the selected ensemble, are introduced. These improvements are implemented in the new EOM version 2.0, and the capabilities as well as inherent limitations of the ensemble approach in SAXS, and of EOM 2.0 in particular, are discussed.

  17. Quasi-integrability in the modified defocusing non-linear Schrödinger model and dark solitons

    NASA Astrophysics Data System (ADS)

    Blas, H.; Zambrano, M.

    2016-03-01

    The concept of quasi-integrability has been examined in the context of deformations of the defocusing non-linear Schrödinger model (NLS). Our results show that the quasi-integrability concept, recently discussed in the context of deformations of the sine-Gordon, Bullough-Dodd and focusing NLS models, holds for the modified defocusing NLS model with dark soliton solutions, and it exhibits the new feature of an infinite sequence of alternating conserved and asymptotically conserved charges. For the special case of two dark soliton solutions, where the field components are eigenstates of a space-reflection symmetry, the first four charges and the sequence of even-order charges are exactly conserved in the scattering process of the solitons. Such results are obtained through analytical and numerical methods, and employ adaptations of algebraic techniques used in integrable field theories. We perform extensive numerical simulations and consider the scattering of dark solitons for the cubic-quintic NLS model with potential V = η I^2 - (ε/6) I^3 and for a saturable-type potential (not reproduced in this abstract), with a deformation parameter ε and I = |ψ|^2. The issue of the renormalization of the charges and anomalies, and their (quasi-)conservation laws, is properly addressed. The saturable NLS supports elastic scattering of two-soliton solutions for a wide range of values of {η, ε, q}. Our results may find potential applications in several areas of non-linear science, such as Bose-Einstein condensation.

  18. An open, object-based modeling approach for simulating subsurface heterogeneity

    NASA Astrophysics Data System (ADS)

    Bennett, J.; Ross, M.; Haslauer, C. P.; Cirpka, O. A.

    2017-12-01

    Characterization of subsurface heterogeneity with respect to hydraulic and geochemical properties is critical in hydrogeology as their spatial distribution controls groundwater flow and solute transport. Many approaches of characterizing subsurface heterogeneity do not account for well-established geological concepts about the deposition of the aquifer materials; those that do (i.e. process-based methods) often require forcing parameters that are difficult to derive from site observations. We have developed a new method for simulating subsurface heterogeneity that honors concepts of sequence stratigraphy, resolves fine-scale heterogeneity and anisotropy of distributed parameters, and resembles observed sedimentary deposits. The method implements a multi-scale hierarchical facies modeling framework based on architectural element analysis, with larger features composed of smaller sub-units. The Hydrogeological Virtual Reality simulator (HYVR) simulates distributed parameter models using an object-based approach. Input parameters are derived from observations of stratigraphic morphology in sequence type-sections. Simulation outputs can be used for generic simulations of groundwater flow and solute transport, and for the generation of three-dimensional training images needed in applications of multiple-point geostatistics. The HYVR algorithm is flexible and easy to customize. The algorithm was written in the open-source programming language Python, and is intended to form a code base for hydrogeological researchers, as well as a platform that can be further developed to suit investigators' individual needs. This presentation will encompass the conceptual background and computational methods of the HYVR algorithm, the derivation of input parameters from site characterization, and the results of groundwater flow and solute transport simulations in different depositional settings.
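
    The HYVR code itself is not shown in the abstract; as a toy illustration of the object-based idea (geometric objects carrying facies codes are dropped into a gridded volume, larger architectural elements before smaller sub-units), a minimal sketch with invented names and dimensions might be:

```python
# Toy object-based facies simulation (illustrative only, not the HYVR API):
# ellipsoidal objects with facies codes are placed into a 3-D grid,
# overwriting the background and any earlier, larger-scale objects.
import numpy as np

rng = np.random.default_rng(42)
nx, ny, nz = 100, 80, 40
facies = np.zeros((nx, ny, nz), dtype=np.int8)   # 0 = background matrix
x, y, z = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz), indexing="ij")

def place_ellipsoid(code, a, b, c):
    """Drop one ellipsoidal object with semi-axes (a, b, c) at a random centre."""
    cx, cy, cz = rng.uniform([0, 0, 0], [nx, ny, nz])
    inside = ((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 + ((z - cz) / c) ** 2 <= 1.0
    facies[inside] = code

for _ in range(10):      # larger architectural elements first
    place_ellipsoid(code=1, a=25, b=15, c=5)
for _ in range(40):      # smaller sub-units carved into them
    place_ellipsoid(code=2, a=8, b=5, c=2)

print("facies proportions:", np.bincount(facies.ravel()) / facies.size)
```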

  19. Obesity as a Socially Defined Disease: Philosophical Considerations and Implications for Policy and Care.

    PubMed

    Hofmann, Bjørn

    2016-03-01

    Obesity has generated significant worries amongst health policy makers and has obtained increased attention in health care. Obesity is unanimously defined as a disease in the health care and health policy literature. However, there are pragmatic and not principled reasons for this. This warrants an analysis of obesity according to standard conceptions of disease in the literature of philosophy of medicine. According to theories and definitions of disease referring to (abnormal functioning of) internal processes, obesity is not a disease. Obesity undoubtedly can result in disease, making it a risk factor for disease, but not a disease per se. According to several social conceptions of disease, however, obesity clearly is a disease. Obesity can conflict with aesthetic, moral, or other social norms. Making obesity a "social disease" may very well be a wise health policy, assuring and improving population health, especially if we address the social determinants of obesity, such as the food supply and marketing system. However, applying biomedical solutions to social problems may also have severe side effects. It can result in medicalization and enhance stigmatization and discrimination of persons based on appearance or behavior. Approaching social problems with biomedical means may also serve commercial and professionals' interests more than the health and welfare of individuals, and it may let quick-fix medical solutions halt more sustainable structural solutions. This urges health insurers, health care professionals, and health policy makers to be cautious, especially if we want to help and respect the persons we classify and treat as obese.

  20. Method for PE Pipes Fusion Jointing Based on TRIZ Contradictions Theory

    NASA Astrophysics Data System (ADS)

    Sun, Jianguang; Tan, Runhua; Gao, Jinyong; Wei, Zihui

    The core of the TRIZ theories is contradiction detection and solution. TRIZ provides various methods for resolving contradictions, but these are not systematized. Combining them with the concept of the technique system, this paper summarizes an integrated method for contradiction solution based on TRIZ contradiction theory. A flowchart of the integrated method is given. As a case study, a method for the fusion jointing of PE pipes is analysed.

  1. Elements of orbit-determination theory - Textbook

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.

    1971-01-01

    Text applies to the solution of various optimization problems. Concepts are introduced logically; the refinements and complexities required for computerized numerical solutions are avoided. Specific topics are treated, and the essential equivalence of several different approaches to various aspects of the problem is shown.

  2. Analysis of chemical concepts as the basic of virtual laboratory development and process science skills in solubility and solubility product subject

    NASA Astrophysics Data System (ADS)

    Syafrina, R.; Rohman, I.; Yuliani, G.

    2018-05-01

    This study aims to analyze the characteristics of the solubility and solubility product concepts that will serve as the basis for the development of a virtual laboratory and of students' science process skills. The characteristics analyzed include concept definitions, concept attributes, and types of concepts. The concepts were analyzed using Herron's method of concept analysis. The results show that there are twelve prerequisite chemical concepts to be mastered before studying solubility and the solubility product, and five core concepts that students must understand within this topic. As many as 58.3% of the concept definitions found in high school textbooks support students' science process skills, while the rest are presented as definitions to be memorized. Concept attributes that cover the three levels of chemical representation and can be implemented in a virtual laboratory account for 66.6%. By type of concept, 83.3% are concepts based on principles and 16.6% are concepts that describe processes. The science process skills that can be developed on the basis of this concept analysis are the abilities to observe, calculate, measure, predict, interpret, hypothesize, apply, classify, and infer.
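
    As a reminder of the core relation that the analysed concepts revolve around, the solubility product of a sparingly soluble 1:1 salt relates the equilibrium ion concentrations to the molar solubility s (a textbook example, not taken from the analysed textbooks):

```latex
% Dissolution equilibrium and solubility product for a 1:1 salt such as AgCl
\mathrm{AgCl_{(s)} \rightleftharpoons Ag^{+}_{(aq)} + Cl^{-}_{(aq)}}, \qquad
K_{sp} = [\mathrm{Ag^{+}}][\mathrm{Cl^{-}}] = s^{2}
```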

  3. Case Analysis Of The Joint High Speed Vessel Program: Defense Acquisition

    DTIC Science & Technology

    2016-09-01

    Defense acquisition reviews resulted in a series of Advanced Concept Technology Demonstrations (ACTD) designed to explore the military utility of converted commercial, high-speed, shallow-draft vessels and to translate requirements into a final and unique materiel solution for a fielded system capability.

  4. Common Capabilities for Trust and Security in Service Oriented Infrastructures

    NASA Astrophysics Data System (ADS)

    Brossard, David; Colombo, Maurizio

    In order to achieve agility of the enterprise and shorter concept-to-market timescales for new services, IT and communication providers and their customers increasingly use technologies and concepts which come together under the banner of the Service Oriented Infrastructure (SOI) approach. In this paper we focus on the challenges relating to SOI security. The solutions presented cover the following areas: i) identity federation, ii) distributed usage & access management, and iii) context-aware secure messaging, routing & transformation. We use a scenario from the collaborative engineering space to illustrate the challenges and the solutions.

  5. Symposium: Diffusing Communication into the Secondary School Curriculum: The Need to Begin Diffusing Communication Concepts

    ERIC Educational Resources Information Center

    Harrington, Anne White

    1977-01-01

    Emphasizes the need for the diffusion of communication concepts by citing secondary schools' failure to accept speech communication as a 'fundamental' and the inability of educators to provide innovative communication solutions to educational problems. (MH)

  6. Runoff and Solute Mobilisation in a Semi-arid Headwater Catchment

    NASA Astrophysics Data System (ADS)

    Hughes, J. D.; Khan, S.; Crosbie, R.; Helliwell, S.; Michalk, D.

    2006-12-01

    Runoff and solute transport processes contributing to stream flow were determined in a small headwater catchment in the eastern Murray-Darling Basin of Australia using hydrometric and tracer methods. Stream flow and electrical conductivity were monitored from two gauges draining a portion of the upper catchment area (UCA) and a saline scalded area, respectively. Results show that the bulk of catchment solute export occurs via a small saline scald (< 2% of catchment area) where solutes are concentrated in the near surface zone (0-40 cm). Non-scalded areas of the catchment are likely to provide the bulk of catchment runoff, although the scalded area is a higher contributor on an areal basis. Runoff from the non-scalded area is about two orders of magnitude lower in electrical conductivity than the scalded area. This study shows that the scalded zone and non-scalded parts of the catchment can be managed separately since they are effectively de-coupled except over long time scales, and produce runoff of contrasting quality. Such differences are "averaged out" by investigations that operate at larger scales, illustrating that observations need to be conducted at a range of scales. EMMA modelling using six solutes shows that "event" or "new" water dominated the stream hydrograph from the scald. This information, together with hydrometric data and soil physical properties, indicates that saturated overland flow is the main form of runoff generation in both the scalded area and the UCA. Saturated areas make up a small proportion of the catchment, but are responsible for production of all runoff in the conditions experienced throughout the experimental period. The process of saturation and runoff bears some similarities to the VSA concept (Hewlett and Hibbert 1967).

  7. From Data to Improved Decisions: Operations Research in Healthcare Delivery.

    PubMed

    Capan, Muge; Khojandi, Anahita; Denton, Brian T; Williams, Kimberly D; Ayer, Turgay; Chhatwal, Jagpreet; Kurt, Murat; Lobo, Jennifer Mason; Roberts, Mark S; Zaric, Greg; Zhang, Shengfan; Schwartz, J Sanford

    2017-11-01

    The Operations Research Interest Group (ORIG) within the Society of Medical Decision Making (SMDM) is a multidisciplinary interest group of professionals that specializes in taking an analytical approach to medical decision making and healthcare delivery. ORIG is interested in leveraging mathematical methods associated with the field of Operations Research (OR) to obtain data-driven solutions to complex healthcare problems and encourage collaborations across disciplines. This paper introduces OR for the non-expert and draws attention to opportunities where OR can be utilized to facilitate solutions to healthcare problems. Decision making is the process of choosing between possible solutions to a problem with respect to certain metrics. OR concepts can help systematically improve decision making through efficient modeling techniques while accounting for relevant constraints. Depending on the problem, methods that are part of OR (e.g., linear programming, Markov Decision Processes) or methods that are derived from related fields (e.g., regression from statistics) can be incorporated into the solution approach. This paper highlights the characteristics of different OR methods that have been applied to healthcare decision making and provides examples of emerging research opportunities. We illustrate OR applications in healthcare using previous studies, including diagnosis and treatment of diseases, organ transplants, and patient flow decisions. Further, we provide a selection of emerging areas for utilizing OR. There is a timely need to inform practitioners and policy makers of the benefits of using OR techniques in solving healthcare problems. OR methods can support the development of sustainable long-term solutions across disease management, service delivery, and health policies by optimizing the performance of system elements and analyzing their interaction while considering relevant constraints.
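
    As a concrete illustration of one OR method named above (linear programming), a toy staffing-allocation problem can be set up with SciPy as follows; the variables, coefficients, and limits are invented for illustration only:

```python
# Toy linear program (illustrative numbers only): choose nurse and physician
# hours to maximize patients seen, subject to budget and room-capacity limits.
from scipy.optimize import linprog

# decision variables: x = [nurse_hours, physician_hours]
c = [-2.0, -5.0]              # patients seen per hour (negated: linprog minimizes)
A_ub = [[40.0, 120.0],        # staffing cost per hour must stay within the budget
        [1.0, 1.0]]           # total staffed hours limited by room capacity
b_ub = [8000.0, 120.0]
bounds = [(0, None), (0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("optimal hours (nurse, physician):", res.x, "patients seen:", -res.fun)
```

    Markov decision processes, queueing models, or simulation would be formulated differently, but the workflow of encoding the decision variables, objective, and constraints is analogous.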

  8. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    PubMed

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time involved have caused some resistance to QbD implementation. To show a possible solution, this work proposed a rapid process development method that used direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic process of Ginkgo biloba L. as an example. The breakthrough curves were rapidly determined by DART-MS at-line. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and by HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly: the adsorption capacity decreased as the flow rate increased. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.
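
    The reported agreement between the at-line DART-MS readings and the reference HPLC assay is a Pearson correlation; a sketch of how such a check might be run on paired concentration measurements (the arrays below are placeholders, not the study's data):

    ```python
    # Correlate at-line (e.g., DART-MS) readings with reference (e.g., HPLC) values.
    # The numbers are placeholders for paired concentration measurements.
    import numpy as np

    dart = np.array([0.12, 0.35, 0.48, 0.77, 0.95])   # at-line responses
    hplc = np.array([0.10, 0.33, 0.50, 0.80, 0.99])   # reference concentrations

    r = np.corrcoef(dart, hplc)[0, 1]                 # Pearson correlation coefficient
    print(f"r = {r:.4f}")
    ```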

  9. Biomimetics: its practice and theory.

    PubMed

    Vincent, Julian F V; Bogatyreva, Olga A; Bogatyrev, Nikolaj R; Bowyer, Adrian; Pahl, Anja-Karina

    2006-08-22

    Biomimetics, a name coined by Otto Schmitt in the 1950s for the transfer of ideas and analogues from biology to technology, has produced some significant and successful devices and concepts in the past 50 years, but is still empirical. We show that TRIZ, the Russian system of problem solving, can be adapted to illuminate and manipulate this process of transfer. Analysis using TRIZ shows that there is only 12% similarity between biology and technology in the principles which solutions to problems illustrate, and while technology solves problems largely by manipulating usage of energy, biology uses information and structure, two factors largely ignored by technology.

  10. Improving the Reverse Logistics Respecting Principles of Sustainable Development in an Industrial Company

    NASA Astrophysics Data System (ADS)

    Fidlerová, Helena; Mĺkva, Miroslava

    2016-06-01

    Reverse logistics, the movement of materials back up the supply chain, is recognised by many organisations as an opportunity for adding value. The paper considers the theoretical framework and the conception of reverse logistics in literature and practice. The objective of the article is to propose tangible solutions which eliminate the imbalances in reverse logistics and improve the waste management in the company. The case study focuses on the improvement in the process of waste packaging in the context of sustainable development as a part of reverse logistics in the surveyed industrial company in Slovakia.

  11. Exploring cogging free magnetic gears

    NASA Astrophysics Data System (ADS)

    Borgers, Stefan; Völkel, Simeon; Schöpf, Wolfgang; Rehberg, Ingo

    2018-06-01

    The coupling of two rotating spherical magnets is investigated experimentally, with particular emphasis on those motions in which the driven magnet follows the driving one with a uniform angular speed, which is a feature of so-called cogging-free couplings. The experiment makes use of standard equipment and digital image processing. The theory for these couplings is based on fundamental dipole-dipole interactions with analytically accessible solutions. Technical applications of this kind of coupling are foreseeable, particularly for small machines, an advantage which also comes in handy for classroom demonstrations of this feature of the fundamental concept of dipole-dipole coupling.
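
    For orientation, the "fundamental dipole-dipole interaction" underlying such an analysis is the standard point-dipole energy (written here in generic notation; the paper's specific parameterization of the two rotating spheres is not reproduced):

    \[ U = \frac{\mu_0}{4\pi r^3}\Bigl[\mathbf{m}_1\cdot\mathbf{m}_2 - 3\,(\mathbf{m}_1\cdot\hat{\mathbf{r}})(\mathbf{m}_2\cdot\hat{\mathbf{r}})\Bigr] \]

    where m1 and m2 are the dipole moments, r is their separation and r-hat the unit vector joining them; the torque on the driven magnet follows from differentiating U with respect to its rotation angle.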

  12. [Bioethical aspects of health care reform in Chile. II. Discrimination, free election and informed consent].

    PubMed

    Rosselot, Eduardo

    2003-11-01

    Bioethical issues emerge each time health care reform projects are discussed. These affect diverse moral values and principles and have an impact on cultural, social and political areas. Thus, they demand more than just organizational, financial or administrative solutions. This review analyses discrimination, free election of professionals and informed consent. All three concepts are alluded to in the legislative debate raised by the current process of health reform. Having clear ideas about these subjects is crucial to foresee the reactions expected to arise among physicians and the general public when confronting the proposed changes.

  13. Airline business continuity and IT disaster recovery sites.

    PubMed

    Haji, Jassim

    2016-01-01

    Business continuity is defined as the capability of the organisation to continue delivery of products or services at acceptable predefined levels following a disruptive incident. Business continuity is fast evolving to become a critical and strategic decision for any organisation. Transportation in general, and airlines in particular, is a unique sector with a specialised set of requirements, challenges and opportunities. Business continuity in the airline sector is a concept that is generally overlooked by the airline managements. This paper reviews different risks related to airline processes and will also propose solutions to these risks based on experiences and good industry practices.

  14. Domestic applications for aerospace waste and water management technologies

    NASA Technical Reports Server (NTRS)

    Disanto, F.; Murray, R. W.

    1972-01-01

    Some of the aerospace developments in solid waste disposal and water purification that are applicable to specific domestic problems are explored. Also provided is an overview of the management techniques used in defining the need, in utilizing the available tools, and in synthesizing a solution. Specifically, several water recovery processes will be compared for domestic applicability. Examples are filtration, distillation, catalytic oxidation, reverse osmosis, and electrodialysis. Solid disposal methods will be discussed, including chemical treatment, drying, incineration, and wet oxidation. The latest developments in reducing household water requirements and some concepts for reusing water will be outlined.

  15. Computation of the shock-wave boundary layer interaction with flow separation

    NASA Technical Reports Server (NTRS)

    Ardonceau, P.; Alziary, T.; Aymer, D.

    1980-01-01

    The boundary layer concept is used to describe the flow near the wall. The external flow is approximated by a pressure-displacement relationship (tangent wedge in linearized supersonic flow). The boundary layer equations are solved in finite difference form, and the question of the existence and uniqueness of the solution is considered for the direct problem (assumed pressure) and the inverse problem (assumed displacement thickness, friction ratio). The coupling algorithm presented implicitly processes the downstream boundary condition necessary to correctly define the interacting boundary layer problem. The algorithm uses a Newton linearization technique to provide fast convergence.
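
    The Newton linearization used for this kind of viscous-inviscid coupling can be pictured with a generic Newton iteration on a coupling residual; the sketch below is schematic (the residual function is a stand-in, not the paper's boundary-layer/tangent-wedge system):

    ```python
    # Generic Newton iteration on a scalar coupling residual R(p) = 0,
    # standing in for the pressure/displacement-thickness coupling condition.
    def newton(residual, p0, tol=1e-10, max_iter=50, eps=1e-6):
        p = p0
        for _ in range(max_iter):
            r = residual(p)
            if abs(r) < tol:
                break
            # finite-difference Jacobian (a real solver would linearize the equations)
            drdp = (residual(p + eps) - r) / eps
            p -= r / drdp
        return p

    # toy residual with root at p = 2 (placeholder for the coupled problem)
    print(newton(lambda p: p**3 - 8.0, p0=1.0))
    ```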

  16. Breakdown of Burton Prim Slichter approach and lateral solute segregation in radially converging flows

    NASA Astrophysics Data System (ADS)

    Priede, J.; Gerbeth, G.

    2005-11-01

    A theoretical study is presented of the effect of a radially converging melt flow, which is directed away from the solidification front, on the radial solute segregation in simple solidification models. We show that the classical Burton-Prim-Slichter (BPS) solution describing the effect of a diverging flow on the solute incorporation into the solidifying material breaks down for flows converging along the solidification front. The breakdown is caused by a divergence of the integral defining the effective boundary layer thickness, which is the basic concept of the BPS theory. Although such a divergence can formally be avoided by restricting the axial extension of the melt to a layer of finite height, radially uniform solute distributions are possible only for weak melt flows with an axial velocity away from the solidification front comparable to the growth rate. There is a critical melt velocity for each growth rate at which the solution passes through a singularity and becomes physically inconsistent for stronger melt flows. To resolve these inconsistencies we consider a solidification front represented by a disk of finite radius R0 subject to a strong converging melt flow and obtain an analytic solution showing that the radial solute concentration depends on the radius r as ~ln(R0/r), both close to the rim and at large distances from it. The logarithmic increase of concentration is limited in the vicinity of the symmetry axis by diffusion becoming effective at a distance comparable to the characteristic thickness of the solute boundary layer. The converging flow causes a solute pile-up forming a logarithmic concentration peak at the symmetry axis, which might be an undesirable feature for crystal growth processes.
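
    For reference, the classical BPS result whose breakdown is discussed above expresses the effective segregation coefficient through a stagnant-film (boundary-layer) thickness; the standard form is quoted here for orientation rather than taken from the paper:

    \[ k_{\mathrm{eff}} = \frac{k_0}{k_0 + (1 - k_0)\,e^{-V\delta/D}} \]

    where k0 is the equilibrium segregation coefficient, V the growth rate, D the solute diffusivity, and delta the effective boundary-layer thickness; it is the integral defining delta that diverges for a radially converging flow.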

  17. Design of high-rise dwelling houses for Ho Chi Minh City within the framework of the "smart city" concept

    NASA Astrophysics Data System (ADS)

    Loan, Nguyen Hong; Van Tin, Nguyen

    2018-03-01

    There are differences in the concepts of smart cities, which are reflected in many ideas and solutions. Globally, one of the shared goals for achieving smart cities is sustainable development, with the provision of the best living conditions for people being the first priority. Ho Chi Minh City follows this trend, taking planning steps towards the goal of becoming a smart city. It is necessary that the design and construction of high-rise dwelling houses meet the criteria of the "smart city" concept. This paper explores the design of high-rise dwelling houses for Ho Chi Minh City within the framework of the "smart city" concept. Methods used in the paper include data collection, analytical-synthetical methods and modeling. In order to propose design tasks and solutions for high-rise dwelling houses for Ho Chi Minh City under the "smart city" concept in the current period and near future, we present a new approach, which can also be applied in practice in other cities in Vietnam. Moreover, it can also establish information resources, which are useful in connecting and promoting further development for the success of a "smart city" program.

  18. The Apollo Experience Lessons Learned for Constellation Lunar Dust Management

    NASA Astrophysics Data System (ADS)

    Wagner, Sandra

    2006-09-01

    Lunar dust will present significant challenges to NASA's Lunar Exploration Missions. The challenges can be overcome by using best practices in system engineering design. For successful lunar surface missions, all systems that come into contact with lunar dust must consider the effects throughout the entire design process. Interfaces between these systems and other systems also must be considered. Incorporating dust management into Concept of Operations and Requirements development is the best place to begin to mitigate the risks presented by lunar dust. However, that is only the beginning. To be successful, every person who works on NASA's Constellation lunar missions must be mindful of this problem. Success will also require fiscal responsibility. NASA must learn from Apollo the root causes of problems caused by dust, and then find the most cost-effective solutions to address each challenge. This will require a combination of common-sense existing technologies and promising, innovative technical solutions.

  19. Worry, problem elaboration and suppression of imagery: the role of concreteness.

    PubMed

    Stöber, J

    1998-01-01

    Both lay concept and scientific theory claim that worry may be helpful for defining and analyzing problems. Recent studies, however, indicate that worrisome problem elaborations are less concrete than worry-free problem elaborations. This challenges the problem solving view of worry because abstract problem analyses are unlikely to lead to concrete problem solutions. Instead the findings support the avoidance theory of worry which claims that worry suppresses aversive imagery. Following research findings in the dual-coding framework [Paivio, A. (1971). Imagery and verbal processes. New York: Holt, Rhinehart and Winston; Paivio, A. (1986). Mental representations: a dual coding approach. New York: Oxford University Press.], the present article proposes that reduced concreteness may play a central role in the understanding of worry. First, reduced concreteness can explain how worry reduces imagery. Second, it offers an explanation why worrisome problem analyses are unlikely to arrive at solutions. Third, it provides a key for the understanding of worry maintenance.

  20. Theoretical calculations and performance results of a PZT thin film actuator.

    PubMed

    Hoffmann, Marcus; Küppers, Hartmut; Schneller, Theodor; Böttger, Ulrich; Schnakenberg, Uwe; Mokwa, Wilfried; Waser, Rainer

    2003-10-01

    High piezoelectric coupling coefficients of PZT-based material systems can be employed for actuator functions in micro-electro-mechanical systems (MEMS), offering displacements and forces which outperform standard solutions. This paper presents simulation, fabrication, and development results of a stress-compensated, PZT-coated cantilever concept in which a silicon bulk micromachining process is used in combination with a chemical solution deposition (CSD) technique. Based on an analytical approach and a finite element method (FEM) simulation targeting a tip displacement of 10 microm, the actuator was designed with cantilever lengths of 300 microm to 1000 microm. Special attention was given to the Zr/Ti ratio of the PZT thin films to obtain a high piezoelectric coefficient. For first characterizations, X-ray diffraction (XRD), scanning electron microscopy (SEM), hysteresis, current-voltage I(V) and capacitance-voltage C(V) measurements were carried out.
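
    For orientation only, the order of magnitude of such tip displacements is often estimated from Euler-Bernoulli beam bending; the relation below is the generic result for a cantilever under a uniform induced bending moment, not the stress-compensated multilayer model actually used in the paper:

    \[ \delta_{\mathrm{tip}} = \frac{M L^2}{2 E I} \]

    where M is the bending moment induced by the piezoelectric layer, L the cantilever length, E an effective Young's modulus and I the area moment of inertia of the stack; the strong dependence on L is consistent with exploring lengths in the 300-1000 microm range for a 10 microm target displacement.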

  1. The Apollo Experience Lessons Learned for Constellation Lunar Dust Management

    NASA Technical Reports Server (NTRS)

    Wagner, Sandra

    2006-01-01

    Lunar dust will present significant challenges to NASA's Lunar Exploration Missions. The challenges can be overcome by using best practices in system engineering design. For successful lunar surface missions, all systems that come into contact with lunar dust must consider the effects throughout the entire design process. Interfaces between these systems and other systems also must be considered. Incorporating dust management into Concept of Operations and Requirements development is the best place to begin to mitigate the risks presented by lunar dust. However, that is only the beginning. To be successful, every person who works on NASA's Constellation lunar missions must be mindful of this problem. Success will also require fiscal responsibility. NASA must learn from Apollo the root causes of problems caused by dust, and then find the most cost-effective solutions to address each challenge. This will require a combination of common-sense existing technologies and promising, innovative technical solutions.

  2. Improvement of Groundwater Regime Through Innovative Rainwater Harvesting Along Road Sides

    NASA Astrophysics Data System (ADS)

    Jain, S. K.

    The paper deals with a viable and immediate solution to the shortage of drinking water in countries like India and across the Asian and African continents. The paper highlights rainwater harvesting along both sides of roads with the help of suitable, simple structures which are easy to maintain. This may turn out to be a long-term solution for areas which are drought prone or have below-normal rainfall. The example given in the paper for the “Golden Quadrilateral” project of express national highways in India is quite illustrative and is applicable to other countries falling in similar climatic zones. The concept given in the paper would enhance water availability 8-10 times compared to the natural process of rainfall infiltration. It would also improve the quality of ground water and would save considerable energy in lifting the water due to the rise in water levels.

  3. An Example of Concurrent Engineering

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney; Whitten, David; Cloyd, Richard; Coppens, Chris; Rodriguez, Pedro

    1998-01-01

    The Collaborative Engineering Design and Analysis Room (CEDAR) facility allows on-the-spot design review capability for any project during all phases of development. The required disciplines assemble in this facility to work on any problems (analysis, manufacturing, inspection, etc.) associated with a particular design. A small highly focused team of specialists can meet in this room to better expedite the process of developing a solution to an engineering task within the framework of the constraints that are unique to each discipline. This facility provides the engineering tools and translators to develop a concept within the confines of the room or with remote team members that could access the team's data from other locations. The CEDAR area is envisioned as excellent for failure investigation meetings to be conducted where the computer capabilities can be utilized in conjunction with the Smart Board display to develop failure trees, brainstorm failure modes, and evaluate possible solutions.

  4. Solution mechanism guide: implementing innovation within a research & development organization.

    PubMed

    Keeton, Kathryn E; Richard, Elizabeth E; Davis, Jeffrey R

    2014-10-01

    In order to create a culture more open to novel problem-solving mechanisms, NASA's Human Health and Performance Directorate (HH&P) created a strategic knowledge management tool that educates employees about innovative problem-solving techniques, the Solution Mechanism Guide (SMG). The SMG is a web-based, interactive guide that leverages existing and innovative problem-solving methods and presents this information as a unique user experience so that the employee is empowered to make the best decision about which problem-solving tool best meets their needs. By integrating new and innovative methods with existing problem solving tools, the SMG seamlessly introduces open innovation and collaboration concepts within HH&P to more effectively address human health and performance risks. This commentary reviews the path of creating a more open and innovative culture within HH&P and the process and development steps that were taken to develop the SMG.

  5. An improved genetic algorithm and its application in the TSP problem

    NASA Astrophysics Data System (ADS)

    Li, Zheng; Qin, Jinlei

    2011-12-01

    The concept and current research status of the genetic algorithm are introduced in detail in the paper. On this basis, the simple genetic algorithm and an improved algorithm are described and applied to an example of the TSP, where the advantage of the genetic algorithm in solving this NP-hard problem is clearly shown. In addition, based on the partially matched crossover operator, the crossover method is improved into an extended crossover operator in order to increase efficiency when solving the TSP. In the extended crossover method, crossover can be performed between random positions of two random individuals, without being restricted by the position on the chromosome. Finally, the nine-city TSP is solved using the improved genetic algorithm with the extended crossover method; the solution process is considerably more efficient and the optimal solution is found much faster.
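
    As an illustration of permutation-preserving crossover between two random cut positions, the sketch below implements an order-crossover operator in the same spirit as, but not identical to, the extended operator proposed in the paper; the tour encoding and helper names are ours.

    ```python
    import random

    def order_crossover(parent1, parent2):
        """Order crossover (OX) between two city permutations.

        A slice between two random positions is copied from parent1; the
        remaining cities are filled in the order they appear in parent2,
        so the child is always a valid tour.
        """
        n = len(parent1)
        i, j = sorted(random.sample(range(n), 2))
        child = [None] * n
        child[i:j + 1] = parent1[i:j + 1]                 # copy the slice
        fill = [c for c in parent2 if c not in child]     # preserve parent2 order
        for k in range(n):
            if child[k] is None:
                child[k] = fill.pop(0)
        return child

    # nine-city example, mirroring the nine-city TSP mentioned above
    p1 = list(range(9))
    p2 = random.sample(range(9), 9)
    print(order_crossover(p1, p2))
    ```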

  6. High density operation for reactor-relevant power exhaust

    NASA Astrophysics Data System (ADS)

    Wischmeier, M.; ASDEX Upgrade Team; Jet Efda Contributors

    2015-08-01

    With increasing size of a tokamak device and the associated fusion power gain, an increasing power flux density towards the divertor needs to be handled. A solution for handling this power flux is crucial for safe and economic operation. Using purely geometric arguments in an ITER-like divertor, this power flux can be reduced by approximately a factor of 100. Based on a conservative extrapolation of current technology for an integrated engineering approach to remove power deposited on plasma facing components, a further reduction of the power flux density via volumetric processes in the plasma by up to a factor of 50 is required. Our current ability to interpret existing power exhaust scenarios using numerical transport codes is analyzed, and an operational scenario as a potential solution for ITER-like divertors under high density and highly radiating reactor-relevant conditions is presented. Alternative concepts for risk mitigation as well as strategies for moving forward are outlined.

  7. The TRUSPEAK Concept: Combining CMPO and HDEHP for Separating Trivalent Lanthanides from the Transuranic Elements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lumetta, Gregg J.; Gelis, Artem V.; Braley, Jenifer C.

    2013-04-08

    Combining octyl(phenyl)-N,N-diisobutyl-carbamoylmethyl-phosphine oxide (CMPO) and bis-(2-ethylhexyl) phosphoric acid (HDEHP) into a single process solvent for separating transuranic elements from liquid high-level waste is explored. Co-extraction of americium and the lanthanide elements from nitric acid solution is possible with a solvent mixture consisting of 0.1-M CMPO plus 1-M HDEHP in n-dodecane. Switching the aqueous-phase chemistry to a citrate-buffered solution of diethylene triamine pentaacetic acid (DTPA) allows for selective stripping of americium, separating it from the lanthanide elements. Potential strategies for managing molybdenum and zirconium (both of which co-extract with americium and the lanthanides) have been developed. The work presented here demonstrates the feasibility of combining CMPO and HDEHP into a single extraction solvent for recovering americium from high-level waste and its separation from the lanthanides.

  8. A Cross-Age Study of Different Perspectives in Solution Chemistry from Junior to Senior High School

    ERIC Educational Resources Information Center

    Calik, Muammer

    2005-01-01

    This study reports on research examining what students think about aspects of solution chemistry and seeks to establish what alternative conceptions they hold in this area. To achieve this aim, the researchers developed a test comprising open-ended questions that evaluated students' understanding of solution chemistry. The test was administered…

  9. Electrodynamics; Problems and solutions

    NASA Astrophysics Data System (ADS)

    Ilie, Carolina C.; Schrecengost, Zachariah S.

    2018-05-01

    This book of problems and solutions is a natural continuation of Ilie and Schrecengost's first book Electromagnetism: Problems and Solutions. Aimed towards students who would like to work independently on more electrodynamics problems in order to deepen their understanding and problem-solving skills, this book discusses main concepts and techniques related to Maxwell's equations, conservation laws, electromagnetic waves, potentials and fields, and radiation.

  10. Apparent Ionic Charge in Electrolyte and Polyelectrolyte Solutions

    ERIC Educational Resources Information Center

    Magdelenat, H.; And Others

    1978-01-01

    Compares average displacements of charged particles under thermal motion alone with those obtained by the action of an external electric field to develop a concept of "apparent charge" to approximate actual structural charge in an electrolyte solution. (SL)

  11. Geometric Model for a Parametric Study of the Blended-Wing-Body Airplane

    NASA Technical Reports Server (NTRS)

    Mastin, C. Wayne; Smith, Robert E.; Sadrehaghighi, Ideen; Wiese, Micharl R.

    1996-01-01

    A parametric model is presented for the blended-wing-body airplane, one concept being proposed for the next generation of large subsonic transports. The model is defined in terms of a small set of parameters which facilitates analysis and optimization during the conceptual design process. The model is generated from a preliminary CAD geometry. From this geometry, airfoil cross sections are cut at selected locations and fitted with analytic curves. The airfoils are then used as boundaries for surfaces defined as the solution of partial differential equations. Both the airfoil curves and the surfaces are generated with free parameters selected to give a good representation of the original geometry. The original surface is compared with the parametric model, and solutions of the Euler equations for compressible flow are computed for both geometries. The parametric model is a good approximation of the CAD model and the computed solutions are qualitatively similar. An optimal NURBS approximation is constructed and can be used by a CAD model for further refinement or modification of the original geometry.

  12. Big data management challenges in health research-a literature review.

    PubMed

    Wang, Xiaoming; Williams, Carolyn; Liu, Zhen Hua; Croghan, Joe

    2017-08-07

    Big data management for information centralization (i.e. making data of interest findable) and integration (i.e. making related data connectable) in health research is a defining challenge in biomedical informatics. While essential to create a foundation for knowledge discovery, optimized solutions to deliver high-quality and easy-to-use information resources are not thoroughly explored. In this review, we identify the gaps between current data management approaches and the need for new capacity to manage big data generated in advanced health research. Focusing on these unmet needs and well-recognized problems, we introduce state-of-the-art concepts, approaches and technologies for data management from computing academia and industry to explore improvement solutions. We explain the potential and significance of these advances for biomedical informatics. In addition, we discuss specific issues that have a great impact on technical solutions for developing the next generation of digital products (tools and data) to facilitate the raw-data-to-knowledge process in health research. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.

  13. Distributed generation of shared RSA keys in mobile ad hoc networks

    NASA Astrophysics Data System (ADS)

    Liu, Yi-Liang; Huang, Qin; Shen, Ying

    2005-12-01

    Mobile Ad Hoc Networks are a totally new concept in which mobile nodes are able to communicate with each other over wireless links in an independent manner, without fixed physical infrastructure or centralized administrative infrastructure. However, the nature of Ad Hoc Networks makes them very vulnerable to security threats. Generation and distribution of shared keys for a CA (Certification Authority) is challenging for security solutions based on a distributed PKI (Public-Key Infrastructure)/CA. The solutions that have been proposed in the literature and some related issues are discussed in this paper. A solution for the distributed generation of shared threshold RSA keys for the CA is proposed in the present paper. During the process of creating an RSA private key share, every CA node holds only its own private share. Distributed arithmetic is used to create the CA's private shares locally, so that the requirement for a centralized management institution is eliminated. By fully taking into account the Mobile Ad Hoc network's characteristic of self-organization, the scheme avoids the hidden security risk that comes from any single node holding the entire private key share of the CA, with which the security and robustness of the system are enhanced.
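
    The threshold idea behind distributing CA key shares can be sketched with Shamir secret sharing over a prime field; this is a generic (t, n) scheme for illustration only, and the paper's distributed RSA key generation protocol is more involved and is not reproduced here.

    ```python
    import random

    P = 2**127 - 1          # a Mersenne prime used as the field modulus (illustrative)

    def make_shares(secret, t, n):
        """Split `secret` into n shares, any t of which reconstruct it."""
        coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
        def f(x):
            return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
        return [(x, f(x)) for x in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x = 0 recovers the secret."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = (num * -xj) % P
                    den = (den * (xi - xj)) % P
            secret = (secret + yi * num * pow(den, -1, P)) % P
        return secret

    shares = make_shares(secret=123456789, t=3, n=5)
    print(reconstruct(shares[:3]))   # any 3 of the 5 shares suffice
    ```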

  14. Crew interface specification development study for in-flight maintenance and stowage functions

    NASA Technical Reports Server (NTRS)

    Carl, J. G.

    1971-01-01

    The need and potential solutions for an orderly systems engineering approach to the definition, management and documentation requirements for in-flight maintenance, assembly, servicing, and stowage process activities of the flight crews of future spacecraft were investigated. These processes were analyzed and described using a new technique (mass/function flow diagramming), developed during the study, to give visibility to crew functions and supporting requirements, including data products. This technique is usable by NASA for specification baselines and can assist the designer in identifying both upper and lower level requirements associated with these processes. These diagrams provide increased visibility into the relationships between functions and related equipments being utilized and managed and can serve as a common communicating vehicle between the designer, program management, and the operational planner. The information and data product requirements to support the above processes were identified along with optimum formats and contents of these products. The resulting data product concepts are presented to support these in-flight maintenance and stowage processes.

  15. The use of computer-aided learning in chemistry laboratory instruction

    NASA Astrophysics Data System (ADS)

    Allred, Brian Robert Tracy

    This research involves developing and implementing computer software for chemistry laboratory instruction. The specific goal is to design the software and investigate whether it can be used to introduce concepts and laboratory procedures without a lecture format. This would allow students to conduct an experiment even though they may not have been introduced to the chemical concept in their lecture course. This would also allow for another type of interaction for those students who respond more positively to a visual approach to instruction. The first module developed was devoted to using computer software to help introduce students to the concepts related to thin-layer chromatography and setting up and running an experiment. This was achieved through the use of digitized pictures and digitized video clips along with written information. A review quiz was used to help reinforce the learned information. The second module was devoted to the concept of the "dry lab". This module presented students with relevant information regarding the chemical concepts and then showed them the outcome of mixing solutions. By these observations, they were to determine the composition of unknown solutions based on provided descriptions and comparison with their written observations. The third piece of the software designed was a computer game. This program followed the first two modules in providing information the students were to learn. The difference here, though, was incorporating a game scenario for students to use to help reinforce the learning. Students were then assessed to see how much information they retained after playing the game. In each of the three cases, a control group exposed to the traditional lecture format was used. Their results were compared to the experimental group using the computer modules. Based upon the findings, it can be concluded that using technology to aid in the instructional process is definitely of benefit and students were more successful in learning. It is important to note, though, that one single type of instructional method is not the best way to inspire learning. It seems multiple methods provide the best educational experience for all.

  16. Mathematics for generative processes: Living and non-living systems

    NASA Astrophysics Data System (ADS)

    Giannantoni, Corrado

    2006-05-01

    The traditional Differential Calculus often shows its limits when describing living systems. These in fact present such a richness of characteristics that is, in the majority of cases, much wider than the description capabilities of the usual differential equations. Such an aspect became particularly evident during the research (completed in 2001) for an appropriate formulation of Odum's Maximum Em-Power Principle (proposed by the Author as a possible Fourth Thermodynamic Principle). In fact, in such a context, the particular non-conservative Algebra, adopted to account for both Quality and quantity of generative processes, suggested we introduce a faithfully corresponding concept of "derivative" (of both integer and fractional order) to describe dynamic conditions however variable. The new concept not only succeeded in pointing out the corresponding differential bases of all the rules of Emergy Algebra, but also represented the preferential guide in order to recognize the most profound physical nature of the basic processes which mostly characterize self-organizing Systems (co-production, co-injection, inter-action, feed-back, splits, etc.). From a mathematical point of view, the most important novelties introduced by such a new approach are: (i) the derivative of any integer or fractional order can be obtained independently from the evaluation of its lower order derivatives; (ii) the exponential function plays a pivotal role, much more marked than in the case of traditional differential equations; (iii) wide classes of differential equations, traditionally considered as being non-linear, become "intrinsically" linear when reconsidered in terms of "incipient" derivatives; (iv) their corresponding explicit solutions can be given in terms of new classes of functions (such as "binary" and "duet" functions); (v) every solution shows a sort of "persistence of form" when representing the product generated with respect to the agents of the generating process; and (vi), at the same time, an intrinsic "genetic" ordinality which reflects the fact that any product "generated" is something more than the sum of the generating elements. Consequently, all these properties enable us to follow the evolution of the "product" of any generative process from the very beginning, in its "rising", in its "incipient" act of being born. This is why the new "operator" introduced, specifically apt when describing the above-mentioned aspects, was termed the "incipient" (or "spring") derivative. In addition, even if the considered approach was suggested by the analysis of self-organizing living Systems, some specific examples of non-living Systems will also be mentioned. In fact, what is much more surprising is that such an approach is even more valid (than the traditional one) to describe non-living Systems too. In fact, the resulting "drift" between traditional solutions and "incipient" solutions led us to reconsider the phenomenon of Mercury's precessions. The satisfactory agreement with the astronomical data suggested, as a consequential hypothesis, a different interpretation of its physical origin, substantially based on the Maximum Em-Power Principle.

  17. Crepe Paper Colorimetry.

    ERIC Educational Resources Information Center

    Pringle, David L.; And Others

    1995-01-01

    Uses crepe paper for the introduction of spectrophotometric concepts. Dyes used in the manufacturing of the crepe paper dissolve rapidly in water to produce solutions of colors. The variety of colors provides spectra in the visible spectrum that allow students to grasp concepts of absorption and transmission. (AIM)

  18. Using concept mapping in the knowledge-to-action process to compare stakeholder opinions on barriers to use of cancer screening among South Asians.

    PubMed

    Lobb, Rebecca; Pinto, Andrew D; Lofters, Aisha

    2013-03-23

    Using the knowledge-to-action (KTA) process, this study examined barriers to use of evidence-based interventions to improve early detection of cancer among South Asians from the perspective of multiple stakeholders. In 2011, we used concept mapping with South Asian residents, and representatives from health service and community service organizations in the region of Peel Ontario. As part of concept mapping procedures, brainstorming sessions were conducted with stakeholders (n = 53) to identify barriers to cancer screening among South Asians. Participants (n = 46) sorted barriers into groups, and rated barriers from lowest (1) to highest (6) in terms of importance for use of mammograms, Pap tests and fecal occult blood tests, and how feasible it would be to address them. Multi-dimensional scaling, cluster analysis, and descriptive statistics were used to analyze the data. A total of 45 unique barriers to use of mammograms, Pap tests, and fecal occult blood tests among South Asians were classified into seven clusters using concept mapping procedures: patient's beliefs, fears, lack of social support; health system; limited knowledge among residents; limited knowledge among physicians; health education programs; ethno-cultural discordance with the health system; and cost. Overall, the top three ranked clusters of barriers were 'limited knowledge among residents,' 'ethno-cultural discordance,' and 'health education programs' across surveys. Only residents ranked 'cost' second in importance for fecal occult blood testing, and stakeholders from health service organizations ranked 'limited knowledge among physicians' third for the feasibility survey. Stakeholders from health services organizations ranked 'limited knowledge among physicians' fourth for all other surveys, but this cluster consistently ranked lowest among residents. The limited reach of cancer control programs to racial and ethnic minority groups is a critical implementation issue that requires attention. Opinions of community service and health service organizations on why this deficit in implementation occurs are fundamental to understanding the solutions because these are the settings in which evidence-based interventions are implemented. Using concept mapping within a KTA process can facilitate the engagement of multiple stakeholders in the utilization of study results and in identifying next steps for action.

  19. Synergistic approach to high-performance oxide thin film transistors using a bilayer channel architecture.

    PubMed

    Yu, Xinge; Zhou, Nanjia; Smith, Jeremy; Lin, Hui; Stallings, Katie; Yu, Junsheng; Marks, Tobin J; Facchetti, Antonio

    2013-08-28

    We report here a bilayer metal oxide thin film transistor concept (bMO TFT) where the channel has the structure: dielectric/semiconducting indium oxide (In2O3) layer/semiconducting indium gallium oxide (IGO) layer. Both semiconducting layers are grown from solution via a low-temperature combustion process. The TFT mobilities of bottom-gate/top-contact bMO TFTs processed at T = 250 °C are ~5 times larger (~2.6 cm(2)/(V s)) than those of single-layer IGO TFTs (~0.5 cm(2)/(V s)), reaching values comparable to single-layer combustion-processed In2O3 TFTs (~3.2 cm(2)/(V s)). More importantly, and unlike single-layer In2O3 TFTs, the threshold voltage of the bMO TFTs is ~0.0 V, and the current on/off ratio is significantly enhanced to ~1 × 10(8) (vs ~1 × 10(4) for In2O3). The microstructure and morphology of the In2O3/IGO bilayers are analyzed by X-ray diffraction, atomic force microscopy, X-ray photoelectron spectroscopy, and transmission electron microscopy, revealing the polycrystalline nature of the In2O3 layer and the amorphous nature of the IGO layer. This work demonstrates that solution-processed metal oxides can be implemented in bilayer TFT architectures with significantly enhanced performance.

  20. Moisture-triggered physically transient electronics

    PubMed Central

    Gao, Yang; Zhang, Ying; Wang, Xu; Sim, Kyoseung; Liu, Jingshen; Chen, Ji; Feng, Xue; Xu, Hangxun; Yu, Cunjiang

    2017-01-01

    Physically transient electronics, a form of electronics that can physically disappear in a controllable manner, is very promising for emerging applications. Most of the transient processes reported so far only occur in aqueous solutions or biofluids, offering limited control over the triggering and degradation processes. We report novel moisture-triggered physically transient electronics, which exempt the needs of resorption solutions and can completely disappear within well-controlled time frames. The triggered transient process starts with the hydrolysis of the polyanhydride substrate in the presence of trace amounts of moisture in the air, a process that can generate products of corrosive organic acids to digest various inorganic electronic materials and components. Polyanhydride is the only example of polymer that undergoes surface erosion, a distinct feature that enables stable operation of the functional devices over a predefined time frame. Clear advantages of this novel triggered transience mode include that the lifetime of the devices can be precisely controlled by varying the moisture levels and changing the composition of the polymer substrate. The transience time scale can be tuned from days to weeks. Various transient devices, ranging from passive electronics (such as antenna, resistor, and capacitor) to active electronics (such as transistor, diodes, optoelectronics, and memories), and an integrated system as a platform demonstration have been developed to illustrate the concept and verify the feasibility of this design strategy. PMID:28879237

  1. A study of advanced magnesium-based hydride and development of a metal hydride thermal battery system

    NASA Astrophysics Data System (ADS)

    Zhou, Chengshang

    Metal hydrides are a group of important materials known as energy carriers for renewable energy and thermal energy storage. A concept of a thermal battery based on advanced metal hydrides is studied for heating and cooling of cabins in electric vehicles. The system utilizes a pair of thermodynamically matched metal hydrides as energy storage media. The hot hydride that is identified and developed is catalyzed MgH2, owing to its high energy density and enhanced kinetics. TiV0.62Mn1.5, TiMn2, and LaNi5 alloys are selected as the matching cold hydride. A systematic experimental survey is carried out in this study to compare a wide range of additives, including transition metals, transition metal oxides, hydrides, intermetallic compounds, and carbon materials, with respect to their effects on the dehydrogenation properties of MgH2. The results show that additives such as Ti and V-based metals, hydrides, and certain intermetallic compounds have strong catalytic effects. Solid solution alloys of magnesium are exploited as a way to destabilize magnesium hydride thermodynamically. Various elements are alloyed with magnesium to form solid solutions, including indium and aluminum. Thermodynamic properties of the reactions between the magnesium solid solution alloys and hydrogen are investigated, showing that all the solid solution alloys investigated in this work have higher equilibrium hydrogen pressures than that of pure magnesium. Cyclic stability of catalyzed MgH2 is characterized and analyzed using a PCT Sievert-type apparatus. Three systems, including MgH2-TiH2, MgH2-TiMn2, and MgH2-VTiCr, are examined. The hydrogenating and dehydrogenating kinetics at 300°C are stable after 100 cycles. However, the low temperature (25°C to 150°C) hydrogenation kinetics suffer a severe degradation during hydrogen cycling. Further experiments confirm that the low temperature kinetic degradation can be mainly related to the extended hydrogenation-dehydrogenation reactions. Proof-of-concept prototypes are built and tested, demonstrating the potential of the system as HVAC for transportation vehicles. The performance of the concept-demonstration unit shows both high heating/cooling power and high energy densities. An extended cycling test shows degradation in the performance of the system. To solve this problem, a metal hydride hydrogen compressor is proposed for aiding the recharge process of the system.

  2. Mechanical Alloying of W-Mo-V-Cr-Ta High Entropy Alloys

    NASA Astrophysics Data System (ADS)

    Das, Sujit; Robi, P. S.

    2018-04-01

    Recent years have seen the emergence of high-entropy alloys (HEAs) consisting of five or more elements in equi-atomic or near equi-atomic ratios. These alloys in single-phase solid solution exhibit exceptional mechanical properties, viz. high strength at room and elevated temperatures, reasonable ductility and a stable microstructure over a wide range of temperatures, making them suitable for high temperature structural materials. In spite of the attractive properties, processing of these materials remains a challenge. Reports regarding fabrication and characterisation of a few refractory HEA systems are available. The processing of these alloys has been carried out by arc melting of small button-sized materials. The present paper discusses the development of a novel refractory W-Mo-V-Cr-Ta HEA powder based on a new alloy design concept. The powder mixture was milled for time periods up to 64 hours. Single-phase alloy powder having a body-centred cubic structure was processed by mechanical alloying. The milling characteristics and extent of alloying during ball milling were characterized using an X-ray diffractometer (XRD), a field emission scanning electron microscope (FESEM) and a transmission electron microscope (TEM). A single-phase solid solution alloy powder having a body-centred cubic (BCC) structure with a lattice parameter of 3.15486 Å was obtained after milling for 32 hours.

  3. Estimating the rates of mass change, ice volume change and snow volume change in Greenland from ICESat and GRACE data

    NASA Astrophysics Data System (ADS)

    Slobbe, D. C.; Ditmar, P.; Lindenbergh, R. C.

    2009-01-01

    The focus of this paper is on the quantification of ongoing mass and volume changes over the Greenland ice sheet. For that purpose, we used elevation changes derived from the Ice, Cloud, and land Elevation Satellite (ICESat) laser altimetry mission and monthly variations of the Earth's gravity field as observed by the Gravity Recovery and Climate Experiment (GRACE) mission. Based on a stand-alone processing scheme of ICESat data, the most probable estimate of the mass change rate from February 2003 to April 2007 equals -139 +/- 68 Gton yr^-1. Here, we used a density of 600 +/- 300 kg m^-3 to convert the estimated elevation change rate in the region above 2000 m into a mass change rate. For the region below 2000 m, we used a density of 900 +/- 300 kg m^-3. Based on GRACE gravity models from mid-2002 to mid-2007 as processed by CNES, CSR, DEOS and GFZ, the estimated mass change rate for the whole of Greenland ranges between -128 and -218 Gton yr^-1. Most GRACE solutions show much stronger mass losses than those obtained with ICESat, which might be related to a local undersampling of the mass loss by ICESat and uncertainties in the snow/ice densities used. To solve the problem of uncertainties in the snow and ice densities, two independent joint inversion concepts are proposed to profit from both GRACE and ICESat observations simultaneously. The first concept, developed to reduce the uncertainty of the mass change rate, estimates this rate in combination with an effective snow/ice density. However, it turns out that the uncertainties are not reduced, which is probably caused by the unrealistic assumption that the effective density is constant in space and time. The second concept is designed to convert GRACE and ICESat data into two totally new products: variations of ice volume and variations of snow volume separately. Such an approach is expected to lead to new insights into ongoing mass change processes over the Greenland ice sheet. Our results show for different GRACE solutions a snow volume change of -11 to 155 km^3 yr^-1 and an ice loss with a rate of -136 to -292 km^3 yr^-1.
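
    The elevation-to-mass conversion described above is, at its core, a product of area, elevation-change rate, and an assumed snow/ice density; a numeric sketch follows, where the area and rate values are placeholders rather than the paper's gridded estimates.

    ```python
    # Convert an elevation-change rate to a mass-change rate, dM/dt = rho * A * dh/dt.
    # Area and dh/dt below are placeholder values, not the study's gridded estimates.
    RHO_ICE = 900.0        # kg m^-3 (below 2000 m, as assumed in the text)
    AREA = 1.0e12          # m^2, hypothetical contributing area
    DHDT = -0.10           # m yr^-1, hypothetical mean elevation-change rate

    dmdt_kg = RHO_ICE * AREA * DHDT          # kg yr^-1
    dmdt_gton = dmdt_kg / 1.0e12             # 1 Gton = 10^12 kg
    print(f"{dmdt_gton:.1f} Gton/yr")        # -> -90.0 Gton/yr for these inputs
    ```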

  4. Enriched biodiversity data as a resource and service

    PubMed Central

    Balech, Bachir; Beard, Niall; Blissett, Matthew; Brenninkmeijer, Christian; van Dooren, Tom; Eades, David; Gosline, George; Groom, Quentin John; Hamann, Thomas D.; Hettling, Hannes; Hoehndorf, Robert; Holleman, Ayco; Hovenkamp, Peter; Kelbert, Patricia; King, David; Kirkup, Don; Lammers, Youri; DeMeulemeester, Thibaut; Mietchen, Daniel; Miller, Jeremy A.; Mounce, Ross; Nicolson, Nicola; Page, Rod; Pawlik, Aleksandra; Pereira, Serrano; Penev, Lyubomir; Richards, Kevin; Sautter, Guido; Shorthouse, David Peter; Tähtinen, Marko; Weiland, Claus; Williams, Alan R.; Sierra, Soraya

    2014-01-01

    Abstract Background: Recent years have seen a surge in projects that produce large volumes of structured, machine-readable biodiversity data. To make these data amenable to processing by generic, open source “data enrichment” workflows, they are increasingly being represented in a variety of standards-compliant interchange formats. Here, we report on an initiative in which software developers and taxonomists came together to address the challenges and highlight the opportunities in the enrichment of such biodiversity data by engaging in intensive, collaborative software development: The Biodiversity Data Enrichment Hackathon. Results: The hackathon brought together 37 participants (including developers and taxonomists, i.e. scientific professionals that gather, identify, name and classify species) from 10 countries: Belgium, Bulgaria, Canada, Finland, Germany, Italy, the Netherlands, New Zealand, the UK, and the US. The participants brought expertise in processing structured data, text mining, development of ontologies, digital identification keys, geographic information systems, niche modeling, natural language processing, provenance annotation, semantic integration, taxonomic name resolution, web service interfaces, workflow tools and visualisation. Most use cases and exemplar data were provided by taxonomists. One goal of the meeting was to facilitate re-use and enhancement of biodiversity knowledge by a broad range of stakeholders, such as taxonomists, systematists, ecologists, niche modelers, informaticians and ontologists. The suggested use cases resulted in nine breakout groups addressing three main themes: i) mobilising heritage biodiversity knowledge; ii) formalising and linking concepts; and iii) addressing interoperability between service platforms. Another goal was to further foster a community of experts in biodiversity informatics and to build human links between research projects and institutions, in response to recent calls to further such integration in this research domain. Conclusions: Beyond deriving prototype solutions for each use case, areas of inadequacy were discussed and are being pursued further. It was striking how many possible applications for biodiversity data there were and how quickly solutions could be put together when the normal constraints to collaboration were broken down for a week. Conversely, mobilising biodiversity knowledge from their silos in heritage literature and natural history collections will continue to require formalisation of the concepts (and the links between them) that define the research domain, as well as increased interoperability between the software platforms that operate on these concepts. PMID:25057255

  5. Issues and Strategies in Solving Multidisciplinary Optimization Problems

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya

    2013-01-01

    Optimization research at NASA Glenn Research Center has addressed the design of structures, aircraft and airbreathing propulsion engines. The accumulated multidisciplinary design activity is collected under a testbed entitled COMETBOARDS. Several issues were encountered during the solution of the problems. Four issues and the strategies adapted for their resolution are discussed. This is followed by a discussion on analytical methods that is limited to structural design application. An optimization process can lead to an inefficient local solution. This deficiency was encountered during design of an engine component. The limitation was overcome through an augmentation of animation into optimization. Optimum solutions obtained were infeasible for aircraft and airbreathing propulsion engine problems. Alleviation of this deficiency required a cascading of multiple algorithms. Profile optimization of a beam produced an irregular shape. Engineering intuition restored the regular shape for the beam. The solution obtained for a cylindrical shell by a subproblem strategy converged to a design that can be difficult to manufacture. Resolution of this issue remains a challenge. The issues and resolutions are illustrated through a set of problems: Design of an engine component, Synthesis of a subsonic aircraft, Operation optimization of a supersonic engine, Design of a wave-rotor-topping device, Profile optimization of a cantilever beam, and Design of a cylindrical shell. This chapter provides a cursory account of the issues. Cited references provide detailed discussion on the topics. Design of a structure can also be generated by traditional method and the stochastic design concept. Merits and limitations of the three methods (traditional method, optimization method and stochastic concept) are illustrated. In the traditional method, the constraints are manipulated to obtain the design and weight is back calculated. In design optimization, the weight of a structure becomes the merit function with constraints imposed on failure modes and an optimization algorithm is used to generate the solution. Stochastic design concept accounts for uncertainties in loads, material properties, and other parameters and solution is obtained by solving a design optimization problem for a specified reliability. Acceptable solutions can be produced by all the three methods. The variation in the weight calculated by the methods was found to be modest. Some variation was noticed in designs calculated by the methods. The variation may be attributed to structural indeterminacy. It is prudent to develop design by all three methods prior to its fabrication. The traditional design method can be improved when the simplified sensitivities of the behavior constraint is used. Such sensitivity can reduce design calculations and may have a potential to unify the traditional and optimization methods. Weight versus reliability traced out an inverted-S-shaped graph. The center of the graph corresponded to mean valued design. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure. Weight can be reduced to a small value for a most failure-prone design. Probabilistic modeling of load and material properties remained a challenge.
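
    The contrast drawn above between back-calculating a design from manipulated constraints and formulating weight as a merit function can be made concrete with a toy constrained minimization; this is generic, not one of the COMETBOARDS test cases, and the bar geometry, load, and allowable stress are invented.

    ```python
    # Toy design-optimization sketch: minimize the weight of two tension bars
    # subject to a stress constraint on each bar. Geometry, load, and allowable
    # stress are invented for illustration (not a COMETBOARDS case).
    from scipy.optimize import minimize

    LOAD = [10.0e3, 15.0e3]      # N carried by bar 1 and bar 2
    LENGTH = [1.0, 1.5]          # m
    DENSITY = 2700.0             # kg/m^3 (aluminium-like)
    SIGMA_ALLOW = 150.0e6        # Pa allowable stress

    def weight(areas):                       # merit function: total mass
        return sum(DENSITY * L * A for L, A in zip(LENGTH, areas))

    constraints = [                          # sigma = P/A <= sigma_allow
        {"type": "ineq", "fun": lambda a, i=i: SIGMA_ALLOW - LOAD[i] / a[i]}
        for i in range(2)
    ]

    res = minimize(weight, x0=[1e-3, 1e-3], bounds=[(1e-6, None)] * 2,
                   constraints=constraints, method="SLSQP")
    print(res.x, weight(res.x))              # optimal areas approach P / sigma_allow
    ```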

  6. Phase 0 and phase III transport in various organs: combined concept of phases in xenobiotic transport and metabolism.

    PubMed

    Döring, Barbara; Petzinger, Ernst

    2014-08-01

    The historical phasing concept of drug metabolism and elimination was introduced to comprise the two phases of metabolism: phase I metabolism for oxidations, reductions and hydrolyses, and phase II metabolism for synthesis. With this concept, biological membrane barriers obstructing the accessibility of metabolism sites in the cells for drugs were not considered. The concept of two phases was extended to a concept of four phases when drug transporters were detected that guided drugs and drug metabolites in and out of the cells. In particular, water soluble or charged drugs are virtually not able to overcome the phospholipid membrane barrier. Drug transporters belong to two main clusters of transporter families: the solute carrier (SLC) families and the ATP binding cassette (ABC) carriers. The ABC transporters comprise seven families with about 20 carriers involved in drug transport. All of them operate as pumps at the expense of ATP splitting. Embedded in the former phase concept, the term "phase III" was introduced by Ishikawa in 1992 for drug export by ABC efflux pumps. SLC comprise 52 families, from which many carriers are drug uptake transporters. Later on, this uptake process was referred to as the "phase 0 transport" of drugs. Transporters for xenobiotics in man and animal are most expressed in liver, but they are also present in extra-hepatic tissues such as in the kidney, the adrenal gland and lung. This review deals with the function of drug carriers in various organs and their impact on drug metabolism and elimination.

  7. What Should We Include in a Cultural Competence Curriculum? An Emerging Formative Evaluation Process to Foster Curriculum Development

    PubMed Central

    Crenshaw, Katie; Shewchuk, Richard M.; Qu, Haiyan; Staton, Lisa J.; Bigby, Judy Ann; Houston, Thomas K.; Allison, Jeroan; Estrada, Carlos A.

    2011-01-01

    Purpose To identify, prioritize, and organize components of a cultural competence curriculum to address disparities in cardiovascular disease. Method In 2006, four separate nominal group technique sessions were conducted with medical students, residents, community physicians, and academic physicians to generate and prioritize a list of concepts (i.e., ideas) to include in a curriculum. Afterward, 45 educators and researchers organized and prioritized the concepts using a card-sorting exercise. Multidimensional scaling (MDS) and hierarchical cluster analysis produced homogeneous groupings of related concepts and generated a cognitive map. The main outcome measures were the number of cultural competence concepts, their relative ranks, and the cognitive map. Results Thirty participants generated 61 concepts, of which 29 were identified by at least 2 participants. The cognitive map organized concepts into four clusters, interpreted as: (1) patient’s cultural background (e.g., information on cultures, habits, values); (2) provider and health care (e.g., clinical skills, awareness of one’s bias, patient-centeredness, and professionalism) and communication skills (e.g., history, stereotype avoidance, and health disparities epidemiology); (3) cross-culture (e.g., idiomatic expressions, examples of effective communication); and (4) resources to manage cultural diversity (e.g., translator guides, instructions and community resources). The MDS two-dimensional solution demonstrated a good fit (stress=0.07; R2=0.97). Conclusions A novel, combined approach allowed stakeholders’ inputs to identify and cognitively organize critical domains used to guide development of a cultural competence curriculum. Educators may use this approach to develop and organize educational content for their target audiences, especially in ill-defined areas like cultural competence. PMID:21248602

  8. The amino acid's backup bone - storage solutions for proteomics facilities.

    PubMed

    Meckel, Hagen; Stephan, Christian; Bunse, Christian; Krafzik, Michael; Reher, Christopher; Kohl, Michael; Meyer, Helmut Erich; Eisenacher, Martin

    2014-01-01

    Proteomics methods, especially high-throughput mass spectrometry analysis, have been continually developed and improved over the years. The analysis of complex biological samples produces large volumes of raw data. Data storage and recovery management pose substantial challenges to biomedical or proteomic facilities regarding backup and archiving concepts as well as hardware requirements. In this article we describe differences between the terms backup and archive with regard to manual and automatic approaches. We also introduce different storage concepts and technologies, from transportable media to professional solutions such as redundant array of independent disks (RAID) systems, network attached storage (NAS) and storage area networks (SAN). Moreover, we present a software solution, which we developed for the purpose of long-term preservation of large mass spectrometry raw data files on an object storage device (OSD) archiving system. Finally, advantages, disadvantages, and experiences from routine operations of the presented concepts and technologies are evaluated and discussed. This article is part of a Special Issue entitled: Computational Proteomics in the Post-Identification Era. Guest Editors: Martin Eisenacher and Christian Stephan. Copyright © 2013. Published by Elsevier B.V.

  9. Providers and Patients Caught Between Standardization and Individualization: Individualized Standardization as a Solution

    PubMed Central

    Ansmann, Lena; Pfaff, Holger

    2018-01-01

    In their 2017 article, Mannion and Exworthy provide a thoughtful and theory-based analysis of two parallel trends in modern healthcare systems and their competing and conflicting logics: standardization and customization. This commentary further discusses the challenge of treatment decision-making in times of evidence-based medicine (EBM), shared decision-making and personalized medicine. From the perspective of systems theory, we propose the concept of individualized standardization as a solution to the problem. According to this concept, standardization is conceptualized as a guiding framework leaving room for individualization in the patient-physician interaction. The theoretical background is the concept of context management according to systems theory. Moreover, the comment suggests multidisciplinary teams as a possible solution for the integration of standardization and individualization, using the example of multidisciplinary tumor conferences and highlighting their limitations. The comment also supports the authors’ statement of the patient as co-producer and introduces the idea that the competing logics of standardization and individualization are a matter of perspective on the macro, meso and micro levels. PMID:29626403

  10. Trade Study of System Level Ranked Radiation Protection Concepts for Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Cerro, Jeffrey A

    2013-01-01

    A strategic focus area for NASA is to pursue the development of technologies which support exploration in space beyond the currently inhabited region of low Earth orbit. An unresolved issue for crewed deep space exploration involves limiting crew radiation exposure to below acceptable levels, considering both solar particle events and galactic cosmic ray contributions to dosage. Galactic cosmic ray mitigation is not addressed in this paper, but by addressing credible, easily implemented, and mass-efficient solutions for the possibility of solar particle events, additional margin is provided that can be used for cosmic ray dose accumulation. As a result, NASA's Advanced Engineering Systems project office initiated this Radiation Storm Shelter design activity. This paper reports on the first-year results of an expected three-year Storm Shelter study effort which will mature concepts and operational scenarios that protect exploration astronauts from solar particle radiation events. Large trade space definition, candidate concept ranking, and a planned demonstration comprised the majority of FY12 activities. A key system performance parameter is minimization of the mass increase needed to provide a safe environment. Total system mass, operational assessments, and other defined protection system metrics guide the decision to proceed with concept development. After a downselect to four primary methods, the concepts were analyzed for dosage severity and the amount of shielding mass necessary to bring dosage to acceptable values. Besides analytical assessments, subscale models of several concepts and one full-scale concept demonstrator were created. FY12 work concluded with a plan to demonstrate test articles of two selected approaches. The process of arriving at these selections and their currently envisioned implementation are presented in this paper.

  11. Competition: A Model for Conception.

    ERIC Educational Resources Information Center

    Kildea, Alice E.

    1983-01-01

    The Model for the Conception of Competition is offered as a means of discovering problems and solutions surrounding competition. Distinctions are made between existing definitions, and insight into how competition relates to human life and cosmic interaction is given. Survival, mastery, and transcendence, modes of psychological thought, are…

  12. Contemplating Symbolic Literacy of First Year Mathematics Students

    ERIC Educational Resources Information Center

    Bardini, Caroline; Pierce, Robyn; Vincent, Jill

    2015-01-01

    Analysis of mathematical notations must consider both syntactical aspects of symbols and the underpinning mathematical concept(s) conveyed. We argue that the construct of "syntax template" provides a theoretical framework to analyse undergraduate mathematics students' written solutions, where we have identified several types of…

  13. Recovery approach to the care of people with dementia: decision making and 'best interests' concerns.

    PubMed

    Martin, G

    2009-09-01

    The concept of 'recovery' has been central to the discussion of the care of people with mental health problems in recent years. In this paper these ideas are applied to the care of people with dementia in an attempt to focus nursing practice on the notion that it is possible to involve this group of patients in their own decision-making processes. It is acknowledged that this is not always possible without support and advocacy by nurses and other carers, who must take on board the need to arrive at solutions to problems or change that are in the person's best interests. The provisions of the Mental Capacity Act 2005 are key to this discussion, and ways forward are recommended, including a nursing model for change, in an effort to bring together the concepts addressed in this paper. The conclusion reached is that the recovery approach presents some difficulties when applied to people with dementia, but it remains an essential aspect of the care process which, together with the provisions of the Mental Capacity Act, could bring about radical improvements to the lives of this group of vulnerable people.

  14. Incorporating geoethics into environmental engineering lectures - three years of experience from international students visiting Iceland

    NASA Astrophysics Data System (ADS)

    Finger, David C.

    2017-04-01

    Never before has humankind faced greater environmental challenges than today. The challenges are overwhelming: a growing human population, increasing ecological footprints, accelerating climate change, severe soil degradation, eutrophication of vital freshwater resources, acidification of the oceans, health-threatening air pollution and rapid biodiversity loss, to name just a few. It is the task of environmental scientists to transmit established knowledge on these complex and interdisciplinary challenges while demonstrating that management and engineering solutions exist to meet them. In this presentation I outline the concept of my environmental impact (EI) assessment course, in which prospective engineering students select a topic of their choice, assess its environmental impacts, discuss it with relevant stakeholders and come up with innovative solutions. The course is structured in three parts: i) lectures on theoretical methods frequently used within the EI assessment process, ii) interaction with local businesses to acquire first-hand experience and iii) hands-on training by writing an EI statement on a selected topic (see link below). Over the course of three years, over 70 prospective engineering students from all over the world have not only acquired environmental system understanding, but have also enhanced their awareness and developed potential solutions to mitigate, compensate and reverse persistent environmental challenges. Most importantly, during this process all involved stakeholders (students, teachers, industry partners, governmental bodies and NGO partners) will hopefully develop a mutual understanding of the above-mentioned environmental challenges and engage in the open and constructive dialogue necessary to generate acceptable solutions. Link to student projects from previous years: https://fingerd.jimdo.com/teaching/courses/environmental-impact-assessment/

  15. The Long and Winding Road to Innovation.

    PubMed

    Beyar, Rafael

    2015-07-30

    Medicine is developing through biomedical technology and innovations. The goal of any innovation in medicine is to improve patient care. Exponential growth in technology has led to the unprecedented growth of medical technology over the last 50 years. Clinician-scientists need to understand the complexity of the innovation process, from concept to product release, when working to bring new clinical solutions to the bedside. Hence, an overview of the innovation process is provided herein. The process involves an invention designed to solve an unmet need, followed by prototype design and optimization, animal studies, pilot and pivotal studies, and regulatory approval. The post-marketing strategy relative to funding, along with analysis of cost benefit, is a critical component for the adoption of new technologies. Examples of the road to innovation are provided, based on the experience with development of the transcatheter aortic valve. Finally, ideas are presented to contribute to the further development of this worldwide trend in innovation.

  16. The Long and Winding Road to Innovation

    PubMed Central

    Beyar, Rafael

    2015-01-01

    Medicine is developing through biomedical technology and innovations. The goal of any innovation in medicine is to improve patient care. Exponential growth in technology has led to the unprecedented growth of medical technology over the last 50 years. Clinician-scientists need to understand the complexity of the innovation process, from concept to product release, when working to bring new clinical solutions to the bedside. Hence, an overview of the innovation process is provided herein. The process involves an invention designed to solve an unmet need, followed by prototype design and optimization, animal studies, pilot and pivotal studies, and regulatory approval. The post-marketing strategy relative to funding, along with analysis of cost benefit, is a critical component for the adoption of new technologies. Examples of the road to innovation are provided, based on the experience with development of the transcatheter aortic valve. Finally, ideas are presented to contribute to the further development of this worldwide trend in innovation. PMID:26241234

  17. On-line application of near-infrared spectroscopy for monitoring water levels in parts per million in a manufacturing-scale distillation process.

    PubMed

    Lambertus, Gordon; Shi, Zhenqi; Forbes, Robert; Kramer, Timothy T; Doherty, Steven; Hermiller, James; Scully, Norma; Wong, Sze Wing; LaPack, Mark

    2014-01-01

    An on-line analytical method based on transmission near-infrared spectroscopy (NIRS) for the quantitative determination of water concentrations (in parts per million) was developed and applied to the manufacture of a pharmaceutical intermediate. Calibration models for water analysis, built at the development site and applied at the manufacturing site, were successfully demonstrated during six manufacturing runs at a 250-gallon scale. The water measurements will be used as a forward-processing control point following distillation of a toluene product solution prior to use in a Grignard reaction. The most significant impact of using this NIRS-based process analytical technology (PAT) to replace off-line measurements is the significant reduction in the risk of operator exposure through the elimination of sampling of a severely lachrymatory and mutagenic compound. The work described in this report illustrates the development effort from proof-of-concept phase to manufacturing implementation.
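
    As a rough illustration of the kind of chemometric calibration described above, the sketch below fits a partial least squares (PLS) model relating simulated near-infrared spectra to water content in ppm. The wavelength range, band shape and noise levels are invented; this is not the authors' validated calibration model.

    ```python
    # Hedged sketch: a PLS calibration relating NIR spectra to water content (ppm),
    # illustrative of the chemometric modelling described above. The spectra are
    # simulated, not real process data.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    wavelengths = np.linspace(1350, 1550, 200)          # nm, near a water band
    water_band = np.exp(-0.5 * ((wavelengths - 1450) / 15.0) ** 2)

    n_samples = 60
    water_ppm = rng.uniform(50, 2000, n_samples)
    baseline = rng.normal(0.0, 0.02, (n_samples, wavelengths.size))
    spectra = np.outer(water_ppm * 1e-4, water_band) + baseline  # absorbance-like

    X_train, X_test, y_train, y_test = train_test_split(
        spectra, water_ppm, test_size=0.25, random_state=0)

    pls = PLSRegression(n_components=3)
    pls.fit(X_train, y_train)
    pred = pls.predict(X_test).ravel()
    rmsep = np.sqrt(np.mean((pred - y_test) ** 2))
    print(f"RMSEP on held-out spectra: {rmsep:.1f} ppm")
    ```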

  18. Model-based analysis of coupled equilibrium-kinetic processes: indirect kinetic studies of thermodynamic parameters using the dynamic data.

    PubMed

    Emami, Fereshteh; Maeder, Marcel; Abdollahi, Hamid

    2015-05-07

    Thermodynamic studies of equilibrium chemical reactions that are coupled to kinetic processes are largely intractable with traditional approaches. In this work, the new concept of a generalized kinetic study of thermodynamic parameters is introduced for dynamic data. Examples of equilibria intertwined with kinetic mechanisms include molecular charge-transfer complex formation reactions, pH-dependent degradation of chemical compounds and tautomerization kinetics in micellar solutions. Model-based global analysis, with the possibility of calculating and embedding the equilibrium and kinetic parameters into the fitting algorithm, has allowed the complete analysis of these complex reaction mechanisms. After the fitting process, the optimal equilibrium and kinetic parameters, together with estimates of their standard deviations, were obtained. This work opens up a promising new avenue for obtaining equilibrium constants through kinetic data analysis for reactions that involve equilibrium processes.
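
    As a small illustration of fitting equilibrium and kinetic parameters jointly from dynamic data, the sketch below uses the pH-dependent degradation example: an acid HA <-> A- with dissociation constant Ka, where only A- degrades with rate constant k, so the observed decay constant is k*Ka/(Ka + [H+]). The data are simulated and the model is a deliberately simple stand-in for the full model-based global analysis.

    ```python
    # Hedged sketch of a joint fit of coupled equilibrium and kinetic parameters,
    # using pH-dependent degradation: HA <-> A- (constant Ka), and only A-
    # degrades with rate constant k, so k_obs = k * Ka / (Ka + [H+]).
    import numpy as np
    from scipy.optimize import least_squares

    def k_obs(k, pKa, pH):
        Ka, H = 10.0 ** -pKa, 10.0 ** -pH
        return k * Ka / (Ka + H)

    def model(params, t, pH):
        k, pKa = params
        return np.exp(-k_obs(k, pKa, pH) * t)      # C(t)/C0

    # Simulated "dynamic data": decay curves at three pH values (true k=0.30/h, pKa=4.5).
    rng = np.random.default_rng(1)
    t = np.linspace(0, 10, 25)                      # hours
    pH_values = [3.0, 4.5, 6.0]
    data = {pH: model([0.30, 4.5], t, pH) + rng.normal(0, 0.01, t.size)
            for pH in pH_values}

    def residuals(params):
        return np.concatenate([model(params, t, pH) - data[pH] for pH in pH_values])

    fit = least_squares(residuals, x0=[0.1, 5.0])
    print("fitted k = %.3f /h, pKa = %.2f" % tuple(fit.x))
    ```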

  19. Biosorption: current perspectives on concept, definition and application.

    PubMed

    Fomina, Marina; Gadd, Geoffrey Michael

    2014-05-01

    Biosorption is a physico-chemical and metabolically-independent process based on a variety of mechanisms including absorption, adsorption, ion exchange, surface complexation and precipitation. Biosorption processes are highly important in the environment and conventional biotreatment processes. As a branch of biotechnology, biosorption has been aimed at the removal or recovery of organic and inorganic substances from solution by biological material which can include living or dead microorganisms and their components, seaweeds, plant materials, industrial and agricultural wastes and natural residues. For decades biosorption has been heralded as a promising cost-effective clean-up biotechnology. Despite significant progress in our understanding of this complex phenomenon and a dramatic increase in publications in this research area, commercialization of biosorption technologies has been limited so far. This article summarizes existing knowledge on various aspects of the fundamentals and applications of biosorption and critically reviews the obstacles to commercial success and future perspectives. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. The study of electrochemical cell taught by problem-based learning

    NASA Astrophysics Data System (ADS)

    Srichaitung, Paisan

    2018-01-01

    In teaching Chemistry, the researcher found that students were not able to seek knowledge on their own or apply it to their everyday lives. The researcher was therefore interested in creating an activity that would have students construct their own knowledge, develop science process skills, and apply knowledge in their everyday lives. The researcher presents a teaching activity on the electrochemical cell using problem-based learning for Mathayom five students of Thai Christian School. The activity focused on electron transfer in the galvanic cell. Students were asked to design a galvanic cell, using any solution of their choice, that could light up a bulb. Students were divided into groups of two, for a total of seven groups. Each group searched for information about electron transfer in galvanic cells from books, the internet, or other sources. After acquiring the concepts they had searched for, students designed and carried out their experiment. Finally, each group had twenty minutes to present their galvanic cell and the solution used to light up the bulb in front of the classroom, including a demonstration of the experiment, and five minutes to answer their classmates' questions. The presentations took four class periods for the seven groups. After the presentations, the researcher had students discuss and summarize the main ideas of electron transfer in galvanic cells. Observation of students' behavior in each group showed that 85.7 percent of the students developed science process skills and transferred their knowledge completely through their presentations. On the post-test, 92.85 percent of the students were able to explain the concept of the galvanic cell and describe the preparation and selection of experimental equipment. Furthermore, students developed their skills and scientific process and sought knowledge on their own, which enabled them to explore various ways of solving problems. This problem-based learning approach can be applied to teaching activities in other subjects.

  1. A System-Oriented Approach for the Optimal Control of Process Chains under Stochastic Influences

    NASA Astrophysics Data System (ADS)

    Senn, Melanie; Schäfer, Julian; Pollak, Jürgen; Link, Norbert

    2011-09-01

    Process chains in manufacturing consist of multiple connected processes in terms of dynamic systems. The properties of a product passing through such a process chain are influenced by the transformation of each single process. There exist various methods for the control of individual processes, such as classical state controllers from cybernetics or function mapping approaches realized by statistical learning. These controllers ensure that a desired state is obtained at process end despite variations in the input and disturbances. The interactions between the single processes are thereby neglected, but play an important role in the optimization of the entire process chain. We divide the overall optimization into two phases: (1) the solution of the optimization problem by Dynamic Programming to find the optimal control variable values for each process for any encountered end state of its predecessor and (2) the application of the optimal control variables at runtime for the detected initial process state. The optimization problem is solved by selecting adequate control variables for each process in the chain, working backwards from predefined quality requirements on the final product. For the demonstration of the proposed concept, we have chosen a process chain from sheet metal manufacturing with simplified transformation functions.
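
    A minimal sketch of phase (1), the backward Dynamic Programming pass, is given below for a two-process chain with a one-dimensional discretized product state. The transformation, cost terms and quality requirement are invented placeholders rather than the paper's sheet-metal model; phase (2) then reduces to a table lookup of the stored policy.

    ```python
    # Toy sketch of phase (1): backward Dynamic Programming over a discretized
    # product state for a two-process chain. States, controls, transformations and
    # costs are illustrative placeholders.
    import numpy as np

    states = np.linspace(0.0, 1.0, 21)        # discretized product property
    controls = np.linspace(-0.2, 0.2, 9)      # admissible control values per process
    target = 0.8                               # quality requirement on final product

    def step(x, u):
        """Simple process transformation: shift the property, clipped to [0, 1]."""
        return float(np.clip(x + u, 0.0, 1.0))

    def nearest(x):
        return int(np.argmin(np.abs(states - x)))

    n_proc = 2
    # cost_to_go[p][i]: optimal remaining cost when process p receives state i.
    cost_to_go = np.zeros((n_proc + 1, states.size))
    cost_to_go[n_proc] = (states - target) ** 2          # terminal quality penalty
    policy = np.zeros((n_proc, states.size))

    for p in range(n_proc - 1, -1, -1):                  # backward over processes
        for i, x in enumerate(states):
            costs = [0.1 * u**2 + cost_to_go[p + 1][nearest(step(x, u))]
                     for u in controls]                   # control effort + future cost
            best = int(np.argmin(costs))
            cost_to_go[p][i] = costs[best]
            policy[p][i] = controls[best]

    # Phase (2): at runtime, look up the optimal control for the detected state.
    x0 = 0.35
    print("optimal control for process 1 at state %.2f: %+.3f" % (x0, policy[0][nearest(x0)]))
    ```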

  2. Directly Observing Micelle Fusion and Growth in Solution by Liquid-Cell Transmission Electron Microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parent, Lucas R.; Bakalis, Evangelos; Ramírez-Hernández, Abelardo

    Amphiphilic small molecules and polymers form commonplace nanoscale macromolecular compartments and bilayers, and as such are truly essential components in all cells and in many cellular processes. The nature of these architectures, including their formation, phase changes, and stimuli-response behaviors, is necessary for the most basic functions of life, and over the past half-century, these natural micellar structures have inspired a vast diversity of industrial products, from biomedicines to detergents, lubricants, and coatings. The importance of these materials and their ubiquity have made them the subject of intense investigation regarding their nanoscale dynamics with increasing interest in obtaining sufficient temporal and spatial resolution to directly observe nanoscale processes. However, the vast majority of experimental methods involve either bulk-averaging techniques including light, neutron, and X-ray scattering, or are static in nature including even the most advanced cryogenic transmission electron microscopy techniques. Here, we employ in situ liquid-cell transmission electron microscopy (LCTEM) to directly observe the evolution of individual amphiphilic block copolymer micellar nanoparticles in solution, in real time with nanometer spatial resolution. These observations, made on a proof-of-concept bioconjugate polymer amphiphile, revealed growth and evolution occurring by unimer addition processes and by particle-particle collision-and-fusion events. The experimental approach, combining direct LCTEM observation, quantitative analysis of LCTEM data, and correlated in silico simulations, provides a unique view of solvated soft matter nanoassemblies as they morph and evolve in time and space, enabling us to capture these phenomena in solution.

  3. Two-Black Box Concept for Warhead Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bates, Cameron Russell; Frame, Katherine Chiyoko; Mckigney, Edward Allen

    2017-03-06

    We have created a possible solution for meeting the requirements of certification/authentication while still employing complicated criteria. Technical solutions for protecting information from the host in an inspection environment need to be assessed by those with specific expertise, but LANL can still study the verification problem. The two-black-box framework developed here provides another potential solution to the confidence vs. certification paradox.

  4. Research into display sharing techniques for distributed computing environments

    NASA Technical Reports Server (NTRS)

    Hugg, Steven B.; Fitzgerald, Paul F., Jr.; Rosson, Nina Y.; Johns, Stephen R.

    1990-01-01

    The X-based Display Sharing solution for distributed computing environments is described. The Display Sharing prototype includes the base functionality for telecast and display copy requirements. Since the prototype implementation is modular and the system design provides flexibility for Mission Control Center Upgrade (MCCU) operational considerations, the prototype implementation can serve as the baseline for a production Display Sharing implementation. To facilitate this process the following discussions are presented: theory of operation; system architecture; using the prototype; software description; research tools; prototype evaluation; and outstanding issues. The prototype is based on the concept of a dedicated central host performing the majority of the Display Sharing processing, allowing minimal impact on each individual workstation. Each workstation participating in Display Sharing hosts programs that facilitate the user's access to Display Sharing on that host machine.

  5. HSR combustion analytical research

    NASA Technical Reports Server (NTRS)

    Nguyen, H. Lee

    1992-01-01

    Increasing the pressure and temperature of the engines of a new generation of supersonic airliners increases the emissions of nitrogen oxides (NO(x)) to a level that would have an adverse impact on the Earth's protective ozone layer. In the process of evolving and implementing low-emissions combustor technologies, NASA LeRC has pursued a combustion analysis code program to guide combustor design processes, to identify the potential concepts of greatest promise, and to optimize them at low cost and with short turnaround time. The computational analyses are evaluated at actual engine operating conditions. The approach is to upgrade and apply advanced computer programs for gas turbine applications. Efforts were made to further improve the codes' capabilities for modeling the physics and the numerical methods of solution. Test cases and measurements from experiments are then used for code validation.

  6. Role of OpenEHR as an open source solution for the regional modelling of patient data in obstetrics.

    PubMed

    Pahl, Christina; Zare, Mojtaba; Nilashi, Mehrbakhsh; de Faria Borges, Marco Aurélio; Weingaertner, Daniel; Detschew, Vesselin; Supriyanto, Eko; Ibrahim, Othman

    2015-06-01

    This work investigates whether openEHR, with its reference model, archetypes and templates, is suitable for the digital representation of demographic as well as clinical data. Moreover, it elaborates openEHR as a tool for modelling Hospital Information Systems on a regional level based on a national logical infrastructure. OpenEHR is a dual-model approach developed for the modelling of Hospital Information Systems enabling semantic interoperability; the use of dual-model-based Electronic Healthcare Record systems represents a holistic solution to this. Modelling data in the field of obstetrics is a challenge, since different regions demand locally specific information for the process of treatment. Smaller health units in developing countries like Brazil or Malaysia, which until recently handled automatable processes like the storage of sensitive patient data in paper form, are starting organizational reconstruction processes. This archetype proof-of-concept investigation tried out some elements of the openEHR methodology in cooperation with a health unit in Colombo, Brazil. Two legal forms provided by the Brazilian Ministry of Health were analyzed and classified into demographic and clinical data. The LinkEHR-Ed editor was used to read, edit and create archetypes. Results show that 33 clinical and demographic concepts, which are necessary to cover data demanded by the Unified National Health System, were identified. Of these concepts, 61% were reused and 39% modified to cover domain requirements. The detailed process of reuse, modification and creation of archetypes is shown. We conclude that, although a major part of demographic and clinical patient data was already represented by existing archetypes, a significant part required major modifications. In this study openEHR proved to be a highly suitable tool for the modelling of complex health data. In combination with the LinkEHR-Ed software it offers user-friendly and highly applicable tools, although the complexity created by the vast specifications requires expert networks to define generally accepted clinical models. Finally, this project has pointed out the main benefits, including high coverage of obstetrics data on the Clinical Knowledge Manager, simple modelling, and a wide network and support base for openEHR. The barriers described include the allocation of clinical content to respective archetypes, as well as slow adoption of changes on the Clinical Knowledge Manager, leading to redundant efforts in data contribution, which need to be addressed in future work. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. A New Framework and Prototype Solution for Clinical Decision Support and Research in Genomics and Other Data-intensive Fields of Medicine

    PubMed Central

    Evans, James P.; Wilhelmsen, Kirk C.; Berg, Jonathan; Schmitt, Charles P.; Krishnamurthy, Ashok; Fecho, Karamarie; Ahalt, Stanley C.

    2016-01-01

    Introduction: In genomics and other fields, it is now possible to capture and store large amounts of data in electronic medical records (EMRs). However, it is not clear if the routine accumulation of massive amounts of (largely uninterpretable) data will yield any health benefits to patients. Nevertheless, the use of large-scale medical data is likely to grow. To meet emerging challenges and facilitate optimal use of genomic data, our institution initiated a comprehensive planning process that addresses the needs of all stakeholders (e.g., patients, families, healthcare providers, researchers, technical staff, administrators). Our experience with this process and a key genomics research project contributed to the proposed framework. Framework: We propose a two-pronged Genomic Clinical Decision Support System (CDSS) that encompasses the concept of the “Clinical Mendeliome” as a patient-centric list of genomic variants that are clinically actionable and introduces the concept of the “Archival Value Criterion” as a decision-making formalism that approximates the cost-effectiveness of capturing, storing, and curating genome-scale sequencing data. We describe a prototype Genomic CDSS that we developed as a first step toward implementation of the framework. Conclusion: The proposed framework and prototype solution are designed to address the perspectives of stakeholders, stimulate effective clinical use of genomic data, drive genomic research, and meet current and future needs. The framework also can be broadly applied to additional fields, including other ‘-omics’ fields. We advocate for the creation of a Task Force on the Clinical Mendeliome, charged with defining Clinical Mendeliomes and drafting clinical guidelines for their use. PMID:27195307

  8. Deriving a probabilistic syntacto-semantic grammar for biomedicine based on domain-specific terminologies

    PubMed Central

    Fan, Jung-Wei; Friedman, Carol

    2011-01-01

    Biomedical natural language processing (BioNLP) is a useful technique that unlocks valuable information stored in textual data for practice and/or research. Syntactic parsing is a critical component of BioNLP applications that rely on correctly determining the sentence and phrase structure of free text. In addition to dealing with the vast amount of domain-specific terms, a robust biomedical parser needs to model the semantic grammar to obtain viable syntactic structures. With either a rule-based or corpus-based approach, the grammar engineering process requires substantial time and knowledge from experts, and does not always yield a semantically transferable grammar. To reduce the human effort and to promote semantic transferability, we propose an automated method for deriving a probabilistic grammar based on a training corpus consisting of concept strings and semantic classes from the Unified Medical Language System (UMLS), a comprehensive terminology resource widely used by the community. The grammar is designed to specify noun phrases only due to the nominal nature of the majority of biomedical terminological concepts. Evaluated on manually parsed clinical notes, the derived grammar achieved a recall of 0.644, precision of 0.737, and average cross-bracketing of 0.61, which demonstrated better performance than a control grammar with the semantic information removed. Error analysis revealed shortcomings that could be addressed to improve performance. The results indicated the feasibility of an approach which automatically incorporates terminology semantics in the building of an operational grammar. Although the current performance of the unsupervised solution does not adequately replace manual engineering, we believe once the performance issues are addressed, it could serve as an aide in a semi-supervised solution. PMID:21549857
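
    The core of such a grammar derivation is relative-frequency estimation of production probabilities, P(A -> beta) = count(A -> beta) / count(A). The toy sketch below illustrates this on a handful of invented semantic tag sequences; it is not the UMLS-derived grammar itself.

    ```python
    # Toy sketch: relative-frequency estimation of production probabilities for a
    # semantic noun-phrase grammar, in the spirit of the terminology-derived
    # approach described above. The tag sequences are invented placeholders.
    from collections import Counter, defaultdict

    # Each training item: (semantic class of the whole concept string,
    #                      sequence of semantic tags of its tokens).
    training = [
        ("Disease",   ("Anatomy", "Disease")),      # e.g. "lung cancer"
        ("Disease",   ("Anatomy", "Disease")),
        ("Disease",   ("Modifier", "Disease")),     # e.g. "chronic bronchitis"
        ("Procedure", ("Anatomy", "Procedure")),    # e.g. "liver biopsy"
    ]

    lhs_counts = Counter(lhs for lhs, _ in training)
    rule_counts = Counter(training)

    # P(A -> beta) = count(A -> beta) / count(A)
    grammar = defaultdict(dict)
    for (lhs, rhs), c in rule_counts.items():
        grammar[lhs][rhs] = c / lhs_counts[lhs]

    for lhs, rules in grammar.items():
        for rhs, p in sorted(rules.items(), key=lambda kv: -kv[1]):
            print(f"{lhs:10s} -> {' '.join(rhs):22s} p = {p:.2f}")
    ```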

  9. Integrating technology education concepts into China's educational system

    NASA Astrophysics Data System (ADS)

    Yang, Faxian

    The problem of this study was to develop a strategy for integrating technology education concepts within the Chinese mathematics and science curricula. The researcher used a case study as the basic methodology. It included three methods for collecting data: literature review, field study in junior and senior secondary schools in America and China, and interviews with experienced educators who were familiar with the status of technology education programs in the selected countries. The data came from the following areas: Japan, Taiwan, the United Kingdom, China, and five states in the United States: Illinois, Iowa, Maryland, Massachusetts, and New York. The researcher summarized each state's and country's educational data, identified the advantages and disadvantages of their current technology education programs, and identified the major concepts within each program. The process determined that the identified concepts would be readily acceptable within the current Chinese educational system. Modernization of industry, agriculture, science and technology, and defense have been recent objectives of the Chinese government. Therefore, Chinese understanding of technology, or technology education, has become important for the country. However, traditional thought and culture curb the implementation of technology education within China's current education system. The proposed solution was to integrate technology education concepts into China's mathematics and science curricula. The purpose of the integration was to bring new thoughts and methods into the current educational structure. It was concluded that the proposed model and interventions would allow Chinese educators to carry out the integration into China's education system.

  10. Constrained Optimization Methods in Health Services Research-An Introduction: Report 1 of the ISPOR Optimization Methods Emerging Good Practices Task Force.

    PubMed

    Crown, William; Buyukkaramikli, Nasuh; Thokala, Praveen; Morton, Alec; Sir, Mustafa Y; Marshall, Deborah A; Tosh, Jon; Padula, William V; Ijzerman, Maarten J; Wong, Peter K; Pasupathy, Kalyan S

    2017-03-01

    Providing health services with the greatest possible value to patients and society given the constraints imposed by patient characteristics, health care system characteristics, budgets, and so forth relies heavily on the design of structures and processes. Such problems are complex and require a rigorous and systematic approach to identify the best solution. Constrained optimization is a set of methods designed to identify efficiently and systematically the best solution (the optimal solution) to a problem characterized by a number of potential solutions in the presence of identified constraints. This report identifies 1) key concepts and the main steps in building an optimization model; 2) the types of problems for which optimal solutions can be determined in real-world health applications; and 3) the appropriate optimization methods for these problems. We first present a simple graphical model based on the treatment of "regular" and "severe" patients, which maximizes the overall health benefit subject to time and budget constraints. We then relate it back to how optimization is relevant in health services research for addressing present day challenges. We also explain how these mathematical optimization methods relate to simulation methods, to standard health economic analysis techniques, and to the emergent fields of analytics and machine learning. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
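
    The report's introductory example (maximize overall health benefit from treating "regular" and "severe" patients subject to time and budget constraints) can be written as a small linear program. The sketch below uses invented coefficients and ignores integrality, so it is only a schematic of the formulation.

    ```python
    # Hedged sketch of the introductory example: choose how many "regular" and
    # "severe" patients to treat to maximize total health benefit subject to
    # clinic time and budget constraints. All coefficients are invented.
    from scipy.optimize import linprog

    benefit = [1.0, 5.0]        # benefit units per regular / severe patient
    time_per = [0.5, 2.0]       # clinician hours per patient
    cost_per = [100.0, 600.0]   # cost per patient

    total_time, total_budget = 40.0, 10000.0

    # linprog minimizes, so negate the benefit to maximize it.
    res = linprog(c=[-b for b in benefit],
                  A_ub=[time_per, cost_per],
                  b_ub=[total_time, total_budget],
                  bounds=[(0, None), (0, None)])

    regular, severe = res.x
    print(f"treat {regular:.1f} regular and {severe:.1f} severe patients; "
          f"benefit = {-res.fun:.1f}")
    ```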

  11. Information needs related to extension service and community outreach.

    PubMed

    Bottcher, Robert W

    2003-06-01

    Air quality affects everyone. Some people are affected by air quality impacts, regulations, and technological developments in several ways. Stakeholders include the medical community, ecologists, government regulators, industries, technology providers, academic professionals, concerned citizens, the news media, and elected officials. Each of these groups may perceive problems and opportunities differently, but all need access to information as it is developed. The diversity and complexity of air quality problems contribute to the challenges faced by extension and outreach professionals who must communicate with stakeholders having diverse backgrounds. Gases, particulates, biological aerosols, pathogens, and odors all require expensive and relatively complex technology to measure and control. Economic constraints affect the ability of regulators and others to measure air quality, and industry and others to control it. To address these challenges, while communicating air quality research results and concepts to stakeholders, three areas of information needs are evident. (1) A basic understanding of the fundamental concepts regarding air pollutants and their measurement and control is needed by all stakeholders; the Extension Specialist, to be effective, must help people move some distance up the learning curve. (2) Each problem or set of problems must be reasonably well defined since comprehensive solution of all problems simultaneously may not be feasible; for instance, the solution of an odor problem associated with animal production may not address atmospheric effects due to ammonia emissions. (3) The integrity of the communication process must be preserved by avoiding prejudice and protectionism; although stakeholders may seek to modify information to enhance their interests, extension and outreach professionals must be willing to present unwelcome information or admit to a lack of information. A solid grounding in fundamental concepts, careful and fair problem definition, and a resolute commitment to integrity and credibility will enable effective communication of air quality information to and among diverse stakeholders.

  12. Understanding transporter specificity and the discrete appearance of channel-like gating domains in transporters

    PubMed Central

    Diallinas, George

    2014-01-01

    Transporters are ubiquitous proteins mediating the translocation of solutes across cell membranes, a biological process involved in nutrition, signaling, neurotransmission, cell communication and drug uptake or efflux. Similarly to enzymes, most transporters have a single substrate binding-site and thus their activity follows Michaelis-Menten kinetics. Substrate binding elicits a series of structural changes, which produce a transporter conformer open toward the side opposite to the one from where the substrate was originally bound. This mechanism, involving alternate outward- and inward-facing transporter conformers, has gained significant support from structural, genetic, biochemical and biophysical approaches. Most transporters are specific for a given substrate or a group of substrates with similar chemical structure, but substrate specificity and/or affinity can vary dramatically, even among members of a transporter family that show high overall amino acid sequence and structural similarity. The current view is that transporter substrate affinity or specificity is determined by a small number of interactions a given solute can make within a specific binding site. However, genetic, biochemical and in silico modeling studies with the purine transporter UapA of the filamentous ascomycete Aspergillus nidulans have challenged this dogma. This review highlights results leading to a novel concept, stating that substrate specificity, but also transport kinetics and transporter turnover, are determined by subtle intramolecular interactions between a major substrate binding site and independent outward- or cytoplasmically-facing gating domains, analogous to those present in channels. This concept is supported by recent structural evidence from several, phylogenetically and functionally distinct transporter families. The significance of this concept is discussed in relationship to the role and potential exploitation of transporters in drug action. PMID:25309439

  13. Hydrophobic Solvation: Aqueous Methane Solutions

    ERIC Educational Resources Information Center

    Konrod, Oliver; Lankau, Timm

    2007-01-01

    A basic introduction to the concept of a solvation shell around an apolar solute, as well as its detection, is presented. The hydrophobic solvation of toluene is found to be a good teaching example that connects macroscopic, phenomenological thermodynamic results with an atomistic point of view.

  14. Inorganic and Protein Crystal Assembly in Solutions

    NASA Technical Reports Server (NTRS)

    Chernov, A. A.

    2005-01-01

    The basic kinetic and thermodynamic concepts of crystal growth will be revisited in view of recent AFM and interferometric findings. These concepts are as follows: 1) the Kossel crystal model, which allows only one kink type on the crystal surface; the modern theory is developed overwhelmingly for the Kossel model; 2) the presumption that intensive step fluctuations maintain kink density sufficiently high to allow applicability of the Gibbs-Thomson law; 3) the common experience that unlimited step bunching (morphological instability) during layer growth from solutions and supercooled melts always takes place if the step flow direction coincides with that of the fluid.

  15. Minimize system cost by choosing optimal subsystem reliability and redundancy

    NASA Technical Reports Server (NTRS)

    Suich, Ronald C.; Patterson, Richard L.

    1993-01-01

    The basic question which we address in this paper is how to choose among competing subsystems. This paper utilizes both reliabilities and costs to find the subsystems with the lowest overall expected cost. The paper begins by reviewing some of the concepts of expected value. We then address the problem of choosing among several competing subsystems. These concepts are then applied to k-out-of-n: G subsystems. We illustrate the use of the authors' basic program in viewing a range of possible solutions for several different examples. We then discuss the implications of various solutions in these examples.
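
    A minimal sketch of the underlying calculation is shown below: the reliability of a k-out-of-n:G subsystem with independent, identical units, and the resulting expected cost (acquisition cost plus a failure penalty weighted by the probability of subsystem failure). The candidate subsystems, costs and penalty are illustrative, not the authors' program.

    ```python
    # Sketch of the expected-cost comparison described above: for a k-out-of-n:G
    # subsystem (works if at least k of n identical units work), compare
    # expected cost = acquisition cost + failure penalty * P(subsystem fails).
    from math import comb

    def k_out_of_n_reliability(k, n, p):
        """P(at least k of n independent units work), each with reliability p."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    def expected_cost(k, n, p, unit_cost, failure_penalty):
        r = k_out_of_n_reliability(k, n, p)
        return n * unit_cost + failure_penalty * (1 - r)

    candidates = {
        "cheap units, more redundancy (2-of-4, p=0.90)":     (2, 4, 0.90, 10.0),
        "expensive units, less redundancy (2-of-3, p=0.99)": (2, 3, 0.99, 25.0),
    }
    penalty = 1000.0   # cost incurred if the subsystem fails

    for name, (k, n, p, unit_cost) in candidates.items():
        print(f"{name}: expected cost = {expected_cost(k, n, p, unit_cost, penalty):.1f}")
    ```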

  16. On the singular perturbations for fractional differential equation.

    PubMed

    Atangana, Abdon

    2014-01-01

    The goal of this paper is to examine the possible extension of the singular perturbation differential equation to the concept of the fractional-order derivative. To achieve this, we present a review of the concept of fractional calculus. We make use of the Laplace transform operator to derive exact solutions of singular perturbation fractional linear differential equations. We then use three analytical methods to present exact and approximate solutions of the singular perturbation fractional, nonlinear, nonhomogeneous differential equation. These methods include the regular perturbation method, a new development of the variational iteration method, and the homotopy decomposition method.
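
    For reference, the sketch below records the standard Caputo definition and Laplace-transform property typically used in this kind of derivation (notation assumed, with n - 1 < alpha <= n):

    ```latex
    % Standard ingredients behind the Laplace-transform solution step: the Caputo
    % fractional derivative of order \alpha and its Laplace transform.
    \[
      {}^{C}D^{\alpha}f(t)
      = \frac{1}{\Gamma(n-\alpha)} \int_{0}^{t} (t-\tau)^{\,n-\alpha-1} f^{(n)}(\tau)\, d\tau ,
    \]
    \[
      \mathcal{L}\!\left\{ {}^{C}D^{\alpha}f \right\}(s)
      = s^{\alpha} F(s) - \sum_{k=0}^{n-1} s^{\alpha-k-1} f^{(k)}(0) ,
      \qquad F(s) = \mathcal{L}\{f\}(s),
    \]
    % so a linear fractional equation in t becomes an algebraic equation in s,
    % which is then inverted to obtain the exact solution.
    ```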

  17. The stability issues in problems of mathematical modeling

    NASA Astrophysics Data System (ADS)

    Mokin, A. Yu.; Savenkova, N. P.; Udovichenko, N. S.

    2018-03-01

    The paper briefly considers various aspects of stability concepts used in physics, mathematics and numerical solution methods. The interrelation between these concepts is described, and the questions of preliminary stability analysis before the numerical solution of a problem and of the correctness of the mathematical statement of the physical problem are discussed. Examples of concrete mathematical statements of individual physical problems are given: a nonlocal problem for the heat equation, the Korteweg-de Vries equation with boundary conditions at infinity, the sine-Gordon equation, and the problem of propagation of femtosecond light pulses in a medium with a cubic nonlinearity.

  18. Data-Intensive Science meets Inquiry-Driven Pedagogy: Interactive Big Data Exploration, Threshold Concepts, and Liminality

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Word, Andrea; Nair, Udasysankar

    2014-01-01

    Threshold concepts in any discipline are the core concepts an individual must understand in order to master a discipline. By their very nature, these concepts are troublesome, irreversible, integrative, bounded, discursive, and reconstitutive. Although grasping threshold concepts can be extremely challenging for each learner as s/he moves through stages of cognitive development relative to a given discipline, the learner's grasp of these concepts determines the extent to which s/he is prepared to work competently and creatively within the field itself. The movement of individuals from a state of ignorance of these core concepts to one of mastery occurs not along a linear path but in iterative cycles of knowledge creation and adjustment in liminal spaces - conceptual spaces through which learners move from the vaguest awareness of concepts to mastery, accompanied by understanding of their relevance, connectivity, and usefulness relative to questions and constructs in a given discipline. For example, challenges in the teaching and learning of atmospheric science can be traced to threshold concepts in fluid dynamics. In particular, Dynamic Meteorology is one of the most challenging courses for graduate students and undergraduates majoring in Atmospheric Science. Dynamic Meteorology introduces threshold concepts - those that prove troublesome for the majority of students but that are essential, associated with fundamental relationships between forces and motion in the atmosphere and requiring the application of basic classical statics, dynamics, and thermodynamic principles to the three-dimensionally varying atmospheric structure. With the explosive growth of data available in atmospheric science, driven largely by satellite Earth observations and high-resolution numerical simulations, paradigms such as that of data-intensive science have emerged. These paradigm shifts are based on the growing realization that current infrastructure, tools and processes will not allow us to analyze and fully utilize the complex and voluminous data that is being gathered. In this emerging paradigm, the scientific discovery process is driven by knowledge extracted from large volumes of data. In this presentation, we contend that this paradigm naturally lends itself to inquiry-driven pedagogy where knowledge is discovered through inductive engagement with large volumes of data rather than reached through traditional, deductive, hypothesis-driven analyses. In particular, data-intensive techniques married with an inductive methodology allow for exploration on a scale that is not possible in the traditional classroom with its typical problem sets and static, limited data samples. In addition, we identify existing gaps and possible solutions for addressing the infrastructure and tools as well as a pedagogical framework through which to implement this inductive approach.

  19. Automatic analysis of online image data for law enforcement agencies by concept detection and instance search

    NASA Astrophysics Data System (ADS)

    de Boer, Maaike H. T.; Bouma, Henri; Kruithof, Maarten C.; ter Haar, Frank B.; Fischer, Noëlle M.; Hagendoorn, Laurens K.; Joosten, Bart; Raaijmakers, Stephan

    2017-10-01

    The information available on-line and off-line, from open as well as from private sources, is growing at an exponential rate and places an increasing demand on the limited resources of Law Enforcement Agencies (LEAs). The absence of appropriate tools and techniques to collect, process, and analyze the volumes of complex and heterogeneous data has created a severe information overload. If a solution is not found, the impact on law enforcement will be dramatic, e.g. because important evidence is missed or the investigation time is too long. Furthermore, there is an uneven level of capabilities to deal with the large volumes of complex and heterogeneous data that come from multiple open and private sources at national level across the EU, which hinders cooperation and information sharing. Consequently, there is a pertinent need to develop tools, systems and processes which expedite online investigations. In this paper, we describe a suite of analysis tools to identify and localize generic concepts, instances of objects and logos in images, which constitutes a significant portion of everyday law enforcement data. We describe how incremental learning based on only a few examples and large-scale indexing are addressed in both concept detection and instance search. Our search technology allows querying of the database by visual examples and by keywords. Our tools are packaged in a Docker container to guarantee easy deployment on a system and our tools exploit possibilities provided by open source toolboxes, contributing to the technical autonomy of LEAs.
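
    Query-by-example retrieval of the kind described can be reduced, at its core, to nearest-neighbor search over feature embeddings. The sketch below uses random vectors as stand-ins for detector outputs and plain cosine similarity; none of the specific toolboxes or indexing structures from the paper are implied.

    ```python
    # Generic sketch of "query by visual example": rank an indexed image database
    # by cosine similarity between feature embeddings. The embeddings here are
    # random placeholders standing in for the output of a trained concept or
    # instance detector.
    import numpy as np

    rng = np.random.default_rng(0)
    n_images, dim = 1000, 128
    index = rng.normal(size=(n_images, dim))                   # database embeddings
    index /= np.linalg.norm(index, axis=1, keepdims=True)      # L2-normalise

    query = rng.normal(size=dim)                               # embedding of query image
    query /= np.linalg.norm(query)

    scores = index @ query                                     # cosine similarity
    top_k = np.argsort(-scores)[:5]
    for rank, i in enumerate(top_k, 1):
        print(f"rank {rank}: image {i}  score {scores[i]:+.3f}")
    ```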

  20. Concepts of Life in the Contexts of Mars

    NASA Technical Reports Server (NTRS)

    Des Marais, D. J.

    2014-01-01

    The search for habitable environments and life requires a working concept of life's fundamental attributes. This concept helps to identify the "services" that an environment must provide to sustain life. We must consider the possibility that extraterrestrial life might differ fundamentally from our own, but it is still worthwhile to begin by hypothesizing attributes of life that might be universal versus ones that reflect local solutions to survival on Earth.

  1. On the reliability of computed chaotic solutions of non-linear differential equations

    NASA Astrophysics Data System (ADS)

    Liao, Shijun

    2009-08-01

    A new concept, namely the critical predictable time Tc, is introduced to give a more precise description of computed chaotic solutions of non-linear differential equations: it is suggested that computed chaotic solutions are unreliable and questionable when t > Tc. This provides us with a strategy for detecting the reliable portion of a given computed result. In this way, the computational phenomena, such as computational chaos (CC), computational periodicity (CP) and computational prediction uncertainty, which are mainly based on long-term properties of computed time-series, can be completely avoided. Using this concept, the famous conclusion `accurate long-term prediction of chaos is impossible' should be replaced by a more precise conclusion that `accurate prediction of chaos beyond the critical predictable time Tc is impossible'. So, this concept also provides us with a timescale to determine whether or not a particular time is long enough for a given non-linear dynamic system. Besides, the influence of data inaccuracy and various numerical schemes on the critical predictable time is investigated in detail by using symbolic computation software as a tool. A reliable chaotic solution of the Lorenz equation in a rather large interval 0 <= t < 1200 non-dimensional Lorenz time units is obtained for the first time. It is found that the precision of the initial condition and the computed data at each time step, which is mathematically necessary to get such a reliable chaotic solution over such a long time, is so high that it is physically impossible due to the Heisenberg uncertainty principle in quantum physics. This, however, provides us with a so-called `precision paradox of chaos', which suggests that the prediction uncertainty of chaos is physically unavoidable, and that even macroscopic phenomena might be essentially stochastic and thus could be described by probability more economically.
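
    A crude double-precision illustration of the critical predictable time is to integrate the Lorenz system from two initial conditions differing by a tiny perturbation and record when the trajectories separate beyond a tolerance; the sketch below does exactly that. It does not reproduce the paper's high-precision, long-interval computation.

    ```python
    # Numerical illustration of the "critical predictable time" idea: integrate the
    # Lorenz system twice from initial conditions differing by 1e-10 and report the
    # first time the trajectories separate by more than a tolerance.
    import numpy as np
    from scipy.integrate import solve_ivp

    def lorenz(t, y, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, yy, z = y
        return [sigma * (yy - x), x * (rho - z) - yy, x * yy - beta * z]

    t_eval = np.linspace(0, 60, 6001)
    y0 = np.array([1.0, 1.0, 1.0])
    sol_a = solve_ivp(lorenz, (0, 60), y0, t_eval=t_eval, rtol=1e-10, atol=1e-12)
    sol_b = solve_ivp(lorenz, (0, 60), y0 + np.array([1e-10, 0, 0]),
                      t_eval=t_eval, rtol=1e-10, atol=1e-12)

    separation = np.linalg.norm(sol_a.y - sol_b.y, axis=0)
    diverged = np.nonzero(separation > 1.0)[0]
    tc = t_eval[diverged[0]] if diverged.size else np.inf
    print(f"empirical critical predictable time (tolerance 1): t ~ {tc:.1f}")
    ```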

  2. A Monte Carlo method for the simulation of coagulation and nucleation based on weighted particles and the concepts of stochastic resolution and merging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kotalczyk, G., E-mail: Gregor.Kotalczyk@uni-due.de; Kruis, F.E.

    Monte Carlo simulations based on weighted simulation particles can solve a variety of population balance problems and thus allow the formulation of a solution framework for many chemical engineering processes. This study presents a novel concept for the calculation of coagulation rates of weighted Monte Carlo particles by introducing a family of transformations to non-weighted Monte Carlo particles. The tuning of the accuracy (named ‘stochastic resolution’ in this paper) of those transformations allows the construction of a constant-number coagulation scheme. Furthermore, a parallel algorithm for the inclusion of newly formed Monte Carlo particles due to nucleation is presented in the scope of a constant-number scheme: the low-weight merging. This technique is found to create significantly less statistical simulation noise than the conventional technique (named ‘random removal’ in this paper). Both concepts are combined into a single GPU-based simulation method which is validated by comparison with the discrete-sectional simulation technique. Two test models describing a constant-rate nucleation coupled to a simultaneous coagulation in 1) the free-molecular regime or 2) the continuum regime are simulated for this purpose.
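
    For orientation, the sketch below implements a plain, unweighted direct-simulation Monte Carlo coagulation step with a constant kernel and compares the result to the mean-field Smoluchowski solution. It is a baseline illustration only, not the weighted-particle, constant-number GPU scheme proposed in the paper.

    ```python
    # Minimal, unweighted direct-simulation Monte Carlo for coagulation with a
    # constant kernel K0: at each event two particles are picked at random and
    # merged, and time advances by a stochastic waiting time drawn from the
    # current total coagulation rate.
    import numpy as np

    rng = np.random.default_rng(0)
    K0, V = 1.0e-3, 1.0          # constant kernel and simulation volume (arbitrary units)
    volumes = np.ones(2000)      # monodisperse initial particles
    t = 0.0

    while volumes.size > 200:
        n = volumes.size
        rate = K0 * n * (n - 1) / (2.0 * V)        # total coagulation rate
        t += rng.exponential(1.0 / rate)           # waiting time to next event
        i, j = rng.choice(n, size=2, replace=False)
        volumes[i] += volumes[j]                   # merge j into i
        volumes = np.delete(volumes, j)

    n0 = 2000 / V
    analytic = n0 / (1.0 + 0.5 * K0 * n0 * t)      # mean-field number concentration
    print(f"t = {t:.2f}: MC number conc. = {volumes.size / V:.1f}, "
          f"analytic = {analytic:.1f}")
    ```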

  3. Naval electronic warfare simulation for effectiveness assessment and softkill programmability facility

    NASA Astrophysics Data System (ADS)

    Lançon, F.

    2011-06-01

    The anti-ship missile (ASM) threats to be faced by ships will become more diverse and difficult. Intelligence, rules-of-engagement constraints and the fast reaction time required for an effective softkill solution call for specific tools to design Electronic Warfare (EW) systems and to integrate them onboard ship. SAGEM provides a decoy launcher system [1] and its associated Naval Electronic Warfare Simulation tool (NEWS) to permit softkill effectiveness analysis for anti-ship missile defence. The NEWS tool generates a virtual environment for missile-ship engagement and simulates countermeasures over a wide spectrum: RF, IR, EO. It integrates the EW Command & Control (EWC2) process implemented in the decoy launcher system and performs Monte Carlo batch processing to evaluate softkill effectiveness in different engagement situations. NEWS is designed to allow immediate EWC2 process integration from simulation to the real decoy launcher system. By design, it allows the final operator to program, test and integrate their own EWC2 module and EW library onboard, so the intelligence of each user is protected and the evolution of threats can be taken into account through EW library updates. The objectives of the NEWS tool also include defining a methodology for trial definition and trial data reduction. Growth potential would permit the design of new concepts for EWC2 programmability and real-time effectiveness estimation in EW systems. The tool can also be used for operator training purposes. This paper presents the architecture design, the softkill programmability facility concept and the flexibility for onboard integration on ships. The concept of this operationally focused simulation, which is to use a single tool for design, development, trial validation and operational use, will be demonstrated.

  4. Complexity in Soil Systems: What Does It Mean and How Should We Proceed?

    NASA Astrophysics Data System (ADS)

    Faybishenko, B.; Molz, F. J.; Brodie, E.; Hubbard, S. S.

    2015-12-01

    The complex soil systems approach is needed fundamentally for the development of integrated, interdisciplinary methods to measure and quantify the physical, chemical and biological processes taking place in soil, and to determine the role of fine-scale heterogeneities. This presentation is aimed at a review of the concepts and observations concerning complexity and complex systems theory, including terminology, emergent complexity and simplicity, self-organization and a general approach to the study of complex systems using the Weaver (1948) concept of "organized complexity." These concepts are used to provide understanding of complex soil systems, and to develop experimental and mathematical approaches to soil microbiological processes. The results of numerical simulations, observations and experiments are presented that indicate the presence of deterministic chaotic dynamics in soil microbial systems. So what are the implications for the scientists who wish to develop mathematical models in the area of organized complexity or to perform experiments to help clarify an aspect of an organized complex system? The modelers have to deal with coupled systems having at least three dependent variables, and they have to forgo making linear approximations to nonlinear phenomena. The analogous rule for experimentalists is that they need to perform experiments that involve measurement of at least three interacting entities (variables depending on time, space, and each other). These entities could be microbes in soil penetrated by roots. If a process being studied in a soil affects the soil properties, like biofilm formation, then this effect has to be measured and included. The mathematical implications of this viewpoint are examined, and results of numerical solutions to a system of equations demonstrating deterministic chaotic behavior are also discussed using time series and the 3D strange attractors.

  5. Technology Transfer Challenges: A Case Study of User-Centered Design in NASA's Systems Engineering Culture

    NASA Technical Reports Server (NTRS)

    Quick, Jason

    2009-01-01

    The Upper Stage (US) section of the National Aeronautics and Space Administration's (NASA) Ares I rocket will require internal access platforms for maintenance tasks performed by humans inside the vehicle. Tasks will occur during expensive critical path operations at Kennedy Space Center (KSC), including vehicle stacking and launch preparation activities. Platforms must be translated through a small human access hatch, installed in an enclosed worksite environment, support the weight of ground operators and be removed before flight - and their design must minimize additional vehicle mass at attachment points. This paper describes the application of a user-centered conceptual design process and the unique challenges encountered within NASA's systems engineering culture focused on requirements and "heritage hardware". The NASA design team at Marshall Space Flight Center (MSFC) initiated the user-centered design process by studying heritage internal access kits and proposing new design concepts during brainstorming sessions. Simultaneously, they partnered with the Technology Transfer/Innovative Partnerships Program to research inflatable structures and dynamic scaffolding solutions that could enable ground operator access. While this creative, technology-oriented exploration was encouraged by upper management, some design stakeholders consistently opposed ideas utilizing novel, untested equipment. Subsequent collaboration with an engineering consulting firm improved the technical credibility of several options; however, there was continued resistance from team members focused on meeting system requirements with pre-certified hardware. After a six-month idea-generating phase, an intensive six-week effort produced viable design concepts that justified additional vehicle mass while optimizing the human factors of platform installation and use. Although these selected final concepts closely resemble heritage internal access platforms, challenges from the application of the user-centered process provided valuable lessons for improving future collaborative conceptual design efforts.

  6. A framework for multi-stakeholder decision-making and ...

    EPA Pesticide Factsheets

    We propose a decision-making framework to compute compromise solutions that balance the conflicting priorities of multiple stakeholders on multiple objectives. In our setting, we shape the stakeholder dissatisfaction distribution by solving a conditional-value-at-risk (CVaR) minimization problem. The CVaR problem is parameterized by a probability level that shapes the tail of the dissatisfaction distribution. The proposed approach allows us to compute a family of compromise solutions and generalizes multi-stakeholder settings previously proposed in the literature that minimize average and worst-case dissatisfactions. We use the concept of the CVaR norm to give a geometric interpretation to this problem and use the properties of this norm to prove that the CVaR minimization problem yields Pareto optimal solutions for any choice of the probability level. We discuss a broad range of potential applications of the framework that involve complex decision-making processes. We demonstrate the developments using a biowaste facility location case study in which we seek to balance stakeholder priorities on transportation, safety, water quality, and capital costs. This manuscript describes the methodology of a new decision-making framework that computes compromise solutions balancing the conflicting priorities of multiple stakeholders on multiple objectives, as needed for the SHC Decision Science and Support Tools project. A biowaste facility location problem is employed as the case study.
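
    A minimal sketch of the CVaR minimization described above, using the standard Rockafellar-Uryasev linear-programming reformulation. Stakeholder dissatisfactions are assumed linear in a decision vector x constrained to a simplex (e.g., weights over candidate options); the dissatisfaction coefficients and probability levels below are illustrative assumptions, not the paper's case-study data.

        import numpy as np
        from scipy.optimize import linprog

        def min_cvar(D, alpha):
            """Minimize CVaR_alpha of stakeholder dissatisfactions d_s = D[s] @ x over
            x in the probability simplex.  Decision variables: [x (n), t (1), z (S)]."""
            S, n = D.shape
            c = np.concatenate([np.zeros(n), [1.0], np.full(S, 1.0 / ((1.0 - alpha) * S))])
            A_ub = np.hstack([D, -np.ones((S, 1)), -np.eye(S)])   # D[s] @ x - t - z_s <= 0
            A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(S)]).reshape(1, -1)
            bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * S
            res = linprog(c, A_ub=A_ub, b_ub=np.zeros(S), A_eq=A_eq, b_eq=[1.0], bounds=bounds)
            return res.x[:n], res.fun

        # Illustrative dissatisfactions: 4 stakeholders (rows) x 3 candidate decisions.
        D = np.array([[0.9, 0.2, 0.5],
                      [0.1, 0.8, 0.4],
                      [0.6, 0.3, 0.7],
                      [0.2, 0.9, 0.3]])
        for alpha in (0.25, 0.5, 0.75):   # small alpha ~ average-like, alpha -> 1 ~ worst case
            x_opt, cvar = min_cvar(D, alpha)
            print(alpha, np.round(x_opt, 3), round(cvar, 3))

    Sweeping the probability level in this way traces out a family of compromise solutions between the average- and worst-case-dissatisfaction formulations mentioned in the abstract.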

  7. A Bayesian Hierarchical Model for Glacial Dynamics Based on the Shallow Ice Approximation and its Evaluation Using Analytical Solutions

    NASA Astrophysics Data System (ADS)

    Gopalan, Giri; Hrafnkelsson, Birgir; Aðalgeirsdóttir, Guðfinna; Jarosch, Alexander H.; Pálsson, Finnur

    2018-03-01

    Bayesian hierarchical modeling can assist the study of glacial dynamics and ice flow properties. This approach allows glaciologists to make fully probabilistic predictions of glacier thickness at unobserved spatio-temporal coordinates, and it also allows the derivation of posterior probability distributions for key physical parameters such as ice viscosity and basal sliding. The goal of this paper is to develop a proof of concept for a Bayesian hierarchical model constructed using the exact analytical solutions for the shallow ice approximation (SIA) introduced by Bueler et al. (2005). A suite of test simulations utilizing these exact solutions suggests that the approach is able to adequately model numerical errors and produce useful physical parameter posterior distributions and predictions. A byproduct of the development of the Bayesian hierarchical model is the derivation of a novel finite difference method for solving the SIA partial differential equation (PDE). An additional novelty of this work is the use of a statistical model to correct errors induced by the numerical solution. This error-correcting process models numerical errors that accumulate forward in time as well as the spatial variation of numerical errors between the dome, interior, and margin of a glacier.
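
    A minimal sketch of an explicit finite-difference update for the isothermal SIA on a flat bed in one dimension; this is not the paper's scheme or the Bueler et al. benchmark setup, and the grid, time step, ice-softness value and initial mound are illustrative assumptions.

        import numpy as np

        def sia_step(H, M, dx, dt, A=3.2e-24, rho=910.0, g=9.81, n=3):
            """One explicit step of dH/dt = M + d/dx(Gamma H^(n+2) |dH/dx|^(n-1) dH/dx)
            on a flat bed (SI units; A is an assumed ice softness in Pa^-3 s^-1)."""
            Gamma = 2.0 * A * (rho * g) ** n / (n + 2)
            H_mid = 0.5 * (H[1:] + H[:-1])                  # thickness at staggered points
            dHdx = (H[1:] - H[:-1]) / dx
            D = Gamma * H_mid ** (n + 2) * np.abs(dHdx) ** (n - 1)   # nonlinear diffusivity
            q = -D * dHdx                                   # ice flux at staggered points
            H_new = H.copy()
            H_new[1:-1] += dt * (M[1:-1] - (q[1:] - q[:-1]) / dx)
            return np.maximum(H_new, 0.0)                   # thickness cannot go negative

        # Illustrative run: a 1000 m mound on a 50 km domain with zero mass balance.
        x = np.linspace(0.0, 50e3, 101)
        H = np.maximum(0.0, 1000.0 * (1.0 - ((x - 25e3) / 20e3) ** 2))
        M = np.zeros_like(x)
        dt = 8640.0                   # 0.1 day; an explicit scheme needs dt << dx^2 / max(D)
        for _ in range(3650):         # one model year
            H = sia_step(H, M, dx=x[1] - x[0], dt=dt)
        print(f"max thickness after one model year: {H.max():.1f} m")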

  8. Replica exchange with solute tempering: A method for sampling biological systems in explicit water

    NASA Astrophysics Data System (ADS)

    Liu, Pu; Kim, Byungchan; Friesner, Richard A.; Berne, B. J.

    2005-09-01

    An innovative replica exchange (parallel tempering) method called replica exchange with solute tempering (REST) for the efficient sampling of aqueous protein solutions is presented here. The method bypasses the poor scaling with system size of standard replica exchange and thus reduces the number of replicas (parallel processes) that must be used. This reduction is accomplished by deforming the Hamiltonian function for each replica in such a way that the acceptance probability for the exchange of replica configurations does not depend on the number of explicit water molecules in the system. For proof of concept, REST is compared with standard replica exchange for an alanine dipeptide molecule in water. The comparisons confirm that REST greatly reduces the number of CPUs required by regular replica exchange and increases the sampling efficiency. This method reduces the CPU time required for calculating thermodynamic averages and for the ab initio folding of proteins in explicit water.
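
    The generic replica-exchange move that REST modifies can be sketched in a few lines: two replicas at inverse temperatures beta_i and beta_j propose to swap configurations and accept with probability min(1, exp[(beta_i - beta_j)(E_i - E_j)]). The toy one-dimensional potential and sampler below are illustrative assumptions; REST's contribution, not shown here, is to deform each replica's Hamiltonian so that the energies entering this test involve only solute terms and the acceptance rate no longer degrades with the number of water molecules.

        import math, random

        def potential(x):
            # Toy double-well energy surface standing in for a solvated protein.
            return (x * x - 1.0) ** 2

        def mc_sweep(x, beta, rng, step=0.5, n=100):
            # Plain Metropolis sampling within one replica.
            for _ in range(n):
                x_new = x + rng.uniform(-step, step)
                if rng.random() < math.exp(-beta * (potential(x_new) - potential(x))):
                    x = x_new
            return x

        def swap_accepted(beta_i, beta_j, e_i, e_j, rng):
            # Exchange criterion: min(1, exp[(beta_i - beta_j) * (E_i - E_j)]).
            return rng.random() < min(1.0, math.exp((beta_i - beta_j) * (e_i - e_j)))

        rng = random.Random(0)
        betas = [2.0, 1.0, 0.5]                # three replicas, coldest first
        xs = [1.0, 1.0, 1.0]
        for sweep in range(200):
            xs = [mc_sweep(x, b, rng) for x, b in zip(xs, betas)]
            i = rng.randrange(len(betas) - 1)  # attempt a swap between neighbours i, i+1
            if swap_accepted(betas[i], betas[i + 1], potential(xs[i]), potential(xs[i + 1]), rng):
                xs[i], xs[i + 1] = xs[i + 1], xs[i]
        print("final replica configurations:", [round(x, 2) for x in xs])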

  9. Programmed Evolution for Optimization of Orthogonal Metabolic Output in Bacteria

    PubMed Central

    Eckdahl, Todd T.; Campbell, A. Malcolm; Heyer, Laurie J.; Poet, Jeffrey L.; Blauch, David N.; Snyder, Nicole L.; Atchley, Dustin T.; Baker, Erich J.; Brown, Micah; Brunner, Elizabeth C.; Callen, Sean A.; Campbell, Jesse S.; Carr, Caleb J.; Carr, David R.; Chadinha, Spencer A.; Chester, Grace I.; Chester, Josh; Clarkson, Ben R.; Cochran, Kelly E.; Doherty, Shannon E.; Doyle, Catherine; Dwyer, Sarah; Edlin, Linnea M.; Evans, Rebecca A.; Fluharty, Taylor; Frederick, Janna; Galeota-Sprung, Jonah; Gammon, Betsy L.; Grieshaber, Brandon; Gronniger, Jessica; Gutteridge, Katelyn; Henningsen, Joel; Isom, Bradley; Itell, Hannah L.; Keffeler, Erica C.; Lantz, Andrew J.; Lim, Jonathan N.; McGuire, Erin P.; Moore, Alexander K.; Morton, Jerrad; Nakano, Meredith; Pearson, Sara A.; Perkins, Virginia; Parrish, Phoebe; Pierson, Claire E.; Polpityaarachchige, Sachith; Quaney, Michael J.; Slattery, Abagael; Smith, Kathryn E.; Spell, Jackson; Spencer, Morgan; Taye, Telavive; Trueblood, Kamay; Vrana, Caroline J.; Whitesides, E. Tucker

    2015-01-01

    Current use of microbes for metabolic engineering suffers from loss of metabolic output due to natural selection. Rather than combat the evolution of bacterial populations, we chose to embrace what makes biological engineering unique among engineering fields – evolving materials. We harnessed bacteria to compute solutions to the biological problem of metabolic pathway optimization. Our approach is called Programmed Evolution to capture two concepts. First, a population of cells is programmed with DNA code to enable it to compute solutions to a chosen optimization problem. As analog computers, bacteria process known and unknown inputs and direct the output of their biochemical hardware. Second, the system employs the evolution of bacteria toward an optimal metabolic solution by imposing fitness defined by metabolic output. The current study is a proof-of-concept for Programmed Evolution applied to the optimization of a metabolic pathway for the conversion of caffeine to theophylline in E. coli. Introduced genotype variations included strength of the promoter and ribosome binding site, plasmid copy number, and chaperone proteins. We constructed 24 strains using all combinations of the genetic variables. We used a theophylline riboswitch and a tetracycline resistance gene to link theophylline production to fitness. After subjecting the mixed population to selection, we measured a change in the distribution of genotypes in the population and an increased conversion of caffeine to theophylline among the most fit strains, demonstrating Programmed Evolution. Programmed Evolution inverts the standard paradigm in metabolic engineering by harnessing evolution instead of fighting it. Our modular system enables researchers to program bacteria and use evolution to determine the combination of genetic control elements that optimizes catabolic or anabolic output and to maintain it in a population of cells. Programmed Evolution could be used for applications in energy, pharmaceuticals, chemical commodities, biomining, and bioremediation. PMID:25714374
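
    The combinatorial construction and the selection step can be illustrated with a short sketch. The factor levels below (a 2 x 2 x 3 x 2 breakdown giving 24 genotypes) and the fitness values are illustrative assumptions, since the abstract does not list the actual parts; the loop applies fitness-proportional selection to shift the genotype distribution toward higher metabolic output, mimicking growth under riboswitch-linked antibiotic selection.

        import itertools, random

        # Illustrative factor levels; the real study's genetic parts differ.
        promoters = ["P_weak", "P_strong"]
        rbs_variants = ["RBS_weak", "RBS_strong"]
        copy_numbers = ["low", "medium", "high"]
        chaperones = ["none", "chaperone"]

        genotypes = list(itertools.product(promoters, rbs_variants, copy_numbers, chaperones))
        assert len(genotypes) == 24

        rng = random.Random(0)
        # Each genotype gets an (in practice unknown) metabolic output used as fitness.
        fitness = {g: rng.uniform(0.1, 1.0) for g in genotypes}

        # Start from an equal mixture and apply fitness-proportional selection.
        population = {g: 1.0 / len(genotypes) for g in genotypes}
        for generation in range(10):
            total = sum(population[g] * fitness[g] for g in genotypes)
            population = {g: population[g] * fitness[g] / total for g in genotypes}

        best = max(population, key=population.get)
        print("dominant genotype after selection:", best, round(population[best], 3))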

  10. Catalytic Palladium Film Deposited by Scalable Low-Temperature Aqueous Combustion.

    PubMed

    Voskanyan, Albert A; Li, Chi-Ying Vanessa; Chan, Kwong-Yu

    2017-09-27

    This article describes a novel method for depositing a dense, high-quality palladium thin film via a one-step aqueous combustion process that can be easily scaled up. Film deposition of Pd from aqueous solutions by conventional chemical or electrochemical methods is inhibited by hydrogen embrittlement, resulting in a brittle palladium film. The method outlined in this work allows direct aqueous-solution deposition of a mirror-bright, durable Pd film on substrates including glass and glassy carbon. This simple procedure has many advantages, including a very high deposition rate (>10 cm² min⁻¹) and a relatively low deposition temperature (250 °C), which make it suitable for large-scale industrial applications. Although preparation of various high-quality oxide films has previously been accomplished via solution combustion synthesis (SCS), this article presents the first report of direct SCS production of a metallic film. The mechanism of Pd film formation is discussed, with the identification of a complex formed between palladium nitrate and glycine at low temperature. The catalytic properties and stability of the films were successfully tested in alcohol electrooxidation and the electrochemical oxygen reduction reaction. The combustion-deposited Pd film on a glassy carbon electrode showed excellent catalytic activity in ethanol oxidation without any binder or additive. We also report, for the first time, the concept of a reusable "catalytic flask", illustrated by the Suzuki-Miyaura cross-coupling reaction: the Pd film uniformly covers the inner walls of the flask and eliminates the catalyst separation step. We believe the innovative concept of a reusable catalytic flask is very promising and has the features required to become a commercial product.

  11. Programmed evolution for optimization of orthogonal metabolic output in bacteria.

    PubMed

    Eckdahl, Todd T; Campbell, A Malcolm; Heyer, Laurie J; Poet, Jeffrey L; Blauch, David N; Snyder, Nicole L; Atchley, Dustin T; Baker, Erich J; Brown, Micah; Brunner, Elizabeth C; Callen, Sean A; Campbell, Jesse S; Carr, Caleb J; Carr, David R; Chadinha, Spencer A; Chester, Grace I; Chester, Josh; Clarkson, Ben R; Cochran, Kelly E; Doherty, Shannon E; Doyle, Catherine; Dwyer, Sarah; Edlin, Linnea M; Evans, Rebecca A; Fluharty, Taylor; Frederick, Janna; Galeota-Sprung, Jonah; Gammon, Betsy L; Grieshaber, Brandon; Gronniger, Jessica; Gutteridge, Katelyn; Henningsen, Joel; Isom, Bradley; Itell, Hannah L; Keffeler, Erica C; Lantz, Andrew J; Lim, Jonathan N; McGuire, Erin P; Moore, Alexander K; Morton, Jerrad; Nakano, Meredith; Pearson, Sara A; Perkins, Virginia; Parrish, Phoebe; Pierson, Claire E; Polpityaarachchige, Sachith; Quaney, Michael J; Slattery, Abagael; Smith, Kathryn E; Spell, Jackson; Spencer, Morgan; Taye, Telavive; Trueblood, Kamay; Vrana, Caroline J; Whitesides, E Tucker

    2015-01-01

    Current use of microbes for metabolic engineering suffers from loss of metabolic output due to natural selection. Rather than combat the evolution of bacterial populations, we chose to embrace what makes biological engineering unique among engineering fields - evolving materials. We harnessed bacteria to compute solutions to the biological problem of metabolic pathway optimization. Our approach is called Programmed Evolution to capture two concepts. First, a population of cells is programmed with DNA code to enable it to compute solutions to a chosen optimization problem. As analog computers, bacteria process known and unknown inputs and direct the output of their biochemical hardware. Second, the system employs the evolution of bacteria toward an optimal metabolic solution by imposing fitness defined by metabolic output. The current study is a proof-of-concept for Programmed Evolution applied to the optimization of a metabolic pathway for the conversion of caffeine to theophylline in E. coli. Introduced genotype variations included strength of the promoter and ribosome binding site, plasmid copy number, and chaperone proteins. We constructed 24 strains using all combinations of the genetic variables. We used a theophylline riboswitch and a tetracycline resistance gene to link theophylline production to fitness. After subjecting the mixed population to selection, we measured a change in the distribution of genotypes in the population and an increased conversion of caffeine to theophylline among the most fit strains, demonstrating Programmed Evolution. Programmed Evolution inverts the standard paradigm in metabolic engineering by harnessing evolution instead of fighting it. Our modular system enables researchers to program bacteria and use evolution to determine the combination of genetic control elements that optimizes catabolic or anabolic output and to maintain it in a population of cells. Programmed Evolution could be used for applications in energy, pharmaceuticals, chemical commodities, biomining, and bioremediation.

  12. Quality risk management of top spray fluidized bed process for antihypertensive drug formulation with control strategy engendered by Box-Behnken experimental design space.

    PubMed

    Mukharya, Amit; Patel, Paresh U; Shenoy, Dinesh; Chaudhary, Shivang

    2013-01-01

    Lacidipine (LCDP) is a poorly soluble and highly biovariable calcium channel blocker used in the treatment of hypertension. To increase its apparent solubility and to reduce its biovariability, solid dispersion fluid bed processing technology was explored, as it produces highly dispersible granules with a characteristic porous structure that enhances dispersibility, wettability, blend uniformity (by dissolving and spraying a solution of actives), flowability and compressibility of granules for tableting, while reducing variability through uniform distribution of the drug-binder solution on carrier particles. The main objective of this quality risk management (QRM) study is to provide a sophisticated "robust and rugged" Fluidized Bed Process (FBP) for the preparation of LCDP tablets with the desired quality (stability) and performance (dissolution) following the quality by design (QbD) concept. This study focuses on a thorough mechanistic understanding of the FBP by which the product is developed and scaled up, with knowledge of the critical risks involved in the manufacturing process analyzed by risk assessment tools such as qualitative Initial Risk-based Matrix Analysis (IRMA) and quantitative Failure Mode Effects Analysis (FMEA) to identify and rank parameters with the potential to impact In-Process/Finished Product Critical Quality Attributes (IP/FP CQAs). These Critical Process Parameters (CPPs) were further refined by DoE and MVDA to develop a design space with Real Time Release Testing (RTRT), leading to the implementation of a control strategy that achieves consistent finished product quality at lab scale and prevents possible product failure at larger manufacturing scale.
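
    For reference, a Box-Behnken design for three coded factors (three critical process parameters, chosen abstractly here since the study's factor list is not reproduced in this abstract) can be generated as below: each pair of factors is run at its four +/-1 combinations while the remaining factor is held at the centre level, plus replicated centre points.

        import itertools
        import numpy as np

        def box_behnken(n_factors, n_center=3):
            """Box-Behnken design in coded units (-1, 0, +1): +/-1 combinations for
            every factor pair with all other factors at 0, plus centre-point runs."""
            runs = []
            for i, j in itertools.combinations(range(n_factors), 2):
                for a, b in itertools.product((-1, 1), repeat=2):
                    row = [0] * n_factors
                    row[i], row[j] = a, b
                    runs.append(row)
            runs.extend([[0] * n_factors for _ in range(n_center)])
            return np.array(runs)

        design = box_behnken(3)
        print(design.shape)   # (15, 3): 12 edge-midpoint runs + 3 centre points
        print(design)

    Fitting a quadratic response-surface model to responses measured at these runs is what defines the experimental design space from which the control strategy is derived.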

  13. Quality risk management of top spray fluidized bed process for antihypertensive drug formulation with control strategy engendered by Box-Behnken experimental design space

    PubMed Central

    Mukharya, Amit; Patel, Paresh U; Shenoy, Dinesh; Chaudhary, Shivang

    2013-01-01

    Introduction: Lacidipine (LCDP) is a poorly soluble and highly biovariable calcium channel blocker used in the treatment of hypertension. To increase its apparent solubility and to reduce its biovariability, solid dispersion fluid bed processing technology was explored, as it produces highly dispersible granules with a characteristic porous structure that enhances dispersibility, wettability, blend uniformity (by dissolving and spraying a solution of actives), flowability and compressibility of granules for tableting, while reducing variability through uniform distribution of the drug-binder solution on carrier particles. Materials and Methods: The main objective of this quality risk management (QRM) study is to provide a sophisticated “robust and rugged” Fluidized Bed Process (FBP) for the preparation of LCDP tablets with the desired quality (stability) and performance (dissolution) following the quality by design (QbD) concept. Results and Conclusion: This study focuses on a thorough mechanistic understanding of the FBP by which the product is developed and scaled up, with knowledge of the critical risks involved in the manufacturing process analyzed by risk assessment tools such as qualitative Initial Risk-based Matrix Analysis (IRMA) and quantitative Failure Mode Effects Analysis (FMEA) to identify and rank parameters with the potential to impact In-Process/Finished Product Critical Quality Attributes (IP/FP CQAs). These Critical Process Parameters (CPPs) were further refined by DoE and MVDA to develop a design space with Real Time Release Testing (RTRT), leading to the implementation of a control strategy that achieves consistent finished product quality at lab scale and prevents possible product failure at larger manufacturing scale. PMID:23799202

  14. Scheduling Onboard Processing for the Proposed HyspIRI Mission

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Mclaren, David; Rabideau, Gregg; Mandl, Daniel; Hengemihle, Jerry

    2011-01-01

    The proposed HyspIRI mission is evaluating an X-band Direct Broadcast (DB) capability that would enable data to be delivered to ground stations virtually as it is acquired. However, the HyspIRI VSWIR and TIR instruments will produce 1 Gbps of data while the DB capability is 15 Mbps, a 60x oversubscription. In order to address this data volume mismatch, a DB concept has been developed that determines which data to downlink based on both (1) the type of surface the spacecraft is overflying and (2) onboard processing of the data to detect events. For example, when the spacecraft is overflying polar regions it might downlink a snow/ice product. Additionally, the onboard software will search for thermal signatures indicative of a volcanic event or wildfire and downlink summary information (extent, spectra) when detected. The process of determining which products to generate, and when, based on request prioritization and on onboard processing and downlink constraints is inherently a prioritized scheduling problem; we describe work to develop an automated solution to this problem.
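
    A minimal sketch of the kind of prioritized selection involved: candidate products, each with a priority and an estimated data volume, are chosen greedily, highest priority first, until the downlink budget for a contact is exhausted. The product list, priorities, volumes and contact budget are illustrative assumptions, not the mission's actual scheduler or data rates.

        from dataclasses import dataclass

        @dataclass
        class Product:
            name: str
            priority: int       # higher = more important (e.g. a detected volcanic event)
            volume_mbit: float  # estimated size of the summary/product to downlink

        def schedule_downlink(candidates, budget_mbit):
            """Greedy prioritized selection under a downlink volume budget."""
            selected, used = [], 0.0
            for p in sorted(candidates, key=lambda p: p.priority, reverse=True):
                if used + p.volume_mbit <= budget_mbit:
                    selected.append(p)
                    used += p.volume_mbit
            return selected, used

        candidates = [
            Product("volcanic thermal summary", 10, 300.0),
            Product("wildfire extent and spectra", 9, 450.0),
            Product("snow/ice product (polar pass)", 5, 1200.0),
            Product("raw VSWIR segment", 2, 4000.0),
        ]
        budget = 15.0 * 120.0   # 15 Mbps downlink over a 120 s station contact
        chosen, used = schedule_downlink(candidates, budget)
        print([p.name for p in chosen], f"{used:.0f} of {budget:.0f} Mbit used")

    The automated solution described in the abstract additionally handles surface-type rules and onboard processing constraints, which a greedy volume-only heuristic such as this does not capture.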

  15. The Mole Concept

    ERIC Educational Resources Information Center

    Duncan, I. M.; Johnstone, A. H.

    1973-01-01

    Reports a study of difficulties encountered by 14.5- to 15.0-year-old children in learning the mole concept through programmed instruction. Concludes that the three main sources of difficulty were the manipulation of the molarity of solutions, the balancing of equations, and the misapprehension that one mole of a compound always reacts with one mole of…

  16. Recycle/Reuse: Utilizing New Technology.

    ERIC Educational Resources Information Center

    Vaglia, John S.

    In the early 1990s, efforts were initiated to help countries move toward a solution of the global pollution problem. Technology education classrooms and laboratories are among the best places for bringing the concepts of recycling/reuse and waste management to students' attention. Important concepts about pollution, waste prevention, and recycling…

  17. Green Schools.

    ERIC Educational Resources Information Center

    Kozlowski, David, Ed.

    1998-01-01

    Discusses the "going green" concept in school-building design, its cost-saving benefits through more efficient energy use, and its use by the State University of New York at Buffalo as a solution to an energy retrofit program. Examples are provided of how this concept can be used, even for small colleges without large capital budgets, and how…

  18. Environmental Media Phase-Tracking Units in the Classroom

    ERIC Educational Resources Information Center

    Langseth, David E.

    2009-01-01

    When teaching phase-partitioning concepts for solutes in porous media, and other multi-phase environmental systems, explicitly tracking the environmental media phase with which a substance of interest (SOI) is associated can enhance the students' understanding of the fundamental concepts and derivations. It is common to explicitly track the…

  19. Algebraic Concepts: What's Really New in New Curricula?

    ERIC Educational Resources Information Center

    Star, Jon R.; Herbel-Eisenmann, Beth A.; Smith, John P., III

    2000-01-01

    Examines 8th grade units from the Connected Mathematics Project (CMP). Identifies differences in older and newer conceptions, fundamental objects of study, typical problems, and typical solution methods in algebra. Also discusses where the issue of what is new in algebra is relevant to many other innovative middle school curricula. (KHR)

  20. Regional approaches in high-rise construction

    NASA Astrophysics Data System (ADS)

    Iconopisceva, O. G.; Proskurin, G. A.

    2018-03-01

    The article focuses on the evolutionary process of high-rise construction. The aim of the study was to create a retrospective matrix reflecting the tasks of the study, such as structuring the most iconic high-rise buildings within their historical boundaries. The study is based on contemporary experience of high-rise construction in different countries. The main directions and regional specifics of high-rise construction, as well as the factors influencing its further evolution, are analyzed. The main changes in architectural style, form-making and structural solutions, which focus on the principles of energy efficiency and the bio-positivity of "sustainable buildings", as well as the search for a new typology, are noted. The most universal structural methods and solutions that have proved particularly popular are generalized. The new typology of high-rises and the individual approach to urban context are also noted. The results of the study, presented as a graphical scheme, make it possible to represent the whole evolution of high-rise construction. The new spatial forms of high-rises give them a new role within urban environments. Futuristic, hyper-scalable concepts take on the functions of autonomous urban space and demonstrate how high-rises can replace the multifunctional urban fabric by developing it inside their shells.
