Functional Analyses of the Problems in Non-English Majors' Writings
ERIC Educational Resources Information Center
Li, Shun-ying
2010-01-01
Problems in generating and organizing ideas, in coherence, and in language competence are common in non-English majors' writings, and they reduce non-English majors' ability to use English as a tool to realize its pragmatic functions and meta-functions. The exam-centered objective, the product-oriented approach, the inefficient mode of instruction, the…
ERIC Educational Resources Information Center
Diamond, Nina; Koernig, Stephen K.; Iqbal, Zafar
2008-01-01
This article describes an innovative strategic tools course designed to enhance the problem-solving skills of marketing majors. The course serves as a means of preparing students to capitalize on opportunities afforded by a case-based capstone course and to better meet the needs and expectations of prospective employers. The course format utilizes…
ERIC Educational Resources Information Center
Klenke, Andrew M.; Dell, Tim W.
2007-01-01
Graduates of the automotive technology program at Pittsburg State University (PSU) generally enter the workforce in some type of automotive management role. As a result, the program does not require students to purchase their own tools, and it does not have room for all 280 majors to roll around a personal tool chest. Each instructor must maintain…
Rosetta Structure Prediction as a Tool for Solving Difficult Molecular Replacement Problems.
DiMaio, Frank
2017-01-01
Molecular replacement (MR), a method for solving the crystallographic phase problem using phases derived from a model of the target structure, has proven extremely valuable, accounting for the vast majority of structures solved by X-ray crystallography. However, when the resolution of data is low, or the starting model is very dissimilar to the target protein, solving structures via molecular replacement may be very challenging. In recent years, protein structure prediction methodology has emerged as a powerful tool in model building and model refinement for difficult molecular replacement problems. This chapter describes some of the tools available in Rosetta for model building and model refinement specifically geared toward difficult molecular replacement cases.
Studying PubMed usages in the field for complex problem solving: Implications for tool design
Song, Jean; Tonks, Jennifer Steiner; Meng, Fan; Xuan, Weijian; Ameziane, Rafiqa
2012-01-01
Many recent studies on MEDLINE-based information seeking have shed light on scientists’ behaviors and associated tool innovations that may improve efficiency and effectiveness. Few if any studies, however, examine scientists’ problem-solving uses of PubMed in actual contexts of work and corresponding needs for better tool support. Addressing this gap, we conducted a field study of novice scientists (14 upper level undergraduate majors in molecular biology) as they engaged in a problem solving activity with PubMed in a laboratory setting. Findings reveal many common stages and patterns of information seeking across users as well as variations, especially variations in cognitive search styles. Based on findings, we suggest tool improvements that both confirm and qualify many results found in other recent studies. Our findings highlight the need to use results from context-rich studies to inform decisions in tool design about when to offer improved features to users. PMID:24376375
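The field study above concerns interactive PubMed use; programmatic PubMed searching goes through NCBI's E-utilities interface. A minimal sketch of composing an esearch request URL (the query term and result limit are illustrative, not values from the study):

```python
from urllib.parse import urlencode

# NCBI E-utilities base endpoint for PubMed searches (esearch).
EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def build_pubmed_query(term: str, retmax: int = 20) -> str:
    """Compose an esearch URL for a PubMed term query.

    Field tags such as [tiab] (title/abstract) follow PubMed's
    standard query syntax.
    """
    params = {"db": "pubmed", "term": term, "retmax": retmax, "retmode": "json"}
    return f"{EUTILS}?{urlencode(params)}"

# Hypothetical query of the kind a molecular biology student might run.
url = build_pubmed_query("ankyrin repeat[tiab] AND protein folding[tiab]")
```

The returned URL can be fetched with any HTTP client; the JSON response contains the matching PMIDs.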
Hanratty, Jennifer; Livingstone, Nuala; Robalino, Shannon; Terwee, Caroline B; Glod, Magdalena; Oono, Inalegwu P; Rodgers, Jacqui; Macdonald, Geraldine; McConachie, Helen
2015-01-01
Behaviour problems are common in young children with autism spectrum disorder (ASD). There are many different tools used to measure behaviour problems, but little is known about their validity for this population. To evaluate the measurement properties of behaviour problems tools used in evaluation of intervention or observational research studies with children with ASD up to the age of six years. Behaviour measurement tools were identified as part of a larger, two-stage systematic review. First, sixteen major electronic databases, as well as grey literature and research registers, were searched, and tools used were listed and categorized. Second, using methodological filters, we searched for articles examining the measurement properties of the tools in use with young children with ASD in ERIC, MEDLINE, EMBASE, CINAHL, and PsycINFO. The quality of these papers was then evaluated using the COSMIN checklist. We identified twelve tools which had been used to measure behaviour problems in young children with ASD, and fifteen studies which investigated the measurement properties of six of these tools. There was no evidence available for the remaining six tools. Two questionnaires were found to be the most robust in their measurement properties: the Child Behavior Checklist and the Home Situations Questionnaire-Pervasive Developmental Disorders version. We found patchy evidence on reliability and validity for only a few of the tools used to measure behaviour problems in young children with ASD. More systematic research is required on the measurement properties of tools for use in this population, in particular to establish responsiveness to change, which is essential in measurement of outcomes of intervention. CRD42012002223.
USMC Ground Surveillance Robot (GSR): Lessons Learned
NASA Astrophysics Data System (ADS)
Harmon, S. Y.
1987-02-01
This paper describes the design of an autonomous vehicle and the lessons learned during the implementation of that complex robot. The major problems encountered for which solutions were found include sensor-processing bandwidth limitations, coordination of the interactions between major subsystems, sensor data fusion, and system knowledge representation. Problems remaining unresolved include system complexity management, the lack of powerful system monitoring and debugging tools, exploratory implementation of a complex system, and safety and testing issues. Many of these problems arose from working with underdeveloped and continuously evolving technology and will probably be resolved as the technological resources mature and stabilize. Unfortunately, other problems will continue to plague developers throughout the evolution of autonomous system technology.
Dare We Build a New Curriculum for a New Age?
ERIC Educational Resources Information Center
Seif, Elliott
Ten major elements in developing a curriculum to prepare students to face future challenges and problems are outlined. One, mastery and understanding of technology, should focus on the use of tools and machines with an emphasis on problems related to technology in our lives. Two, cooperative living skills, can be achieved through classroom…
Supporting Abstraction Processes in Problem Solving through Pattern-Oriented Instruction
ERIC Educational Resources Information Center
Muller, Orna; Haberman, Bruria
2008-01-01
Abstraction is a major concept in computer science and serves as a powerful tool in software development. Pattern-oriented instruction (POI) is a pedagogical approach that incorporates patterns in an introductory computer science course in order to structure the learning of algorithmic problem solving. This paper examines abstraction processes in…
ESA's tools for internal charging
NASA Astrophysics Data System (ADS)
Sorensen, J.; Rodgers, D. J.; Ryden, K. A.; Latham, P. M.; Wrenn, G. L.; Levy, L.; Panabiere, G.
2000-06-01
Electrostatic discharges caused by bulk charging of spacecraft insulating materials are a major cause of satellite anomalies. A quantitative knowledge of the charge build-up is essential in order to eliminate these problems at the design stage. This paper presents ESA's tools for assessing whether a given structure is liable to experience electrostatic discharges. A study was made of the physical phenomenon, and an engineering specification was created for assessing a structure for potential discharge problems. The specification has been implemented in a new software tool, DICTAT. The implementation of tests in dedicated facilities is an important part of the specification, and tests have been performed to validate the new tool.
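The kind of assessment DICTAT automates can be illustrated, very roughly, by the textbook steady-state estimate for a planar dielectric: the internal field approaches the deposited current density times the material resistivity, E = J·ρ. The sketch below is a first-order simplification for illustration only, not the DICTAT algorithm; the numbers and the ~10 MV/m threshold are assumed illustrative values.

```python
# First-order steady-state internal-charging estimate for a planar dielectric.
# Illustrative simplification, not ESA's DICTAT model.

def steady_state_field(j_deposited: float, resistivity: float) -> float:
    """Steady-state electric field (V/m) for a deposited current density
    j_deposited (A/m^2) flowing through a material of the given
    resistivity (ohm*m): E = J * rho."""
    return j_deposited * resistivity

# Hypothetical values: ~2 pA/cm^2 deposited current (2e-8 A/m^2) in a
# highly resistive dielectric (1e15 ohm*m).
E = steady_state_field(2e-8, 1e15)
at_risk = E > 1e7  # compare against an assumed ~10 MV/m discharge threshold
```

In this illustrative case the estimated field exceeds the assumed threshold, flagging the configuration for closer analysis.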
Using Predictive Analytics to Detect Major Problems in Department of Defense Acquisition Programs
2012-03-01
research is focused on three questions. First, can we predict the contractor-provided estimate at complete (EAC)? Second, can we use those predictions to … develop an algorithm to determine if a problem will occur in an acquisition program or sub-program? Lastly, can we provide the probability of a problem … more than doubling the probability of a problem occurrence compared to current tools in the cost community. Though program managers can use this …
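The contractor-provided EAC referenced above comes from earned-value management, where the standard CPI-based estimate is EAC = BAC / CPI with CPI = EV / AC. A minimal sketch with hypothetical figures (not data from the study):

```python
def eac_cpi(bac: float, ev: float, ac: float) -> float:
    """CPI-based estimate at complete.

    bac: budget at completion; ev: earned value; ac: actual cost.
    CPI = EV / AC measures cost efficiency; EAC = BAC / CPI projects
    the total cost if the current efficiency continues.
    """
    cpi = ev / ac
    return bac / cpi

# A hypothetical program with a $100M budget that has earned $40M of
# value at an actual cost of $50M is trending toward ~$125M at complete.
estimate = eac_cpi(bac=100e6, ev=40e6, ac=50e6)
```

A CPI below 1.0, as here (0.8), is the classic early-warning signal that the predictive models in the report aim to sharpen.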
Podometrics as a Potential Clinical Tool for Glomerular Disease Management.
Kikuchi, Masao; Wickman, Larysa; Hodgin, Jeffrey B; Wiggins, Roger C
2015-05-01
Chronic kidney disease culminating in end-stage kidney disease is a major public health problem, costing in excess of $40 billion per year with high morbidity and mortality. Current tools for glomerular disease monitoring lack precision and contribute to poor outcomes. The podocyte depletion hypothesis describes the major mechanisms underlying the progression of glomerular diseases, which are responsible for more than 80% of cases of end-stage kidney disease. The question arises of whether this new knowledge can be used to improve outcomes and reduce costs. Podocytes have unique characteristics that make them an attractive monitoring tool. Methodologies for estimating podocyte number, size, density, glomerular volume and other parameters in routine kidney biopsies, and the rate of podocyte detachment from glomeruli into urine (podometrics), have now been developed and validated. They potentially fill important gaps in the glomerular disease monitoring toolbox. The application of these tools to glomerular disease groups shows good correlation with outcome, although data validating their use for individual decision making are not yet available. Given the urgency of the clinical problem, we argue that the time has come to focus on testing these tools for application to individualized clinical decision making toward more effective progression prevention.
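As an illustration of the kind of quantity podometrics provides, podocyte density is simply the estimated podocyte number per unit of glomerular tuft volume. The sketch below uses hypothetical values and a generic normalization; it is not the validated estimation protocol described in the paper.

```python
def podocyte_density(n_podocytes: float, glom_volume_um3: float) -> float:
    """Podocyte density expressed per 1e6 cubic micrometres of
    glomerular tuft volume (illustrative normalization)."""
    return n_podocytes / glom_volume_um3 * 1e6

# Hypothetical numbers, not normative reference values: ~550 podocytes
# in a tuft of ~5e6 um^3 gives a density of ~110 per 1e6 um^3.
d = podocyte_density(550, 5e6)
```

Tracking such densities over serial biopsies, together with urinary podocyte detachment rates, is the monitoring strategy the authors advocate testing.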
Functional specifications for AI software tools for electric power applications. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faught, W.S.
1985-08-01
The principal barrier to the introduction of artificial intelligence (AI) technology to the electric power industry has not been a lack of interest or of appropriate problems, for the industry abounds in both. Like most others, however, the electric power industry lacks the personnel - knowledge engineers - with the special combination of training and skills that AI programming demands. Conversely, very few AI specialists are conversant with electric power industry problems and applications. The recent availability of sophisticated AI programming environments is doing much to alleviate this shortage. These products provide a set of powerful and usable software tools that enable even non-AI scientists to rapidly develop AI applications. The purpose of this project was to develop functional specifications for programming tools that, when integrated with existing general-purpose knowledge engineering tools, would expedite the production of AI applications for the electric power industry. Twelve potential applications, representative of major problem domains within the nuclear power industry, were analyzed in order to identify those tools that would be of greatest value in application development. Eight tools were specified, including facilities for power plant modeling, database inquiry, simulation, and machine-machine interface.
ERIC Educational Resources Information Center
Howard, Cynthia; Jordan, Pamela; Di Eugenio, Barbara; Katz, Sandra
2017-01-01
Despite a growing need for educational tools that support students at the earliest phases of undergraduate Computer Science (CS) curricula, relatively few such tools exist--the majority being Intelligent Tutoring Systems. Since peer interactions more readily give rise to challenges and negotiations, another way in which students can become more…
NASA Engineering and Technology Advancement Office: A proposal to the administrator
NASA Technical Reports Server (NTRS)
Schulze, Norman R.
1993-01-01
NASA has continually had problems with cost, schedule, performance, reliability, quality, and safety aspects of its programs. Past solutions have not provided the answers needed, and a major change is needed in the way of doing business. A new approach is presented for consideration. These problems are all engineering matters and therefore require engineering solutions. Proper engineering tools are needed to fix engineering problems. Headquarters is responsible for providing the management structure to support programs with appropriate engineering tools. A guide to define those tools and an approach for putting them into place is provided. Recommendations include establishing a new Engineering and Technology Advancement Office and requesting a review of this proposal by the Administrator, since this subject requires a top-level decision. A wide peer review has been conducted by technical staff at Headquarters, the Field Installations, and others in industry, as discussed.
Development of the major trauma case review tool.
Curtis, Kate; Mitchell, Rebecca; McCarthy, Amy; Wilson, Kellie; Van, Connie; Kennedy, Belinda; Tall, Gary; Holland, Andrew; Foster, Kim; Dickinson, Stuart; Stelfox, Henry T
2017-02-28
As many as half of all patients with major traumatic injuries do not receive the recommended care, with variance in preventable mortality reported across the globe. This variance highlights the need for a comprehensive process for monitoring and reviewing patient care, central to which is a consistent peer-review process that includes trauma system safety and human factors. There is no published, evidence-informed, standardised tool that considers these factors for use in adult or paediatric trauma case peer review. The aim of this research was to develop and validate a trauma case review tool to facilitate clinical review of paediatric trauma patient care by extracting information that supports monitoring, informs change and enables loop closure. Development of the trauma case review tool was multi-faceted, beginning with a review of the trauma audit tool literature. Data were extracted from the literature to inform iterative tool development using a consensus approach. Inter-rater agreement was assessed for both the pilot and finalised versions of the tool. The final trauma case review tool contained ten sections, including patient factors (such as pre-existing conditions), the presenting problem, a timeline of events, factors contributing to the care delivery problem (including equipment, work environment, staff action and organisational factors), positive aspects of care and the outcome of the panel discussion. After refinement, the inter-rater reliability of the human factors and outcome components of the tool improved, with an average 86% agreement between raters. This research developed an evidence-informed tool for use in paediatric trauma case review that considers both system safety and human factors to facilitate clinical review of trauma patient care. This tool can be used to identify opportunities for improvement in trauma care and to guide quality assurance activities. Validation is required in the adult population.
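The 86% figure above is a raw percent agreement; a common chance-corrected companion statistic for two raters is Cohen's kappa. A small sketch with hypothetical panel ratings (not the study's data):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters scoring the same items."""
    assert len(r1) == len(r2)
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n  # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    # Expected chance agreement from each rater's marginal distribution.
    pe = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / n ** 2
    return (po - pe) / (1 - pe)

# Hypothetical yes/no judgements of a care-delivery problem on ten cases.
a = ["yes", "yes", "no", "no", "yes", "no", "no", "yes", "yes", "no"]
b = ["yes", "yes", "no", "yes", "yes", "no", "no", "yes", "no", "no"]
kappa = cohens_kappa(a, b)  # ~0.6: moderate agreement beyond chance
```

Percent agreement alone overstates reliability when one category dominates, which is why chance-corrected measures are usually reported alongside it.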
The Pocket Psychiatrist: Tools to enhance psychiatry education in family medicine.
Bass, Deanna; Brandenburg, Dana; Danner, Christine
2015-01-01
Primary care is the setting where the majority of patients seek assistance for their mental health problems. To assist family medicine residents in providing effective care to patients for mental health problems during residency and after graduation, it is essential they receive training in the assessment, diagnosis, and treatment of common mental health conditions. While there is some limited education time with a psychiatrist in our department, residents need tools and resources that provide education during their continuity clinics even when the psychiatrist is not available. Information on two tools that were developed is provided. These tools include teaching residents a brief method for conducting a psychiatric interview as well as a means to access evidence-based information on diagnosis and treatment of mental health conditions through templates available within our electronic medical record.
Overview of the Development for a Suite of Low-Thrust Trajectory Analysis Tools
NASA Technical Reports Server (NTRS)
Kos, Larry D.; Polsgrove, Tara; Hopkins, Randall; Thomas, Dan; Sims, Jon A.
2006-01-01
A NASA intercenter team has developed a suite of low-thrust trajectory analysis tools to make a significant improvement in three major facets of low-thrust trajectory and mission analysis: 1) ease of use, 2) ability to converge more robustly to solutions, and 3) higher-fidelity modeling and accuracy of results. Due mostly to the short duration of the development, the team concluded that a suite of tools was preferable to one integrated tool. This tool suite, the characteristics of its components, and their applicability are described. Trajectory analysts can read this paper and determine which tool is most appropriate for their problem.
NASA Astrophysics Data System (ADS)
DeVore, Seth; Marshman, Emily; Singh, Chandralekha
2017-06-01
As research-based, self-paced electronic learning tools become increasingly available, a critical issue educators encounter is implementing strategies to ensure that all students engage with them as intended. Here, we first discuss the effectiveness of electronic learning tutorials as self-paced learning tools in large enrollment brick and mortar introductory physics courses and then propose a framework for helping students engage effectively with the learning tools. The tutorials were developed via research in physics education and were found to be effective for a diverse group of introductory physics students in one-on-one implementation. Instructors encouraged the use of these tools in a self-paced learning environment by telling students that they would be helpful for solving the assigned homework problems and that the underlying physics principles in the tutorial problems would be similar to those in the in-class quizzes (which we call paired problems). We find that many students in the courses in which these interactive electronic learning tutorials were assigned as a self-study tool performed poorly on the paired problems. In contrast, a majority of student volunteers in one-on-one implementation greatly benefited from the tutorials and performed well on the paired problems. The significantly lower overall performance on paired problems administered as an in-class quiz compared to the performance of student volunteers who used the research-based tutorials in one-on-one implementation suggests that many students enrolled in introductory physics courses did not effectively engage with the tutorials outside of class and may have only used them superficially. The findings suggest that many students in need of out-of-class remediation via self-paced learning tools may have difficulty motivating themselves and may lack the self-regulation and time-management skills to engage effectively with tools specially designed to help them learn at their own pace. 
We conclude by proposing a theoretical framework to help students with diverse prior preparations engage effectively with self-paced learning tools.
NASA Astrophysics Data System (ADS)
El Bouami, Souhail; Habak, Malek; Franz, Gérald; Velasco, Raphaël; Vantomme, Pascal
2016-10-01
Composite materials are increasingly used for structural parts in the aeronautic industry. Carbon fiber-reinforced plastics (CFRP) are often used in combination with metallic materials, mostly aluminium alloys, which raises new problems in aircraft assembly; delamination is one of them. In this study, CFRP/Al-Li stacks were used as the experimental material to investigate the effect of the interaction between cutting parameters (cutting speed and feed rate) and tool geometry on delamination and thrust forces in drilling operations. A plan of experiments based on the Taguchi design method was employed to investigate the influence of tool geometry, in particular the point angle, and of the cutting parameters on delamination and axial thrust. The experimental results demonstrate that feed rate is the major parameter and show the importance of the tool point angle for delamination and thrust forces in these stacks.
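Delamination in drilling studies of this kind is commonly quantified by the delamination factor Fd = Dmax/Dnominal, and Taguchi analyses often summarize each run with a smaller-the-better signal-to-noise ratio. A sketch with hypothetical measurements (the paper's actual data and response definitions may differ):

```python
import math

def delamination_factor(d_max_mm: float, d_nominal_mm: float) -> float:
    """Conventional delamination factor: maximum damage-zone diameter
    divided by the nominal hole diameter. 1.0 means no delamination."""
    return d_max_mm / d_nominal_mm

def sn_smaller_is_better(values):
    """Taguchi signal-to-noise ratio for a smaller-the-better response:
    SN = -10 * log10(mean(y^2))."""
    return -10 * math.log10(sum(v * v for v in values) / len(values))

# Hypothetical replicates for one run of an orthogonal array:
fd = delamination_factor(d_max_mm=6.3, d_nominal_mm=6.0)  # 1.05
sn = sn_smaller_is_better([1.05, 1.08, 1.04])
```

Comparing mean S/N ratios across factor levels is how a Taguchi analysis ranks the influence of feed rate, cutting speed and point angle.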
Numerical methods on some structured matrix algebra problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jessup, E.R.
1996-06-01
This proposal concerned the design, analysis, and implementation of serial and parallel algorithms for certain structured matrix algebra problems. It emphasized large-order problems and so focused on methods that can be implemented efficiently on distributed-memory MIMD multiprocessors. Such machines supply the computing power and extensive memory demanded by large-order problems. We proposed to examine three classes of matrix algebra problems: the symmetric and nonsymmetric eigenvalue problems (especially the tridiagonal cases) and the solution of linear systems with specially structured coefficient matrices. As all of these are of practical interest, a major goal of this work was to translate our research in linear algebra into useful tools for the computational scientists interested in these and related applications. Thus, in addition to software specific to the linear algebra problems, we proposed to produce a programming paradigm and library to aid in the design and implementation of programs for distributed-memory MIMD computers. We now report on our progress on each of the problems and on the programming tools.
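For the symmetric tridiagonal eigenvalue problem highlighted above, a convenient sanity check is the 1-D discrete Laplacian, whose eigenvalues are known in closed form: 2 - 2cos(kπ/(n+1)) for k = 1..n. A serial NumPy sketch (illustrative only, not the proposal's distributed-memory implementation):

```python
import numpy as np

# Build the symmetric tridiagonal 1-D discrete Laplacian: diagonal 2,
# off-diagonals -1, for which the spectrum is known analytically.
n = 8
T = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1))

computed = np.linalg.eigvalsh(T)  # ascending eigenvalues of a symmetric matrix
analytic = 2 - 2 * np.cos(np.arange(1, n + 1) * np.pi / (n + 1))
analytic.sort()
```

Dedicated tridiagonal solvers avoid forming the dense matrix at all, which is what makes the large-order cases tractable; the dense form here is purely for checking.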
2012-01-01
Background The problem list is a key part of the electronic health record (EHR) that allows practitioners to see a patient’s diagnoses and health issues. Yet, as the content of the problem list largely represents the subjective decisions of those who edit it, patients’ problem lists are often unreliable when shared across practitioners. The lack of standards for how the problem list is compiled in the EHR limits its effectiveness in improving patient care, particularly as a resource for clinical decision support and population management tools. The purpose of this study is to discover practitioner opinions towards the problem list and the logic behind their decisions during clinical situations. Materials and methods An observational cross-sectional study was conducted at two major Boston teaching hospitals. Practitioners’ opinions about the problem list were collected through both in-person interviews and an online questionnaire. Questions were framed using vignettes of clinical scenarios asking practitioners about their preferred actions towards the problem list. Results These data confirmed prior research that practitioners differ in their opinions over managing the problem list, but in most responses to a questionnaire, there was a common approach among the relative majority of respondents. Further, basic demographic characteristics of providers (age, medical experience, etc.) did not appear to strongly affect attitudes towards the problem list. Conclusion The results supported the premise that policies and EHR tools are needed to bring about a common approach. Further, the findings helped identify what issues might benefit the most from a defined policy and the level of restriction a problem list policy should place on the addition of different types of information. PMID:23140312
Process Guide for Deburring Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frey, David L.
This report is an updated and consolidated view of the current deburring processes at the Kansas City Plant (KCP). It includes specific examples of current burr problems and the methods used for their detection. Also included is a pictorial review of the large variety of available deburr tools, along with a complete numerical listing of existing tools and their descriptions. The process for deburring all the major part feature categories is discussed.
PlantCV v2: Image analysis software for high-throughput plant phenotyping
Gehan, Malia A.; Fahlgren, Noah; Abbasi, Arash; Berry, Jeffrey C.; Callen, Steven T.; Chavez, Leonardo; Doust, Andrew N.; Feldman, Max J.; Gilbert, Kerrigan B.; Hodge, John G.; Hoyer, J. Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony
2017-01-01
Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning. PMID:29209576
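The core of threshold-based plant segmentation that toolkits like PlantCV build on can be sketched in a few lines. This is a generic illustration in plain NumPy, not the PlantCV API, and the image values are synthetic:

```python
import numpy as np

def segment_and_measure(green_index: np.ndarray, threshold: float):
    """Toy plant segmentation: threshold a 'greenness' channel into a
    binary plant mask, then report projected area in pixels as a
    simple phenotype. Illustrative only; real pipelines add color-space
    conversion, noise filtering and per-plant region labeling."""
    mask = green_index > threshold   # binary plant mask
    area_px = int(mask.sum())        # projected plant area in pixels
    return mask, area_px

# Synthetic 3x3 greenness image: three bright "plant" pixels.
img = np.array([[0.1, 0.8, 0.7],
                [0.2, 0.9, 0.1],
                [0.1, 0.1, 0.1]])
mask, area = segment_and_measure(img, threshold=0.5)  # area == 3
```

Repeating such a measurement over a time series of images is what gives the non-destructive temporal resolution the abstract describes.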
[Childhood accidents: relevant epidemiologic data].
Julé, Laure; Chevallier, Bertrand
2009-02-20
Injuries resulting from accidents are a major public health problem. Accidents account for 700 deaths among French children up to 15 years of age, and nearly 300 of these involve home accidents. Accidental injuries are the leading cause of mortality, hospitalisation and sequelae in children. The lack of data registration underscores the need for epidemiological tools to gauge the burden of this public health problem and to form the basis of a surveillance system for evaluating prevention strategies.
Business Games: "A Neglected Pedagogical Tool"
ERIC Educational Resources Information Center
Gupta, Vinay K.
1972-01-01
The major value of games is the development of problem-solving and decision-making skills--a value that has not always been achieved under traditional methods of teaching or by some of the other recent innovations, such as programmed or televised lessons. (Author/JS)
Human Trafficking: The Role of the Health Care Provider
Dovydaitis, Tiffany
2011-01-01
Human trafficking is a major public health problem, both domestically and internationally. Health care providers are often the only professionals to interact with trafficking victims who are still in captivity. The expert assessment and interview skills of providers contribute to their readiness to identify victims of trafficking. The purpose of this article is to provide clinicians with knowledge on trafficking and give specific tools that they may use to assist victims in the clinical setting. Definitions, statistics, and common health care problems of trafficking victims are reviewed. The role of the health care provider is outlined through a case study and clinical practice tools are provided. Suggestions for future research are also briefly addressed. PMID:20732668
Developing a Web-Based Ppgis, as AN Environmental Reporting Service
NASA Astrophysics Data System (ADS)
Ranjbar Nooshery, N.; Taleai, M.; Kazemi, R.; Ebadi, K.
2017-09-01
Today, municipalities are searching for new tools to empower residents to change the future of their own areas by increasing their participation at different levels of urban planning. These tools should involve the community in the planning process through participatory approaches, rather than long, traditional top-down planning models, and should help municipalities obtain proper insight into the major problems of urban neighborhoods from the residents' point of view. To this end, public participation GIS (PPGIS), which enables citizens to record and follow up their feelings and spatial knowledge about the city's problems in the form of maps, has been introduced. In this research, a tool entitled CAER (Collecting & Analyzing of Environmental Reports) is developed. In the first step, a software framework based on a Web-GIS tool, called EPGIS (Environmental Participatory GIS), was designed to support public participation in reporting urban environmental problems and to facilitate data flow between citizens and the municipality. A web-based cartography tool was employed for geo-visualization and dissemination of map-based reports. In the second step of CAER, a subsystem was developed based on SOLAP (Spatial On-Line Analytical Processing), as a data-mining tool to elicit the local knowledge that facilitates bottom-up urban planning practices and to help urban managers find hidden relations among the recorded reports. The system was implemented in a case study area in Boston, Massachusetts, and its usability was evaluated. CAER should be considered a bottom-up planning tool that collects people's problems and views about their neighborhood and transmits them to city officials. It also helps urban planners find solutions for better management from the citizens' viewpoint and gives them the chance to develop good plans for neighborhoods that satisfy the citizens.
Teachers as Public Speakers: Training Teachers to Lecture.
ERIC Educational Resources Information Center
Cooper, Pamela J.
Focusing on public speaking as a major instructional tool for teachers, this paper contains suggestions for more effective lecturing. In the introduction, classroom communication is analyzed according to four moves: structuring, soliciting, responding, and reacting. The paper then discusses four problems in studying learning from meaningful verbal…
COMPUTER TOOLS FOR SANITARY SEWER SYSTEM CAPACITY ANALYSIS AND PLANNING
Rainfall-derived infiltration and inflow (RDII) into sanitary sewer systems has long been recognized as a major source of operating problems, causing poor performance of many sewer systems. RDII is the main cause of SSOs to customer basements, streets, or nearby streams and can a...
Student Research in Computational Astrophysics
NASA Astrophysics Data System (ADS)
Blondin, J. M.
1999-12-01
Computational physics can shorten the long road from freshman physics major to independent research by providing students with powerful tools to deal with the complexities of modern research problems. At North Carolina State University we have introduced dozens of students to astrophysics research using the tools of computational fluid dynamics. We have used several formats for working with students, including the traditional approach of one-on-one mentoring, a more group-oriented format in which several students work together on one or more related projects, and a novel attempt to involve an entire class in a coordinated semester research project. The advantages and disadvantages of these formats will be discussed at length, but the single most important influence has been peer support. Having students work in teams or learn the tools of research together but tackle different problems has led to more positive experiences than a lone student diving into solo research. This work is supported by an NSF CAREER Award.
Virtual GEOINT Center: C2ISR through an avatar's eyes
NASA Astrophysics Data System (ADS)
Seibert, Mark; Tidbal, Travis; Basil, Maureen; Muryn, Tyler; Scupski, Joseph; Williams, Robert
2013-05-01
As the number of devices collecting and sending data in the world increases, finding ways to visualize and understand that data is becoming more and more of a problem. This has often been coined the problem of "Big Data." The Virtual GEOINT Center (VGC) aims to help solve that problem by providing a way to combine the use of the virtual world with outside tools. Using open-source software such as OpenSim and Blender, the VGC uses a visually stunning 3D environment to display the data sent to it. The VGC is broken up into two major components: the Kinect Minimap and the GEOINT Map. The Kinect Minimap uses the Microsoft Kinect and its open-source software to make a miniature display of people the Kinect detects in front of it. The GEOINT Map collects smartphone sensor information from online databases and displays it in real time on a map generated by Google Maps. By combining outside tools and the virtual world, the VGC can help a user "visualize" data, and provides additional tools to "understand" the data.
Approaches to the Analysis of School Costs, an Introduction.
ERIC Educational Resources Information Center
Payzant, Thomas
A review and general discussion of quantitative and qualitative techniques for the analysis of economic problems outside of education is presented to help educators discover new tools for planning, allocating, and evaluating educational resources. The pamphlet covers some major components of cost accounting, cost effectiveness, cost-benefit…
Probabilistic finite elements for fatigue and fracture analysis
NASA Astrophysics Data System (ADS)
Belytschko, Ted; Liu, Wing Kam
1993-04-01
An overview of the probabilistic finite element method (PFEM) developed by the authors and their colleagues in recent years is presented. The primary focus is placed on the development of PFEM for both structural mechanics problems and fracture mechanics problems. The perturbation techniques are used as major tools for the analytical derivation. The following topics are covered: (1) representation and discretization of random fields; (2) development of PFEM for the general linear transient problem and nonlinear elasticity using Hu-Washizu variational principle; (3) computational aspects; (4) discussions of the application of PFEM to the reliability analysis of both brittle fracture and fatigue; and (5) a stochastic computational tool based on stochastic boundary element (SBEM). Results are obtained for the reliability index and corresponding probability of failure for: (1) fatigue crack growth; (2) defect geometry; (3) fatigue parameters; and (4) applied loads. These results show that initial defect is a critical parameter.
Probabilistic finite elements for fatigue and fracture analysis
NASA Technical Reports Server (NTRS)
Belytschko, Ted; Liu, Wing Kam
1993-01-01
An overview of the probabilistic finite element method (PFEM) developed by the authors and their colleagues in recent years is presented. The primary focus is placed on the development of PFEM for both structural mechanics problems and fracture mechanics problems. The perturbation techniques are used as major tools for the analytical derivation. The following topics are covered: (1) representation and discretization of random fields; (2) development of PFEM for the general linear transient problem and nonlinear elasticity using Hu-Washizu variational principle; (3) computational aspects; (4) discussions of the application of PFEM to the reliability analysis of both brittle fracture and fatigue; and (5) a stochastic computational tool based on stochastic boundary element (SBEM). Results are obtained for the reliability index and corresponding probability of failure for: (1) fatigue crack growth; (2) defect geometry; (3) fatigue parameters; and (4) applied loads. These results show that initial defect is a critical parameter.
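The reliability index and probability of failure discussed above can be illustrated with a toy example. This sketch uses a linear limit state g = R − S with independent normal resistance R and load S, for which both quantities have closed forms; it is not the authors' PFEM formulation, and the means and standard deviations are invented for illustration.

```python
import math

# Toy limit state g = R - S: failure occurs when load S exceeds resistance R.
mu_R, sigma_R = 10.0, 1.0   # resistance mean / std (illustrative values)
mu_S, sigma_S = 6.0, 1.5    # load mean / std (illustrative values)

# For independent normals, the reliability index is the mean safety margin
# divided by its standard deviation.
beta = (mu_R - mu_S) / math.sqrt(sigma_R**2 + sigma_S**2)

# Probability of failure: P(g < 0) = Phi(-beta), written via erfc.
p_f = 0.5 * math.erfc(beta / math.sqrt(2.0))
print(beta, p_f)  # beta ~ 2.22, p_f ~ 0.013
```

PFEM generalizes this idea to random fields and nonlinear mechanics, where beta must be computed from a discretized stochastic model rather than a closed form.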
Efficient hybrid-symbolic methods for quantum mechanical calculations
NASA Astrophysics Data System (ADS)
Scott, T. C.; Zhang, Wenxing
2015-06-01
We present hybrid symbolic-numerical tools to generate optimized numerical code for rapid prototyping and fast numerical computation, starting from a computer algebra system (CAS) and tailored to any given quantum mechanical problem. Although a major focus concerns the quantum chemistry methods of H. Nakatsuji, which have yielded successful and very accurate eigensolutions for small atoms and molecules, the tools are general and may be applied to any basis set calculation with a variational principle applied to its linear and non-linear parameters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tahvili, Sahar; Österberg, Jonas; Silvestrov, Sergei
One of the most important objectives in the operations of many corporations today is to maximize profit, and one important tool to that effect is the optimization of maintenance activities. Maintenance is, at the highest level, divided into two major areas: corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities through a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be it PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.
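The PM-versus-CM trade-off at the heart of such frameworks can be sketched with a small Monte Carlo simulation: components wear out (Weibull lifetimes with shape > 1), planned preventive replacement is cheap, and replacement after failure is expensive. This is an illustrative sketch, not the authors' framework; all costs and distribution parameters are invented.

```python
import random

def expected_cost(pm_interval, horizon, cm_cost, pm_cost,
                  scale=10.0, shape=3.0, runs=1000, seed=42):
    """Monte Carlo estimate of total maintenance cost over a horizon.

    Lifetimes are Weibull with shape > 1 (wear-out), so planned
    preventive replacement can beat waiting for expensive failures.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    total = 0.0
    for _ in range(runs):
        t = cost = 0.0
        while t < horizon:
            life = rng.weibullvariate(scale, shape)
            if life < pm_interval:      # corrective maintenance (failure)
                t += life
                cost += cm_cost
            else:                       # preventive maintenance (planned)
                t += pm_interval
                cost += pm_cost
        total += cost
    return total / runs

# Compare a few PM intervals; an intermediate interval is usually cheapest.
for interval in (2.0, 5.0, 50.0):
    print(interval, round(expected_cost(interval, 100.0, 10.0, 1.0), 1))
```

Optimizing over the PM interval (here by grid search, in the paper's setting by genetic algorithms over richer policies) is what turns such a simulator into a planning tool.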
Development of a large scale Chimera grid system for the Space Shuttle Launch Vehicle
NASA Technical Reports Server (NTRS)
Pearce, Daniel G.; Stanley, Scott A.; Martin, Fred W., Jr.; Gomez, Ray J.; Le Beau, Gerald J.; Buning, Pieter G.; Chan, William M.; Chiu, Ing-Tsau; Wulf, Armin; Akdag, Vedat
1993-01-01
The application of CFD techniques to large problems has dictated the need for large team efforts. This paper offers an opportunity to examine the motivations, goals, needs, and problems, as well as the methods, tools, and constraints, that defined NASA's development of a 111-grid, 16-million-point grid system model for the Space Shuttle Launch Vehicle. The Chimera approach used for domain decomposition encouraged separation of the complex geometry into several major components, each of which was modeled by an autonomous team. ICEM-CFD, a CAD-based grid generation package, simplified the geometry and grid topology definition by providing mature CAD tools and patch-independent meshing. The resulting grid system has, on average, a four-inch resolution along the surface.
Selection of software for mechanical engineering undergraduates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheah, C. T.; Yin, C. S.; Halim, T.
A major problem with the undergraduate mechanical engineering course is the limited exposure of students to software packages, coupled with the long learning curve of the existing packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as suitable for undergraduate work in mechanical engineering, e.g. solvers for simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.
Goulding, F S; Stone, Y
1970-10-16
The past decade has seen the rapid development and exploitation of one of the most significant tools of nuclear physics, the semiconductor radiation detector. Application of the device to the analysis of materials promises to be one of the major contributions of nuclear research to technology, and may even assist in some aspects of our environmental problems. In parallel with the development of these applications, further developments in detectors for nuclear research are taking place: the use of very thin detectors for heavy-ion identification, position-sensitive detectors for nuclear-reaction studies, and very pure germanium for making more satisfactory detectors for many applications suggest major future contributions to physics.
Majoring in the Rest of Your Life. Career Secrets for College Students.
ERIC Educational Resources Information Center
Carter, Carol
Primarily intended for college freshmen, this book provides practical advice and hints on ways to succeed in college and on setting career goals. Thirteen chapters outline and discuss various life skills and "tools" for succeeding in college and on the job, including planning and organizing; problem solving/analytical skills;…
Design Lab. USMES "How To" Series.
ERIC Educational Resources Information Center
Donahoe, Charles; And Others
The major emphasis in all Unified Sciences and Mathematics for Elementary Schools (USMES) units is on open-ended, long-range investigations of real problems. Since children often design and build things in USMES, 26 "Design Lab" cards provide information on the safe use and simple maintenance of tools. Each card has a large photograph of…
Using Web 2.0 Tools to Facilitate Case-Based Instruction: Considering the Possibilities
ERIC Educational Resources Information Center
Koehler, Adrie A.; Ertmer, Peggy A.
2016-01-01
Case-based instruction (CBI) offers a promising method for promoting problem-solving skills in learners. However, during CBI, the instructor shoulders major responsibility for shaping the learning that takes place. Research indicates that the facilitation techniques used during case discussions influence what gets covered, and to what extent,…
Cough Hypersensitivity Syndrome: A Few More Steps Forward
Song, Woo-Jung
2017-01-01
Cough reflex is a vital protective mechanism against aspiration, but when dysregulated, it can become hypersensitive. In fact, chronic cough is a significant medical problem with a high degree of morbidity. Recently, a unifying paradigm of cough hypersensitivity syndrome has been proposed. It represents a clinical entity in which chronic cough is a major presenting problem, regardless of the underlying condition. Although it remains a theoretical construct, emerging evidence suggests that aberrant neurophysiology is the common etiology of this syndrome. Recent success in randomized clinical trials using a P2X3 receptor antagonist is the first major advance in the therapeutics of cough in the past 30 years; it at last provides a strategy for treating intractable cough as well as an invaluable tool for dissecting the mechanism underpinning cough hypersensitivity. Additionally, several cough measurement tools have been validated for use and will help assess the clinical relevance of cough in various underlying conditions. Along with this paradigm shift, our understanding of cough mechanisms has improved during the past decades, allowing us to continue to take more steps forward in the future. PMID:28677352
Abubakar, Amina; Kariuki, Symon M; Tumaini, Judith Dzombo; Gona, Joseph; Katana, Khamis; Owen, Jacqueline A Phillips; Newton, Charles R
2015-04-01
Childhood epilepsy is common in Africa. However, there are little data on the developmental and behavioral problems experienced by children living with epilepsy, especially qualitative data that capture community perceptions of the challenges faced by these children. Identifying these perceptions using qualitative approaches is important not only to help design appropriate interventions but also to help adapt behavioral tools that are culturally appropriate. We documented the description of these problems as perceived by parents and teachers of children with or without epilepsy. The study involved 70 participants. Data were collected using in-depth interviews and focus group discussions and were analyzed using NVIVO to identify major themes. Our analysis identified four major areas that are perceived to be adversely affected among children with epilepsy. These included internalizing and externalizing problems such as aggression, temper tantrums, and excessive crying. Additionally, developmental delay, especially cognitive deficits and academic underachievement, was also identified as a major problematic area. There is a need to supplement these findings with quantitative estimates and to develop psychosocial and educational interventions to rehabilitate children with epilepsy who have these difficulties. Copyright © 2015. Published by Elsevier Inc.
Defining Geodetic Reference Frame using Matlab®: PlatEMotion 2.0
NASA Astrophysics Data System (ADS)
Cannavò, Flavio; Palano, Mimmo
2016-03-01
We describe the main features of the developed software tool, namely PlatE-Motion 2.0 (PEM2), which allows inferring the Euler pole parameters by inverting the observed velocities at a set of sites located on a rigid block (inverse problem). PEM2 also allows calculating the expected velocity at any point on the Earth, given an Euler pole (direct problem). PEM2 is the updated version of a previous software tool initially developed for easy file exchange with the GAMIT/GLOBK software package. The tool is developed in the Matlab® framework and, like the previous version, includes a set of MATLAB functions (m-files), GUIs (fig-files), map data files (mat-files) and a user's manual, as well as some example input files. New changes in PEM2 include (1) bug fixes, (2) improvements in the code, (3) improvements in statistical analysis, and (4) new input/output file formats. In addition, PEM2 can now be run under the majority of operating systems. The tool is open source and freely available to the scientific community.
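The direct problem has a compact form: the velocity of a point on a rigid plate is the cross product of the Euler rotation vector with the point's position vector, v = ω × r. A minimal NumPy sketch of this relation follows; it is not PEM2 code, and the pole location and rotation rate are invented for illustration.

```python
import numpy as np

EARTH_RADIUS_M = 6.371e6  # mean Earth radius, meters

def plate_velocity(pole_lat, pole_lon, omega_deg_per_myr, lat, lon):
    """Velocity (m/yr, Earth-centered Cartesian components) of a surface
    point on a rigid plate rotating about an Euler pole: v = omega x r."""
    def unit(lat_d, lon_d):
        la, lo = np.radians([lat_d, lon_d])
        return np.array([np.cos(la) * np.cos(lo),
                         np.cos(la) * np.sin(lo),
                         np.sin(la)])
    # Rotation vector: rate converted from deg/Myr to rad/yr along the pole axis.
    omega = np.radians(omega_deg_per_myr) / 1e6 * unit(pole_lat, pole_lon)
    r = EARTH_RADIUS_M * unit(lat, lon)
    return np.cross(omega, r)

# A point 90 degrees from a north-pole rotation axis moves fastest.
v = plate_velocity(90.0, 0.0, 0.5, 0.0, 0.0)
print(np.linalg.norm(v))  # ~0.056 m/yr, i.e. ~56 mm/yr
```

The inverse problem PEM2 solves is the least-squares fit of ω to many observed site velocities, i.e. running this relation backwards.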
Building Automatic Grading Tools for Basic of Programming Lab in an Academic Institution
NASA Astrophysics Data System (ADS)
Harimurti, Rina; Iwan Nurhidayat, Andi; Asmunin
2018-04-01
The skill of computer programming is a core competency that must be mastered by students majoring in computer science. The best way to improve this skill is through the practice of writing many programs to solve various problems, from simple to complex. It takes hard work and a long time to check and evaluate the results of student labs one by one, especially if the number of students is large. Based on these constraints, we propose Automatic Grading Tools (AGT), an application that can evaluate and deeply check source code in C and C++. The application architecture consists of students, web-based applications, compilers, and operating systems. Automatic Grading Tools (AGT) is implemented with an MVC architecture using open-source software, such as the Laravel framework version 5.4, PostgreSQL 9.6, Bootstrap 3.3.7, and the jQuery library. AGT has also been tested on real problems by submitting source code in C/C++ and then compiling it. The test results show that the AGT application runs well.
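The abstract does not specify AGT's grading logic, but the core step of any such tool is comparing a submission's output against the expected output while ignoring cosmetic differences. A hedged sketch of that step, with invented function names:

```python
def normalize(output: str) -> list[str]:
    """Trim trailing whitespace and drop blank lines so cosmetic
    differences in a submission's output do not cost marks."""
    lines = [line.rstrip() for line in output.strip().splitlines()]
    return [line for line in lines if line]

def grade(actual: str, expected: str) -> float:
    """Fraction of expected output lines matched in order (0.0 to 1.0)."""
    got, want = normalize(actual), normalize(expected)
    matched = sum(1 for g, w in zip(got, want) if g == w)
    return matched / max(len(want), 1)

print(grade("Hello\nWorld \n", "Hello\nWorld"))  # 1.0 (trailing space ignored)
print(grade("Hello\nwrld", "Hello\nWorld"))      # 0.5 (one of two lines match)
```

In a full pipeline this function would run after compiling the C/C++ submission (e.g. with gcc in a sandbox) and executing it against each test case's input.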
Wong, Kam Cheong
2011-03-29
Studying medical cases is an effective way to enhance clinical reasoning skills and reinforce clinical knowledge. An Ishikawa diagram, also known as a cause-and-effect diagram or fishbone diagram, is often used in quality management in manufacturing industries. In this report, an Ishikawa diagram is used to demonstrate how to relate potential causes of a major presenting problem in a clinical setting. This tool can be used by teams in problem-based learning or in self-directed learning settings. An Ishikawa diagram annotated with references to relevant medical cases and literature can be continually updated and can assist memory and retrieval of relevant medical cases and literature. It could also be used to cultivate a lifelong learning habit in medical professionals.
Problem drinking - detection and assessment in general practice.
Demirkol, Apo; Haber, Paul; Conigrave, Katherine
2011-08-01
Alcohol has long been an integral part of the social life of many Australians. However, alcohol is associated with significant harm to drinkers, and also to nondrinkers. This article explores the role of the general practitioner in the detection and assessment of problem drinking. Excessive alcohol use is a major public health problem and the majority of people who drink excessively go undetected. General practitioners are in a good position to detect excessive alcohol consumption; earlier intervention can help improve outcomes. AUDIT-C is an effective screening tool for the detection of problem drinking. National Health and Medical Research Council guidelines suggest that no more than two standard drinks on each occasion will keep lifetime risk of death from alcohol related disease or injury at a low level. Once an alcohol problem is detected it is important to assess for alcohol dependence, other substance use, motivation to change, psychiatric comorbidities and examination and investigation findings that may be associated with excessive alcohol use. A comprehensive assessment of the impact and risk of harm of the patient's drinking to themselves and others is vital, and may require several consultations.
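Scoring AUDIT-C is simple enough to sketch: three items, each scored 0 to 4, summed to a total of 0 to 12. The cut-offs below (≥3 for women, ≥4 for men) are commonly cited in the screening literature but are not stated in the abstract above, so treat them as illustrative rather than authoritative; local guidelines may differ.

```python
def audit_c_score(q1: int, q2: int, q3: int) -> int:
    """Sum the three AUDIT-C items (each scored 0-4, total 0-12)."""
    for answer in (q1, q2, q3):
        if not 0 <= answer <= 4:
            raise ValueError("each AUDIT-C item is scored 0-4")
    return q1 + q2 + q3

def screen_positive(score: int, female: bool) -> bool:
    """Commonly cited cut-offs: >=3 for women, >=4 for men (illustrative)."""
    return score >= (3 if female else 4)

score = audit_c_score(2, 1, 1)
print(score, screen_positive(score, female=False))  # 4 True
```

A positive screen is a prompt for fuller assessment (dependence, comorbidities, motivation to change), not a diagnosis in itself, as the article emphasizes.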
Motion Artefacts in MRI: a Complex Problem with Many Partial Solutions
Zaitsev, Maxim; Maclaren, Julian.; Herbst, Michael
2015-01-01
Subject motion during magnetic resonance imaging (MRI) has been problematic since its introduction as a clinical imaging modality. While sensitivity to particle motion or blood flow can be used to provide useful image contrast, bulk motion presents a considerable problem in the majority of clinical applications. It is one of the most frequent sources of artefacts. Over 30 years of research have produced numerous methods to mitigate or correct for motion artefacts, but no single method can be applied in all imaging situations. Instead, a ‘toolbox’ of methods exists, where each tool is suitable for some tasks, but not for others. This article reviews the origins of motion artefacts and presents current mitigation and correction methods. In some imaging situations, the currently available motion correction tools are highly effective; in other cases, appropriate tools still need to be developed. It seems likely that this multifaceted approach will be what eventually solves the motion sensitivity problem in MRI, rather than a single solution that is effective in all situations. This review places a strong emphasis on explaining the physics behind the occurrence of such artefacts, with the aim of aiding artefact detection and mitigation in particular clinical situations. PMID:25630632
Motion artifacts in MRI: A complex problem with many partial solutions.
Zaitsev, Maxim; Maclaren, Julian; Herbst, Michael
2015-10-01
Subject motion during magnetic resonance imaging (MRI) has been problematic since its introduction as a clinical imaging modality. While sensitivity to particle motion or blood flow can be used to provide useful image contrast, bulk motion presents a considerable problem in the majority of clinical applications. It is one of the most frequent sources of artifacts. Over 30 years of research have produced numerous methods to mitigate or correct for motion artifacts, but no single method can be applied in all imaging situations. Instead, a "toolbox" of methods exists, where each tool is suitable for some tasks, but not for others. This article reviews the origins of motion artifacts and presents current mitigation and correction methods. In some imaging situations, the currently available motion correction tools are highly effective; in other cases, appropriate tools still need to be developed. It seems likely that this multifaceted approach will be what eventually solves the motion sensitivity problem in MRI, rather than a single solution that is effective in all situations. This review places a strong emphasis on explaining the physics behind the occurrence of such artifacts, with the aim of aiding artifact detection and mitigation in particular clinical situations. © 2015 Wiley Periodicals, Inc.
Preferences of Male and Female Students for TSA Competitive Events
ERIC Educational Resources Information Center
Mitts, Charles R.; Haynie, W. J., III
2010-01-01
Arguably a major issue facing technology education (TE) since its inception has been its failure to attract and keep female students. This article explains one primary reason female students may be avoiding TE courses, presents a research-tested set of tools that TE teachers can use to help fix the problem, and offers a new realizable pathway…
ERIC Educational Resources Information Center
Tyunnikov, Yurii S.
2016-01-01
The paper solves the problem of the relationship of external diagnosis and self-diagnosis of readiness of teachers to innovative activity. It highlights major disadvantages of measurement tools that are used to this process. The author demonstrates an alternative approach to harmonizing the diagnosis, based on a modular diagnostic model, general…
Nutrient Management Approaches and Tools for Dairy farms in Australia and the USA.
USDA-ARS?s Scientific Manuscript database
In Australia and the USA, nutrient imports and accumulation on dairy farms can be a problem and may pose a threat to the greater environment. While the major nutrient imports onto dairy farms (i.e. fertilizer and feed) and exports (i.e. milk and animals) are generally the same for confinement-based ...
ERIC Educational Resources Information Center
Chien, Tien-Chen
2008-01-01
Computer is not only a powerful technology for managing information and enhancing productivity, but also an efficient tool for education and training. Computer anxiety can be one of the major problems that affect the effectiveness of learning. Through analyzing related literature, this study describes the phenomenon of computer anxiety,…
Health Information Retrieval Tool (HIRT)
Nyun, Mra Thinzar; Ogunyemi, Omolola; Zeng, Qing
2002-01-01
The World Wide Web (WWW) is a powerful way to deliver on-line health information, but one major problem limits its value to consumers: content is highly distributed, while relevant and high quality information is often difficult to find. To address this issue, we experimented with an approach that utilizes three-dimensional anatomic models in conjunction with free-text search.
ERIC Educational Resources Information Center
Bature, Iliya Joseph; Atweh, Bill; Treagust, David
2016-01-01
Mathematics classroom instruction in Nigerian secondary schools has been described as a major problem for both teachers and their students. Most classroom activities are teacher-centred, with students as mere listeners and recipients of knowledge rather than active initiators of their own knowledge. This paper seeks to investigate the effects of…
Design knowledge capture for a corporate memory facility
NASA Technical Reports Server (NTRS)
Boose, John H.; Shema, David B.; Bradshaw, Jeffrey M.
1990-01-01
Currently, much of the information regarding decision alternatives and trade-offs made in the course of a major program development effort is not represented or retained in a way that permits computer-based reasoning over the life cycle of the program. The loss of this information results in problems in tracing design alternatives to requirements, in assessing the impact of changes in requirements, and in configuration management. To address these problems, we studied the problem of building an intelligent, active corporate memory facility that would provide for the capture of the requirements and standards of a program, analyze the design alternatives and trade-offs made over the program's lifetime, and examine relationships between requirements and design trade-offs. Early phases of the work have concentrated on design knowledge capture for Space Station Freedom. Tools are demonstrated and extended that help automate and document engineering trade studies, and another tool is being developed to help designers interactively explore design alternatives and constraints.
Real simulation tools in introductory courses: packaging and repurposing our research code.
NASA Astrophysics Data System (ADS)
Heagy, L. J.; Cockett, R.; Kang, S.; Oldenburg, D.
2015-12-01
Numerical simulations are an important tool for scientific research and applications in industry. They provide a means to experiment with physics in a tangible, visual way, often providing insights into the problem. Over the last two years, we have been developing course and laboratory materials for an undergraduate geophysics course primarily taken by non-geophysics majors, including engineers and geologists. Our aim is to provide the students with resources to build intuition about geophysical techniques, promote curiosity driven exploration, and help them develop the skills necessary to communicate across disciplines. Using open-source resources and our existing research code, we have built modules around simulations, with supporting content to give student interactive tools for exploration into the impacts of input parameters and visualization of the resulting fields, fluxes and data for a variety of problems in applied geophysics, including magnetics, seismic, electromagnetics, and direct current resistivity. The content provides context for the problems, along with exercises that are aimed at getting students to experiment and ask 'what if...?' questions. In this presentation, we will discuss our approach for designing the structure of the simulation-based modules, the resources we have used, challenges we have encountered, general feedback from students and instructors, as well as our goals and roadmap for future improvement. We hope that our experiences and approach will be beneficial to other instructors who aim to put simulation tools in the hands of students.
Perera, E A Ramani; Kathriarachchi, Samudra T
2011-01-01
Suicidal behaviour among youth is a major public health concern in Sri Lanka. Prevention of youth suicides using effective, feasible and culturally acceptable methods is invaluable in this regard; however, research in this area is grossly lacking. This study aimed at determining the effectiveness of problem-solving counselling as a therapeutic intervention in the prevention of youth suicidal behaviour in Sri Lanka. This controlled trial was based on hospital admissions for suicide attempts in a suburban hospital in Sri Lanka. The study was carried out at Base Hospital Homagama. A sample of 124 was recruited using a convenience sampling method and divided into two groups, experimental and control. The control group was offered routine care, and the experimental group received four sessions of problem-solving counselling over one month. The outcome of both groups was measured six months after the initial screening, using a visual analogue scale. Individualized outcome measures showed that problem-solving ability among subjects in the experimental group had improved after four counselling sessions and suicidal behaviour had been reduced. The results are statistically significant. This study confirms that problem-solving counselling is an effective therapeutic tool in the management of youth suicidal behaviour in a hospital setting in a developing country.
The astronaut and the banana peel: An EVA retriever scenario
NASA Technical Reports Server (NTRS)
Shapiro, Daniel G.
1989-01-01
To prepare for the problem of accidents in Space Station activities, the Extravehicular Activity Retriever (EVAR) robot is being constructed, whose purpose is to retrieve astronauts and tools that float free of the Space Station. Advanced Decision Systems is at the beginning of a project to develop research software capable of guiding EVAR through the retrieval process. This involves addressing problems in machine vision, dexterous manipulation, real time construction of programs via speech input, and reactive execution of plans despite the mishaps and unexpected conditions that arise in uncontrolled domains. The problem analysis phase of this work is presented. An EVAR scenario is used to elucidate major domain and technical problems. An overview of the technical approach to prototyping an EVAR system is also presented.
NASA Astrophysics Data System (ADS)
Sullivan, W. N.
The Darrieus-type Vertical Axis Wind Turbine (VAWT) presents a variety of unusual structural problems to designers. The level of understanding of these structural problems governs, to a large degree, the success or failure of today's rotor designs. A survey is presented of the technology available for rotor structural design with emphasis on the DOE research program now underway. Itemizations are included of the major structural issues unique to the VAWT along with discussion of available analysis techniques for each problem area. It is concluded that tools are available to at least approximately address the most important problems. However, experimental data for confirmation is rather limited in terms of volume and the range of rotor configurations tested.
Case finding of lifestyle and mental health disorders in primary care: validation of the ‘CHAT’ tool
Goodyear-Smith, Felicity; Coupe, Nicole M; Arroll, Bruce; Elley, C Raina; Sullivan, Sean; McGill, Anne-Thea
2008-01-01
Background Primary care is accessible and ideally placed for case finding of patients with lifestyle and mental health risk factors and subsequent intervention. The short self-administered Case-finding and Help Assessment Tool (CHAT) was developed for lifestyle and mental health assessment of adult patients in primary health care. This tool checks for tobacco use, alcohol and other drug misuse, problem gambling, depression, anxiety and stress, abuse, anger problems, inactivity, and eating disorders. It is well accepted by patients, GPs and nurses. Aim To assess criterion-based validity of CHAT against a composite gold standard. Design of study Conducted according to the Standards for Reporting of Diagnostic Accuracy statement for diagnostic tests. Setting Primary care practices in Auckland, New Zealand. Method One thousand consecutive adult patients completed CHAT and a composite gold standard. Sensitivities, specificities, positive and negative predictive values, and likelihood ratios were calculated. Results Response rates for each item ranged from 79.6 to 99.8%. CHAT was sensitive and specific for almost all issues screened, except exercise and eating disorders. Sensitivity ranged from 96% (95% confidence interval [CI] = 87 to 99%) for major depression to 26% (95% CI = 22 to 30%) for exercise. Specificity ranged from 97% (95% CI = 96 to 98%) for problem gambling and problem drug use to 40% (95% CI = 36 to 45%) for exercise. All had high likelihood ratios (3–30), except exercise and eating disorders. Conclusion CHAT is a valid and acceptable case-finding tool for most common lifestyle and mental health conditions. PMID:18186993
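The accuracy measures reported for CHAT (sensitivity, specificity, predictive values, likelihood ratios) all follow from a 2x2 table of screening results against the gold standard. As a minimal sketch of how they are derived (the counts below are illustrative, not the study's data):

```python
# Sketch: diagnostic-accuracy metrics of the kind reported for CHAT,
# computed from a 2x2 table. Counts are illustrative, not the study's.

def diagnostic_metrics(tp, fp, fn, tn):
    """Return common case-finding accuracy measures from a 2x2 table."""
    sensitivity = tp / (tp + fn)              # P(test+ | condition present)
    specificity = tn / (tn + fp)              # P(test- | condition absent)
    ppv = tp / (tp + fp)                      # positive predictive value
    npv = tn / (tn + fn)                      # negative predictive value
    lr_pos = sensitivity / (1 - specificity)  # positive likelihood ratio
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "ppv": ppv,
        "npv": npv,
        "lr+": lr_pos,
    }

# Illustrative counts: 48 true positives, 30 false positives,
# 2 false negatives, 920 true negatives.
m = diagnostic_metrics(tp=48, fp=30, fn=2, tn=920)
print({k: round(v, 2) for k, v in m.items()})
```

With these toy counts, sensitivity is 0.96 and the positive likelihood ratio is about 30, comparable in scale to the figures quoted in the abstract.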
A complementary measure of heterogeneity on mathematical skills
NASA Astrophysics Data System (ADS)
Fedriani, Eugenio M.; Moyano, Rafael
2012-06-01
Finding educational truths is an inherently multivariate problem. There are many factors affecting each student and their performances. Because of this, both measuring of skills and assessing students are always complex processes. This is a well-known problem, and a number of solutions have been proposed by specialists. One of its ramifications is that the variety of progress levels of students in the Mathematics classroom makes teaching more difficult. We think that a measure of the heterogeneity of the different student groups could be interesting in order to prepare some strategies to deal with these kinds of difficulties. The major aim of this study is to develop new tools, complementary to the statistical ones that are commonly used for these purposes, to study situations related to education (mainly to the detection of levels of mathematical education) in which several variables are involved. These tools are thought to simplify these educational analyses and, through a better comprehension of the topic, to improve our teaching. Several authors in our research group have developed some mathematical, theoretical tools, to deal with multidimensional phenomena, and have applied them to measure poverty and also to other business models. These tools are based on multidigraphs. In this article, we implement these tools using symbolic computational software and apply them to study a specific situation related to mathematical education.
ERIC Educational Resources Information Center
Young, Shelley Shwu-Ching; Lin, Wei-Lin
2012-01-01
This study explores how to make diverse learning/instructional materials compatible with e-readers when the instructor pioneered the adoption of e-readers in a graduate-level course. What problems did the instructor encounter when she used the e-readers as a major tool to deliver learning content, such as the process of converting the…
ERIC Educational Resources Information Center
Lokar, Matija; Libbrecht, Paul
2017-01-01
Mathematical formulae are information objects that can be entered in a computer, visualized, and evaluated. Thus, the majority of (mostly occasional) users also expect them to be transferable through a simple copy-paste procedure. This transfer is particularly interesting when users are involved in tasks that span different…
ERIC Educational Resources Information Center
Holzmann, Vered; Mischari, Shoshana; Goldberg, Shoshana; Ziv, Amitai
2012-01-01
Purpose: This article aims to present a unique systematic and validated method for creating a linkage between past experiences and management of future occurrences in an organization. Design/methodology/approach: The study is based on actual data accumulated in a series of projects performed in a major medical center. Qualitative and quantitative…
ERIC Educational Resources Information Center
Kilic, Deniz Beste Çevik
2017-01-01
Rapid developments and innovations in technology have an impact on individuals. The use of technology in daily life has become a necessity; therefore, the development and popularization of Information and Communication Technologies (ICTs) have made them a tool for solving educational problems. Because educational technologies play a major role both…
A Qualitative Study of Students' Computational Thinking Skills in a Data-Driven Computing Class
ERIC Educational Resources Information Center
Yuen, Timothy T.; Robbins, Kay A.
2014-01-01
Critical thinking, problem solving, the use of tools, and the ability to consume and analyze information are important skills for the 21st century workforce. This article presents a qualitative case study that follows five undergraduate biology majors in a computer science course (CS0). This CS0 course teaches programming within a data-driven…
A Smartphone App to Communicate Child Passenger Safety: An Application of Theory to Practice
ERIC Educational Resources Information Center
Gielen, A. C.; McDonald, E. M.; Omaki, E.; Shields, W.; Case, J.; Aitken, M.
2015-01-01
Child passenger safety remains an important public health problem because motor vehicle crashes are the leading cause of death for children, and the majority of children ride improperly restrained. Using a mobile app to communicate with parents about injury prevention offers promise but little information is available on how to create such a tool.…
The History, Status, Gaps, and Future Directions of Neurotoxicology in China.
Cai, Tongjian; Luo, Wenjing; Ruan, Diyun; Wu, Yi-Jun; Fox, Donald A; Chen, Jingyuan
2016-06-01
Rapid economic development in China has produced serious ecological, environmental, and health problems. Neurotoxicity has been recognized as a major public health problem. The Chinese government, research institutes, and scientists conducted extensive studies concerning the source, characteristics, and mechanisms of neurotoxicants. This paper presents, for the first time, a comprehensive history and review of major sources of neurotoxicants, national bodies/legislation engaged, and major neurotoxicology research in China. Peer-reviewed research and pollution studies by Chinese scientists from 1991 to 2015 were examined. PubMed, Web of Science and Chinese National Knowledge Infrastructure (CNKI) were the major search tools. The central problem is an increased exposure to neurotoxicants from air and water, food contamination, e-waste recycling, and manufacturing of household products. China formulated an institutional framework and standards system for management of major neurotoxicants. Basic and applied research was initiated, and international cooperation was achieved. The annual number of peer-reviewed neurotoxicology papers from Chinese authors increased almost 30-fold since 2001. Despite extensive efforts, neurotoxicity remains a significant public health problem. This provides great challenges and opportunities. We identified 10 significant areas that require major educational, environmental, governmental, and research efforts, as well as attention to public awareness. For example, there is a need to increase efforts to utilize new in vivo and in vitro models, determine the potential neurotoxicity and mechanisms involved in newly emerging pollutants, and examine the effects and mechanisms of mixtures. In the future, we anticipate working with scientists worldwide to accomplish these goals and eliminate, prevent and treat neurotoxicity. Cai T, Luo W, Ruan D, Wu YJ, Fox DA, Chen J. 2016. 
The history, status, gaps, and future directions of neurotoxicology in China. Environ Health Perspect 124:722-732; http://dx.doi.org/10.1289/ehp.1409566.
Detection of faults in rotating machinery using periodic time-frequency sparsity
NASA Astrophysics Data System (ADS)
Ding, Yin; He, Wangpeng; Chen, Binqiang; Zi, Yanyang; Selesnick, Ivan W.
2016-11-01
This paper addresses the problem of extracting periodic oscillatory features in vibration signals for detecting faults in rotating machinery. To extract the feature, we propose an approach in the short-time Fourier transform (STFT) domain, where the periodic oscillatory feature manifests itself as a relatively sparse grid. To estimate the sparse grid, we formulate an optimization problem using customized binary weights in the regularizer, where the weights are formulated to promote periodicity. In order to solve the proposed optimization problem, we develop an algorithm called the augmented Lagrangian majorization-minimization algorithm, which combines the split augmented Lagrangian shrinkage algorithm (SALSA) with majorization-minimization (MM), and is guaranteed to converge for both convex and non-convex formulations. As examples, the proposed approach is applied to simulated data and to real data as a tool for diagnosing faults in bearings and gearboxes, and is compared to some state-of-the-art methods. The results show that the proposed approach can effectively detect and extract the periodic oscillatory features.
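To illustrate the role the periodic binary weights play (this sketch is not the paper's SALSA/MM algorithm; the toy spectrogram, weight pattern, and threshold are assumptions), note that a separable weighted-l1 problem of the form min_X ||Y - X||^2 + lam * sum w_ij |X_ij| is solved entrywise by soft-thresholding, with no penalty on frames at the assumed fault period:

```python
# Simplified sketch (not the paper's SALSA/MM algorithm): weighted
# soft-thresholding with binary "periodic" weights. Frames at the
# assumed period are left unpenalized; off-period noise is shrunk.

def soft(y, t):
    """Soft-threshold scalar y at level t."""
    if y > t:
        return y - t
    if y < -t:
        return y + t
    return 0.0

def periodic_weights(n_frames, period):
    """Binary weights: 0 on frames at the assumed period (unpenalized),
    1 elsewhere (penalized), promoting a periodic sparsity pattern."""
    return [0 if k % period == 0 else 1 for k in range(n_frames)]

def denoise(spectrogram, period, lam):
    """Column-wise weighted soft-thresholding of a magnitude
    spectrogram given as a list of frames (lists of floats)."""
    w = periodic_weights(len(spectrogram), period)
    return [[soft(v, lam * wk / 2.0) for v in frame]
            for frame, wk in zip(spectrogram, w)]

# Toy spectrogram: an impulse every 3rd frame plus small noise.
Y = [[2.0, 0.1], [0.1, 0.1], [0.1, 0.1],
     [2.0, 0.1], [0.1, 0.1], [0.1, 0.1]]
X = denoise(Y, period=3, lam=1.0)
print(X)  # periodic impulses survive; off-period noise is zeroed
```

The paper's actual formulation operates on the complex STFT with a non-convex regularizer and iterative MM updates; this closed-form special case only shows how the binary weights bias the estimate toward a periodic pattern.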
Downhole vacuum cleans up tough fishing, milling jobs
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaLande, P.; Flanders, B.
1996-02-01
A unique tool developed to effect reverse circulation downhole is being used successfully in problem milling and fishing operations where conventional techniques fail to recover junk in the hole. Jointly developed by several major operators in conjunction with Baker Oil Tools, the patented Reverse Circulating Tool (RCT) acts as a downhole vacuum cleaner, catching and retaining debris circulated from the wellbore while allowing fishing, milling and washover operations to continue uninterrupted. As described in several case histories overviewed, the unique vacuuming action efficiently cleans up junk and debris in even the most difficult fishing and milling applications. Downhole operations proceed normally, but without threat of damage from milled debris. Developers hold both mechanical and method patents on the RCT.
Using block pulse functions for seismic vibration semi-active control of structures with MR dampers
NASA Astrophysics Data System (ADS)
Rahimi Gendeshmin, Saeed; Davarnia, Daniel
2018-03-01
This article applies the idea of block pulse functions to the semi-active control of structures. Block pulse (BP) functions provide effective tools for approximating complex problems. The applied control algorithm has a major effect on the performance of the controlled system and the requirements of the control devices. In control problems, it is important to devise an accurate analytical technique with low computational cost. BP functions have proven to be fundamental tools in approximation problems and have been applied in disparate areas over recent decades. This study focuses on employing BP functions in the control algorithm to reduce computational cost. Magneto-rheological (MR) dampers are one of the well-known semi-active tools that can be used to control the response of civil structures during earthquakes. For validation purposes, numerical simulations of a 5-story shear building frame with MR dampers are presented. The results of the suggested method were compared with those obtained by controlling the frame with the optimal control method based on linear quadratic regulator theory. The simulation results show that the suggested method can be helpful in reducing seismic structural responses; it also has acceptable accuracy and agrees with the optimal control method at a lower computational cost.
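The core approximation idea is simple: the i-th block pulse basis function is 1 on [i/N, (i+1)/N) and 0 elsewhere, and a signal's BP coefficient on that subinterval is its average there. A minimal sketch (illustrative only, not the article's control algorithm):

```python
# Minimal sketch of block pulse (BP) approximation on [0, 1).
# The i-th BP basis function is 1 on [i/N, (i+1)/N) and 0 elsewhere;
# the BP coefficient is the average of the signal over that subinterval.

def bp_coefficients(f, n_blocks, samples_per_block=100):
    """Approximate BP coefficients of f by averaging dense midpoint samples."""
    coeffs = []
    for i in range(n_blocks):
        a = i / n_blocks
        h = 1.0 / (n_blocks * samples_per_block)
        avg = sum(f(a + (j + 0.5) * h) for j in range(samples_per_block))
        coeffs.append(avg / samples_per_block)
    return coeffs

def bp_eval(coeffs, t):
    """Evaluate the piecewise-constant BP expansion at t in [0, 1)."""
    i = min(int(t * len(coeffs)), len(coeffs) - 1)
    return coeffs[i]

# Approximate f(t) = t with 4 blocks: the coefficients are the
# subinterval midpoints.
c = bp_coefficients(lambda t: t, 4)
print([round(x, 3) for x in c])  # → [0.125, 0.375, 0.625, 0.875]
```

In the semi-active control setting, expanding the system response in such a basis turns differential equations into algebraic relations among the coefficients, which is where the computational savings come from.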
Accurate construction of consensus genetic maps via integer linear programming.
Wu, Yonghui; Close, Timothy J; Lonardi, Stefano
2011-01-01
We study the problem of merging genetic maps, when the individual genetic maps are given as directed acyclic graphs. The computational problem is to build a consensus map, which is a directed graph that includes and is consistent with all (or, the vast majority of) the markers in the input maps. However, when markers in the individual maps have ordering conflicts, the resulting consensus map will contain cycles. Here, we formulate the problem of resolving cycles in the context of a parsimonious paradigm that takes into account two types of errors that may be present in the input maps, namely, local reshuffles and global displacements. The resulting combinatorial optimization problem is, in turn, expressed as an integer linear program. A fast approximation algorithm is proposed, and an additional speedup heuristic is developed. Our algorithms were implemented in a software tool named MERGEMAP which is freely available for academic use. An extensive set of experiments shows that MERGEMAP consistently outperforms JOINMAP, which is the most popular tool currently available for this task, both in terms of accuracy and running time. MERGEMAP is available for download at http://www.cs.ucr.edu/~yonghui/mgmap.html.
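The setting can be made concrete with a small sketch (this is not MERGEMAP itself; marker names are illustrative): each individual map contributes precedence edges between consecutive markers, their union forms the consensus graph, and ordering conflicts between maps show up as cycles, which the paper then resolves via integer linear programming.

```python
# Sketch (not MERGEMAP): merge individual maps, each a linear marker
# order, into one precedence graph, and detect the ordering conflicts
# (cycles) that the paper resolves via integer linear programming.

def merge_maps(maps):
    """Union of precedence edges from each map's consecutive markers."""
    edges = set()
    for order in maps:
        for a, b in zip(order, order[1:]):
            edges.add((a, b))
    return edges

def has_cycle(edges):
    """DFS cycle check on the merged precedence graph."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, []).append(b)
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {}

    def visit(u):
        color[u] = GRAY
        for v in adj.get(u, []):
            if color.get(v, WHITE) == GRAY:
                return True  # back edge: ordering conflict
            if color.get(v, WHITE) == WHITE and visit(v):
                return True
        color[u] = BLACK
        return False

    nodes = {n for e in edges for n in e}
    return any(color.get(n, WHITE) == WHITE and visit(n) for n in nodes)

# Two maps that agree: the consensus is a DAG.
print(has_cycle(merge_maps([["m1", "m2", "m3"], ["m1", "m3", "m4"]])))  # False
# A local reshuffle (m2 and m3 swapped in the second map) creates a cycle.
print(has_cycle(merge_maps([["m1", "m2", "m3"], ["m1", "m3", "m2"]])))  # True
```

MERGEMAP's contribution lies in what happens after detection: choosing which edges to delete, at minimum parsimony cost under the local-reshuffle and global-displacement error model, is the integer linear program.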
Communicating and Visualizing Erosion-associated Risks to Infrastructure
NASA Astrophysics Data System (ADS)
Hewett, Caspar; Simpson, Carolyn; Wainwright, John
2016-04-01
Soil erosion is a major problem worldwide, affecting agriculture, the natural environment and urban areas through its impact on flood risk, water quality, loss of nutrient-rich upper soil layers, eutrophication of water bodies, sedimentation of waterways and sediment-related damage to roads, buildings and infrastructure such as water, gas and electricity supply networks. This study focuses on risks to infrastructure associated with erosion and the interventions needed to reduce those risks. Deciding on what interventions to make means understanding better which parts of the landscape are most susceptible to erosion and which measures are most effective in reducing it. Effective ways of communicating mitigation strategies to stakeholders such as farmers, land managers and policy-makers are then essential if interventions are to be implemented. Drawing on the Decision-Support Matrix (DSM) approach which combines a set of hydrological principles with Participatory Action Research (PAR), a decision-support tool for Communicating and Visualizing Erosion-Associated Risks to Infrastructure (CAVERTI) was developed. The participatory component was developed with the Wear Rivers Trust, focusing on a case-study area in the North East of England. The CAVERTI tool brings together process understanding gained from modelling with knowledge and experience of a variety of stakeholders to address directly the problem of sediment transport. Development of the tool was a collaborative venture, ensuring that the problems and solutions presented are easily recognised by practitioners and decision-makers. This recognition, and ease of access via a web-based interface, in turn help to ensure that the tools get used. The web-based tool developed helps to assess, manage and improve understanding of risk from a multi-stakeholder perspective and proposes solutions to problems. 
We argue that visualization and communication tools co-developed by researchers and stakeholders are the best means of ensuring that mitigation measures are undertaken across the landscape to reduce soil erosion. The CAVERTI tool has proven to be an effective means of encouraging farmers and land owners to act to reduce erosion, providing multiple benefits from protecting local infrastructure to reducing pollution of waterways.
[Imaging of breast tissue changes--early detection, screening and problem solving].
Wruk, Daniela
2008-04-01
In the industrialised countries, breast cancer is the cancer with the highest prevalence and causes the highest rate of cancer deaths among women. In Switzerland alone, about 5000 newly diagnosed cases occur per year. Our three main diagnostic tools for imaging diseases of the breast in the setting of screening, early detection or problem solving are mammography, ultrasound and MRI with intravenous contrast application. The most important imaging technique is mammography, which is so far the only method with evidence of suitability for screening. The major complementary imaging tool is sonography, which is the method of first choice for examining the breasts of women under 30 years of age. MRI can provide additional information about the perfusion of tissue changes within the breast; because of its low specificity, however, it should be applied cautiously and only for specific questions.
Ethics and radiation protection.
Hansson, Sven Ove
2007-06-01
Some of the major problems in radiation protection are closely connected to issues that have a long, independent tradition in moral philosophy. This contribution focuses on two of these issues. One is the relationship between the protection of individuals and optimisation on the collective level, and the other is the relative valuation of future versus immediate damage. Some of the intellectual tools that have been developed by philosophers can be useful in radiation protection. On the other hand, philosophers have much to learn from radiation protectors, not least when it comes to finding pragmatic solutions to problems that may be intractable in principle.
Patrick H. Brose
2014-01-01
In the past 40 years, the perception of periodic fire in upland oak (Quercus spp.) forests in the eastern United States has changed dramatically. Once thought of as a wholly destructive force, periodic fire is now considered an important disturbance whose absence is a major contributing factor to oak regeneration problems. This change in attitude and...
New Tools in Orthology Analysis: A Brief Review of Promising Perspectives
Nichio, Bruno T. L.; Marchaukoski, Jeroniza Nunes; Raittz, Roberto Tadeu
2017-01-01
Nowadays defining homology relationships among sequences is essential for biological research. Within homology, the analysis of orthologous sequences is of great importance for computational biology, genome annotation and phylogenetic inference. Since 2007, with the increase in the number of new sequences being deposited in large biological databases, researchers have begun to analyse computerized methodologies and tools aimed at selecting the most promising ones for the prediction of orthologous groups. The literature in this field describes the problems that the majority of available tools exhibit, such as limited accuracy, the time required for analysis (especially in light of the increasing volume of data being submitted, which requires faster techniques) and the lack of automation of the process without manual intervention. Conducting our search through BMC, Google Scholar, NCBI PubMed, and Expasy, we examined more than 600 articles pursuing the most recent techniques and tools developed to solve most of the problems still existing in orthology detection. We listed the main computational tools created and developed between 2011 and 2017, taking into consideration the differences in the type of orthology analysis, outlining the main features of each tool and pointing to the problems that each one tries to address. We also observed that several tools still use as their main algorithm the BLAST "all-against-all" methodology, which entails some limitations, such as a limited number of queries, computational cost, and high processing time to complete the analysis.
However, new promising tools are being developed, like OrthoVenn (which uses a Venn diagram to show the relationships of the ortholog groups generated by its algorithm); proteinOrtho (which improves the accuracy of ortholog groups); ReMark (tackling integration of the pipeline to make the entry process automatic); OrthAgogue (using algorithms developed to minimize processing time); and proteinOrtho (developed for dealing with large amounts of biological data). We compared the main features of four of these tools and tested them on four prokaryotic genomes. We hope that our review can be useful for researchers and will help them in selecting the most appropriate tool for their work in the field of orthology. PMID:29163633
Rational Solutions for Challenges of the New Millennium
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gover, J.; Guray, P.G.
We have reviewed ten major public problems challenging our Nation as it enters the new millennium. These are defense, healthcare costs, education, the aging population, energy and environment, crime, low productivity growth in services, income distribution, regulations, and infrastructure. These problems share several features. First, each is so large that, if it were solved, it would have a major impact on the U.S. economy. Second, each is resident in a socioeconomic system containing non-linear feedback loops and an adaptive human element. Third, each can only be solved by our political system, yet these problems are not responsive to piecemeal problem solving, the approach traditionally used by policy makers. However, unless each problem is addressed in the context of the system in which it resides, the solution may be worse than the problem. Our political system is immersed in reams of disconnected, unintelligible information skewed by various special interests to suggest policies favoring their particular needs. Help is needed if rational solutions that serve public interests are to be forged for these ten problems. The simulation and modeling tools of physical scientists, engineers, economists, social scientists, public policy experts, and others, bolstered by the recent explosive growth in massively parallel computing power, must be blended together to synthesize models of the complex systems in which these problems are resident. These models must simulate the seemingly chaotic human element inherent in these systems and support policymakers in making informed decisions about the future. We propose altering the policy development process by incorporating more modeling, simulation and analysis to bring about a revolution in policy making that takes advantage of the revolution in engineering emerging from simulation and modeling.
While we recommend major research efforts to address each of these problems, we also observe these to be very complex, highly interdependent, multi-disciplinary problems; it will challenge the U.S. community of individual-investigator researchers to make the cultural transformation necessary to address these problems in a team environment. Furthermore, models that simulate the future behavior of these complex systems will not be exact; therefore, researchers must be prepared to use the modeling and simulation tools they develop to propose experiments to Congress. We recommend that ten laboratories owned by the American public be selected in an interagency competition to each manage and host a $1 billion/year National effort, each focused on one of these ten problems. Much of the supporting research and subsystem modeling work will be conducted at U.S. universities and at private firms with relevant expertise. The success of the Manhattan Project in the middle of the 20th century provides evidence that this leadership model works.
Application of Assessment Tools to Examine Mental Health in Workplaces: Job Stress and Depression.
Jeon, Sang Won; Kim, Yong-Ku
2018-06-01
Although the lifetime and yearly prevalence rates of mental illness continue to rise, such diseases have only been acknowledged as a workplace health issue since the 2000s. Additionally, while the number of recognized cases of mental illness is rather low compared to its prevalence, these cases have a high likelihood of causing significant problems, including fatalities. Many workers are terrified of losing their jobs due to mental illness and therefore attempt to hide their mental health problems. For this reason, clinicians involved in occupational and environmental medicine should focus on interviews or screenings to identify such hidden mental health problems. More specifically, it would be helpful to evaluate job stress and depression in workplaces to ensure appropriate preventive actions and thereby reduce the prevalence of mental illness. Job stress not only causes mental illness and dissatisfaction with work, but can also increase the prevalence and morbidity of medical diseases and other physical health problems. Depression is a major contributor to work loss and absence, with effects surpassing those of almost all chronic medical disorders. These facts show why measures of job stress and depression should be highlighted in occupational settings. This article introduces a variety of assessment tools for examining mental health, particularly stress and depression, in workplaces. These tools can be used by clinicians or professionals involved in mental health, occupational safety, or health services for running diagnostics or screening tests.
NASA Astrophysics Data System (ADS)
Liu, Shuang; Liu, Fei; Hu, Shaohua; Yin, Zhenbiao
The major power information of the main transmission system in machine tools (MTSMT) during the machining process includes the effective output power (i.e. cutting power), the input power, the power loss of the mechanical transmission system, and the main motor power loss. This information is easy to obtain in the lab but difficult to evaluate during a manufacturing process. To solve this problem, a separation method is proposed here to extract the MTSMT power information during the machining process. In this method, the energy flow and the mathematical models of the major power information of the MTSMT during the machining process are first set up. Based on the mathematical models and basic data tables obtained from experiments, the above-mentioned power information can be separated during machining just by measuring the real-time total input power of the spindle motor. The operation program of this method is also given.
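The separation idea can be sketched with a deliberately simplified power balance (the model form and all constants below are assumptions for illustration, not the paper's fitted data): treat the measured spindle input power as idle power plus cutting power inflated by a load-loss fraction, so that one real-time measurement suffices to recover the cutting power.

```python
# Illustrative sketch of the power-separation idea. The linear loss
# model and the constants are assumptions, not the paper's data.

def cutting_power(p_input, p_idle, loss_coeff):
    """Separate effective cutting power from total input power.

    p_input    : measured total input power of the spindle motor (W)
    p_idle     : calibrated no-load (idle) power at this speed (W)
    loss_coeff : assumed additional load-loss fraction dissipated in
                 the motor and transmission per watt of cutting power
    """
    # Assumed balance: p_input = p_idle + (1 + loss_coeff) * p_cut,
    # so solve for p_cut from the single measured quantity p_input.
    return (p_input - p_idle) / (1.0 + loss_coeff)

# Example: 3.2 kW measured, 0.8 kW idle, 25% load loss.
print(cutting_power(3200.0, 800.0, 0.25))  # → 1920.0
```

In the paper's method the loss terms come from experimentally calibrated basic data tables rather than a single coefficient, but the structure is the same: everything except cutting power is modeled in advance, and the one unknown is solved from the measured total.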
Hilbink, Mirrian A H W; Ouwens, Marielle M T J; Burgers, Jako S; Kool, Rudolf B
2014-03-19
In the last decade, guideline organizations faced a number of problems, including a lack of standardization in guideline development methods and suboptimal guideline implementation. To contribute to the solution of these problems, we produced a toolbox for guideline development, implementation, revision, and evaluation. All relevant guideline organizations in the Netherlands were approached to prioritize the topics. We sent out a questionnaire and discussed the results at an invitational conference. Based on consensus, twelve topics were selected for the development of new tools. Subsequently, working groups were composed for the development of the tools. After development of the tools, their draft versions were pilot tested in 40 guideline projects. Based on the results of the pilot tests, the tools were refined and their final versions were presented. The vast majority of organizations involved in pilot testing of the tools reported satisfaction with using the tools. Guideline experts involved in pilot testing of the tools proposed a variety of suggestions for the implementation of the tools. The tools are available in Dutch and in English at a web-based platform on guideline development and implementation (http://www.ha-ring.nl). A collaborative approach was used for the development and evaluation of a toolbox for development, implementation, revision, and evaluation of guidelines. This approach yielded a potentially powerful toolbox for improving the quality and implementation of Dutch clinical guidelines. Collaboration between guideline organizations within this project led to stronger linkages, which is useful for enhancing coordination of guideline development and implementation and preventing duplication of efforts. Use of the toolbox could improve quality standards in the Netherlands, and might facilitate the development of high-quality guidelines in other countries as well.
Utilizing lean tools to improve value and reduce outpatient wait times in an Indian hospital.
Miller, Richard; Chalapati, Nirisha
2015-01-01
This paper aims to demonstrate how lean tools were applied to some unique issues of providing healthcare in a developing country where many patients face challenges not found in developed countries. The challenges provide insight into how lean tools can be utilized to provide similar results across the world. This paper is based on a qualitative case study carried out by a master's student implementing lean at a hospital in India. This paper finds that lean tools such as value-stream mapping and root cause analysis can lead to dramatic reductions in waste and improvements in productivity. The problems of the majority of patients paying for their own healthcare and lacking transportation created scheduling problems that required patients to receive their diagnosis and pay for treatment within a single day. Many additional wastes were identified that were significantly impacting the hospital's ability to provide care. As a result of this project, average outpatient wait times were reduced from 1 hour to 15 minutes, along with a significant increase in labor productivity. The results demonstrate how lean tools can increase value to the patients. The paper also provides a framework that healthcare providers in developed and developing countries can use to analyze their value streams and reduce waste. This paper is one of the first to address the unique issues of implementing lean in a healthcare setting in a developing country.
NMESys: An expert system for network fault detection
NASA Technical Reports Server (NTRS)
Nelson, Peter C.; Warpinski, Janet
1991-01-01
The problem of network management is becoming an increasingly difficult and challenging task. It is very common today to find heterogeneous networks consisting of many different types of computers, operating systems, and protocols. The complexity of implementing a network with this many components is difficult enough, while the maintenance of such a network is an even larger problem. A prototype network management expert system, NMESys, was implemented in the C Language Integrated Production System (CLIPS). NMESys concentrates on solving some of the critical problems encountered in managing a large network. The major goal of NMESys is to provide a network operator with an expert system tool to quickly and accurately detect hard failures and potential failures, and to minimize or eliminate user downtime in a large network.
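The abstract describes NMESys as a rule-based system in CLIPS that classifies network events into hard failures and potential failures. A minimal sketch of that style of rule firing, re-expressed in Python rather than CLIPS, is shown below; the fact fields, thresholds, and rule names are hypothetical illustrations, not taken from the actual system.

```python
# Hypothetical sketch of forward-chaining-style classification rules,
# in the spirit of a CLIPS rule base. Facts, thresholds, and labels
# are illustrative only.

facts = [
    {"node": "gw-1", "event": "no-response", "count": 5},
    {"node": "fs-2", "event": "high-latency", "count": 2},
]

def classify(fact):
    """Mimic rule firing: repeated non-response => hard failure,
    degraded metrics => potential failure."""
    if fact["event"] == "no-response" and fact["count"] >= 3:
        return "hard-failure"
    if fact["event"] in ("high-latency", "packet-loss"):
        return "potential-failure"
    return "ok"

alerts = {f["node"]: classify(f) for f in facts}
print(alerts)  # {'gw-1': 'hard-failure', 'fs-2': 'potential-failure'}
```

In a real CLIPS knowledge base each branch above would be a separate `defrule` matched against asserted facts; the Python function simply collapses that match-and-fire cycle into explicit conditionals.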
ERIC Educational Resources Information Center
Alexander, Leigh; Simmons, John
A number of studies are reviewed in an attempt to identify those schooling inputs that affect schooling outcomes, specifically cognitive achievement of students, in developing countries. Part 1 of the paper outlines the nature of the major tool of analysis, the educational production function (EPF), and the problems associated with its use as a…
African Orphan Crops under Abiotic Stresses: Challenges and Opportunities.
Tadele, Zerihun
2018-01-01
A changing climate, a growing world population, and a reduction in arable land devoted to food production are all problems facing world food security. The development of crops that can yield under uncertain and extreme climatic and soil growing conditions can play a key role in mitigating these problems. Major crops such as maize, rice, and wheat are responsible for a large proportion of global food production, but many understudied crops (commonly known as "orphan crops") including millets, cassava, and cowpea feed millions of people in Asia, Africa, and South America and are already adapted to the local environments in which they are grown. The application of modern genetic and genomic tools to the breeding of these crops can provide enormous opportunities for ensuring world food security but is only in its infancy. In this review, the diversity and types of understudied crops will be introduced, and the beneficial traits of these crops as well as their role in the socioeconomics of Africa will be discussed. In addition, the response of orphan crops to diverse types of abiotic stresses is investigated. A review of the current tools and their application to the breeding of enhanced orphan crops will also be described. Finally, a few examples of global efforts on tackling major abiotic constraints in Africa are presented.
A Flexible Statechart-to-Model-Checker Translator
NASA Technical Reports Server (NTRS)
Rouquette, Nicolas; Dunphy, Julia; Feather, Martin S.
2000-01-01
Many current-day software design tools offer some variant of statechart notation for system specification. We, like others, have built an automatic translator from (a subset of) statecharts to a model checker, for use in validating behavioral requirements. Our translator is designed to be flexible. This allows us to quickly adjust the translator to variants of statechart semantics, including problem-specific notational conventions that designers employ. Our system demonstration will be of interest to the following two communities: (1) Potential end-users: Our demonstration will show translation from statecharts created in a commercial UML tool (Rational Rose) to Promela, the input language of Holzmann's model checker SPIN. The translation is accomplished automatically. To accommodate the major variants of statechart semantics, our tool offers user-selectable choices among semantic alternatives. Options for customized semantic variants are also made available. The net result is an easy-to-use tool that operates on a wide range of statechart diagrams to automate the pathway to model-checking input. (2) Other researchers: Our translator embodies, in one tool, ideas and approaches drawn from several sources. Solutions to the major challenges of statechart-to-model-checker translation (e.g., determining which transition(s) will fire, handling of concurrent activities) are realized in a uniform, fully mechanized setting. The way in which the underlying architecture of the translator itself facilitates flexible and customizable translation will also be evident.
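One of the translation challenges the abstract names is determining which transition(s) fire for a given event. A tiny sketch of one common semantic choice (fire the first enabled transition, in priority order) is given below; the states, events, and guards are hypothetical and not from the JPL translator itself.

```python
# Illustrative statechart step semantics: given the current state and an
# event, fire the first transition whose source, trigger, and guard all
# match. States, events, and guards are invented for illustration.

transitions = [
    # (source, event, guard, target)
    ("Idle",    "start", lambda ctx: True,          "Running"),
    ("Running", "tick",  lambda ctx: ctx["n"] < 3,  "Running"),
    ("Running", "tick",  lambda ctx: ctx["n"] >= 3, "Done"),
]

def step(state, event, ctx):
    """One 'priority order' semantic variant: first enabled transition wins."""
    for src, ev, guard, dst in transitions:
        if src == state and ev == event and guard(ctx):
            return dst
    return state  # no enabled transition: the event is discarded

state = step("Idle", "start", {"n": 0})  # -> "Running"
```

A translator like the one described would encode a choice such as this (or an alternative, e.g. nondeterministic choice among all enabled transitions) directly into the generated Promela, which is why user-selectable semantic options matter.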
NASA Astrophysics Data System (ADS)
Azila Che Musa, Nor; Mahmud, Zamalia; Baharun, Norhayati
2017-09-01
One of the important skills required from any student who is learning statistics is knowing how to solve statistical problems correctly using appropriate statistical methods. This will enable them to arrive at a conclusion and make a significant contribution and decision for society. In this study, a group of 22 students majoring in statistics at UiTM Shah Alam were given problems relating to topics on testing of hypotheses which required them to solve the problems using the confidence interval, traditional and p-value approaches. Hypothesis testing is one of the techniques used in solving real problems and it is listed as one of the difficult concepts for students to grasp. The objectives of this study are to explore students' perceived and actual ability in solving statistical problems and to determine which items in statistical problem solving students find difficult to grasp. Students' perceived and actual ability were measured based on the instruments developed from the respective topics. Rasch measurement tools such as the Wright map and item measures for fit statistics were used to accomplish the objectives. Data were collected and analysed using the Winsteps 3.90 software, which is developed based on the Rasch measurement model. The results showed that students perceived themselves as moderately competent in solving the statistical problems using the confidence interval and p-value approaches even though their actual performance showed otherwise. Item measures for fit statistics also showed that the maximum estimated measures were found on two problems. These measures indicate that none of the students attempted these problems correctly, for reasons which include their lack of understanding of confidence intervals and probability values.
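The three approaches the students were asked to use (traditional critical value, p-value, and confidence interval) are equivalent decision rules for the same test, which is the key conceptual point. A minimal sketch for a two-sided one-sample z-test with known sigma, using only the standard library, is shown below; the numbers in the usage line are made up for illustration.

```python
import math

def z_test(xbar, mu0, sigma, n, alpha=0.05):
    """Two-sided one-sample z-test (known sigma): show that the traditional,
    p-value, and confidence-interval decision rules agree."""
    se = sigma / math.sqrt(n)
    z = (xbar - mu0) / se
    # p-value approach: normal CDF via erf
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    z_crit = 1.959963984540054        # Phi^{-1}(0.975) for alpha = 0.05
    half = z_crit * se
    ci = (xbar - half, xbar + half)
    return {
        "traditional": abs(z) > z_crit,              # |z| beyond critical value
        "p_value":     p < alpha,                    # p below alpha
        "ci":          not (ci[0] <= mu0 <= ci[1]),  # mu0 outside the CI
    }

# Illustrative data: sample mean 105 vs. hypothesized mean 100
decisions = z_test(xbar=105, mu0=100, sigma=10, n=25)
```

For the illustrative data, z = 2.5, so all three rules reject; whichever rule a student applies, the conclusion is the same, which is exactly what the "traditional vs. p-value vs. confidence interval" framing is meant to teach.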
Perera, E. A. Ramani; Kathriarachchi, Samudra T.
2011-01-01
Background: Suicidal behaviour among youth is a major public health concern in Sri Lanka. Prevention of youth suicides using effective, feasible and culturally acceptable methods is invaluable in this regard; however, research in this area is grossly lacking. Objective: This study aimed at determining the effectiveness of problem solving counselling as a therapeutic intervention in the prevention of youth suicidal behaviour in Sri Lanka. Setting and design: This controlled trial was based on hospital admissions with suicidal attempts in a sub-urban hospital in Sri Lanka. The study was carried out at Base Hospital Homagama. Materials and Methods: A sample of 124 was recruited using a convenience sampling method and divided into two groups, experimental and control. The control group was offered routine care and the experimental group received four sessions of problem solving counselling over one month. The outcome of both groups was measured, six months after the initial screening, using the visual analogue scale. Results: Individualized outcome measures on problem solving counselling showed that problem solving ability among the subjects in the experimental group had improved after four counselling sessions and suicidal behaviour had been reduced. The results are statistically significant. Conclusion: This study confirms that problem solving counselling is an effective therapeutic tool in the management of youth suicidal behaviour in a hospital setting in a developing country. PMID:21431005
Fundamentals of Physics, 6th Edition Enhanced Problems Version
NASA Astrophysics Data System (ADS)
Halliday, David; Resnick, Robert; Walker, Jearl
2002-04-01
No other text on the market today can match the success of Halliday, Resnick and Walker's Fundamentals of Physics. This text continues to outperform the competition year after year, and the new edition will be no exception. Intended for Calculus-based Physics courses, the 6th edition of this extraordinary text is a major redesign of the best-selling 5th edition, which still maintains many of the elements that led to its enormous success. Jearl Walker adds his unique style to this edition with the addition of new problems designed to capture, and keep, students' attention. Nearly all changes are based on suggestions from instructors and students using the 5th edition, from reviewer comments, and from research done on the process of learning. The primary goal of this text is to provide students with a solid understanding of fundamental physics concepts, and to help them apply this conceptual understanding to quantitative problem solving. The principal goal of Halliday-Resnick-Walker is to provide instructors with a tool by which they can teach students how to effectively read scientific material and successfully reason through scientific questions. To sharpen this tool, the Enhanced Problems Version of the sixth edition of Fundamentals of Physics contains over 1000 new, high-quality problems that require thought and reasoning rather than simplistic plugging of data into formulas.
Artificial intelligence within the chemical laboratory.
Winkel, P
1994-01-01
Various techniques within the area of artificial intelligence such as expert systems and neural networks may play a role during the problem-solving processes within the clinical biochemical laboratory. Neural network analysis provides a non-algorithmic approach to information processing, which results in the ability of the computer to form associations and to recognize patterns or classes among data. It belongs to the machine learning techniques which also include probabilistic techniques such as discriminant function analysis and logistic regression and information theoretical techniques. These techniques may be used to extract knowledge from example patients to optimize decision limits and identify clinically important laboratory quantities. An expert system may be defined as a computer program that can give advice in a well-defined area of expertise and is able to explain its reasoning. Declarative knowledge consists of statements about logical or empirical relationships between things. Expert systems typically separate declarative knowledge residing in a knowledge base from the inference engine: an algorithm that dynamically directs and controls the system when it searches its knowledge base. A tool is an expert system without a knowledge base. The developer of an expert system uses a tool by entering knowledge into the system. Many, if not the majority of problems encountered at the laboratory level are procedural. A problem is procedural if it is possible to write up a step-by-step description of the expert's work or if it can be represented by a decision tree. To solve problems of this type only small expert system tools and/or conventional programming are required.(ABSTRACT TRUNCATED AT 250 WORDS)
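The abstract's closing point is that a procedural laboratory problem is one that can be written as a step-by-step description or a decision tree, and that such problems need only small tools or conventional programming. A minimal sketch of a laboratory decision tree in plain code is shown below; the analyte, thresholds, and action labels are hypothetical, for illustration only, not clinical guidance.

```python
# A procedural lab problem expressed directly as a decision tree.
# Analyte, cut-offs, and actions are invented for illustration.

def flag_potassium(k_mmol_l, hemolyzed):
    """Walk a small decision tree for a potassium result."""
    if hemolyzed:
        return "reject-sample"        # hemolysis falsely elevates potassium
    if k_mmol_l > 6.0:
        return "phone-critical-value" # escalate immediately
    if k_mmol_l > 5.1:
        return "flag-high"            # above the (illustrative) reference range
    return "normal"
```

Because every branch is explicit, this kind of problem needs no inference engine: the decision tree is the program. The expert-system machinery the abstract describes (knowledge base plus inference engine) earns its keep only when the expert's reasoning cannot be flattened into such a tree.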
The History, Status, Gaps, and Future Directions of Neurotoxicology in China
Cai, Tongjian; Luo, Wenjing; Ruan, Diyun; Wu, Yi-Jun; Fox, Donald A.; Chen, Jingyuan
2016-01-01
Background: Rapid economic development in China has produced serious ecological, environmental, and health problems. Neurotoxicity has been recognized as a major public health problem. The Chinese government, research institutes, and scientists conducted extensive studies concerning the source, characteristics, and mechanisms of neurotoxicants. Objectives: This paper presents, for the first time, a comprehensive history and review of major sources of neurotoxicants, national bodies/legislation engaged, and major neurotoxicology research in China. Methods: Peer-reviewed research and pollution studies by Chinese scientists from 1991 to 2015 were examined. PubMed, Web of Science and Chinese National Knowledge Infrastructure (CNKI) were the major search tools. Results: The central problem is an increased exposure to neurotoxicants from air and water, food contamination, e-waste recycling, and manufacturing of household products. China formulated an institutional framework and standards system for management of major neurotoxicants. Basic and applied research was initiated, and international cooperation was achieved. The annual number of peer-reviewed neurotoxicology papers from Chinese authors increased almost 30-fold since 2001. Conclusions: Despite extensive efforts, neurotoxicity remains a significant public health problem. This provides great challenges and opportunities. We identified 10 significant areas that require major educational, environmental, governmental, and research efforts, as well as attention to public awareness. For example, there is a need to increase efforts to utilize new in vivo and in vitro models, determine the potential neurotoxicity and mechanisms involved in newly emerging pollutants, and examine the effects and mechanisms of mixtures. In the future, we anticipate working with scientists worldwide to accomplish these goals and eliminate, prevent and treat neurotoxicity. Citation: Cai T, Luo W, Ruan D, Wu YJ, Fox DA, Chen J. 2016. 
The history, status, gaps, and future directions of neurotoxicology in China. Environ Health Perspect 124:722–732; http://dx.doi.org/10.1289/ehp.1409566 PMID:26824332
Assessment of alcohol problems using AUDIT in a prison setting: more than an 'aye or no' question.
MacAskill, Susan; Parkes, Tessa; Brooks, Oona; Graham, Lesley; McAuley, Andrew; Brown, Abraham
2011-11-14
Alcohol problems are a major UK and international public health issue. The prevalence of alcohol problems is markedly higher among prisoners than the general population. However, studies suggest alcohol problems among prisoners are under-detected, under-recorded and under-treated. Identifying offenders with alcohol problems is fundamental to providing high quality healthcare. This paper reports use of the AUDIT screening tool to assess alcohol problems among prisoners. Universal screening was undertaken over ten weeks with all entrants to one male Scottish prison using the AUDIT standardised screening tool and supplementary contextual questions. The questionnaire was administered by trained prison officers during routine admission procedures. Overall 259 anonymised completed questionnaires were analysed. AUDIT scores showed a high prevalence of alcohol problems with 73% of prisoner scores indicating an alcohol use disorder (8+), including 36% having scores indicating 'possible dependence' (20-40). AUDIT scores indicating 'possible dependence' were most apparent among 18-24 and 40-64 year-olds (40% and 56% respectively). However, individual questions showed important differences, with younger drinkers less likely to demonstrate habitual and addictive behaviours than the older age group. Disparity between high levels of harmful/hazardous/dependent drinking and low levels of 'treatment' emerged (only 27% of prisoners with scores indicating 'possible dependence' reported being 'in treatment'). Self-reported associations between drinking alcohol and the index crime were identified among two-fifths of respondents, rising to half of those reporting violent crimes. To our knowledge, this is the first study to identify differing behaviours and needs among prisoners with high AUDIT score ranges, through additional analysis of individual questions. 
The study has identified high prevalence of alcohol use, varied problem behaviours, and links across drinking, crime and recidivism, supporting the argument for more extensive provision of alcohol-focused interventions in prisons. These should be carefully targeted based on initial screening and assessment, responsive, and include care pathways linking prisoners to community services. Finally, findings confirm the value and feasibility of routine use of the AUDIT screening tool in prison settings, to considerably enhance practice in the detection and understanding of alcohol problems, improving on current more limited questioning (e.g. 'yes or no' questions).
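The score bands the study uses (8+ indicating an alcohol use disorder, 20-40 indicating possible dependence, out of a 0-40 total) can be sketched as a small scoring helper; the band labels follow the abstract, and the function itself is an illustrative sketch rather than a clinical instrument.

```python
def audit_category(score):
    """Map a total AUDIT score (10 items, 0-4 each, total 0-40) onto the
    bands used in the study: 8+ = alcohol use disorder, 20-40 = possible
    dependence. Illustrative helper only."""
    if not 0 <= score <= 40:
        raise ValueError("AUDIT totals range from 0 to 40")
    if score >= 20:
        return "possible dependence"
    if score >= 8:
        return "alcohol use disorder"
    return "low risk"
```

As the abstract stresses, the total score alone hides important differences (e.g. younger vs. older drinkers reaching the same band through different item profiles), which is why the study's item-level analysis adds value beyond any categorization like the one above.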
Dysphagia after Stroke: an Overview
González-Fernández, Marlís; Ottenstein, Lauren; Atanelov, Levan; Christian, Asare B.
2013-01-01
Dysphagia affects the vast majority of acute stroke patients. Although it improves within 2 weeks for most, some face longstanding swallowing problems that place them at risk for pneumonia, malnutrition, and dehydration, and that significantly affect quality of life. This paper discusses the scope, the disease burden, and the tools available for screening and formal evaluation of dysphagia. The most common and recently developed treatment interventions that might be useful in the treatment of this population are discussed. PMID:24977109
Prescribing the mixed scalar curvature of a foliated Riemann-Cartan manifold
NASA Astrophysics Data System (ADS)
Rovenski, Vladimir Y.; Zelenko, Leonid
2018-03-01
The mixed scalar curvature is the simplest curvature invariant of a foliated Riemannian manifold. We explore the problem of prescribing the leafwise constant mixed scalar curvature of a foliated Riemann-Cartan manifold by conformal change of the structure in the directions tangent and normal to the leaves. Under certain geometrical assumptions and in two special cases, along a compact leaf and for a closed fibered manifold, we reduce the problem to the solution of a nonlinear leafwise elliptic equation for the conformal factor. We look for solutions that are stable stationary solutions of the associated parabolic equation. Our main tool is the use of majorizing and minorizing nonlinear heat equations with constant coefficients and the application of comparison theorems for solutions of the Cauchy problem for parabolic equations.
NASA Technical Reports Server (NTRS)
Kwak, Dochan
2005-01-01
Over the past 30 years, numerical methods and simulation tools for fluid dynamic problems have advanced as a new discipline, namely, computational fluid dynamics (CFD). Although a wide spectrum of flow regimes is encountered in many areas of science and engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to a large demand for predicting the aerodynamic performance characteristics of flight vehicles, such as commercial, military, and space vehicles. As flow analysis is required to be more accurate and computationally efficient for both commercial and mission-oriented applications (such as those encountered in meteorology, aerospace vehicle development, general fluid engineering and biofluid analysis), CFD tools for engineering become increasingly important for predicting safety, performance and cost. This paper presents the author's perspective on the maturity of CFD, especially from an aerospace engineering point of view.
The Effect of Serious Video Game Play on Science Inquiry Scores
NASA Astrophysics Data System (ADS)
Hilosky, Alexandra Borzillo
American students are not developing the science inquiry skills needed to solve complex 21st century problems, thus impacting the workforce. In 2009, American high school students ranked 21 out of 26 in the category of problem-solving according to the Program for International Student Assessment. Serious video games have powerful epistemic value and are beneficial with respect to enhancing inquiry and effective problem-solving. The purpose of this correlational, quantitative study was to test Gee's assumption regarding the cycle of thinking (routinization, automatization, and deroutinization) by determining whether gamer status was a significant predictor of science inquiry scores, controlling for age, gender, and major. The 156 non-random volunteers who participated in this study were enrolled in a 2-year college in the northeastern U.S. Multiple regression analyses revealed that major was the strongest overall (significant) predictor, b = -.84, t(149) = -3.70, p < .001, even though gamer status served as a significant predictor variable for Stage 1 only, b = -.48, t(149) = -2.37, p = .019. Participants who reported playing serious video games scored .48 points higher than non-players of serious video games regardless of age, gender, and major, which supports previous studies that have found significant differences in scientific inquiry abilities related to forming hypotheses and identifying problems based on serious video game play. Recommendations include using serious games as instructional tools and to assess student learning (formative and summative), especially among non-traditional learners.
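The design behind the reported coefficients, a predictor of interest (gamer status) estimated while controlling for age, gender, and major, can be sketched as an ordinary least-squares fit. The data below are synthetic and the effect sizes invented; this only illustrates the modeling setup, not the study's actual data or results.

```python
import numpy as np

# Synthetic illustration of the regression design: predict an inquiry
# score from gamer status, controlling for age, gender, and major.
rng = np.random.default_rng(0)
n = 156                                # same sample size as the study
gamer  = rng.integers(0, 2, n)         # 1 = plays serious video games
age    = rng.normal(22, 4, n)
gender = rng.integers(0, 2, n)
major  = rng.integers(0, 2, n)         # illustrative binary coding
score  = 0.5 * gamer - 0.8 * major + rng.normal(0, 1, n)  # invented effects

# Design matrix with an intercept column; OLS via least squares
X = np.column_stack([np.ones(n), gamer, age, gender, major])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
# beta[1] estimates the gamer-status effect with the controls partialled out
```

The "controlling for" language in the abstract corresponds exactly to including the covariate columns in `X`: the coefficient on gamer status is then its association with the score holding age, gender, and major fixed.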
The QuakeSim Project: Numerical Simulations for Active Tectonic Processes
NASA Technical Reports Server (NTRS)
Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry
2004-01-01
In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.
Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A
2017-12-01
Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and the need for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.
Computational and mathematical methods in brain atlasing.
Nowinski, Wieslaw L
2017-12-01
Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.
What's it worth? A general manager's guide to valuation.
Luehrman, T A
1997-01-01
Behind every major resource-allocation decision a company makes lies some calculation of what that move is worth. So it is not surprising that valuation is the financial analytical skill general managers want to learn more than any other. Managers whose formal training is more than a few years old, however, are likely to have learned approaches that are becoming obsolete. What do generalists need in an updated valuation tool kit? In the 1970s, discounted-cash-flow analysis (DCF) emerged as best practice for valuing corporate assets. And one version of DCF, using the weighted-average cost of capital (WACC), became the standard. Over the years, WACC has been used by most companies as a one-size-fits-all valuation tool. Today the WACC standard is insufficient. Improvements in computers and new theoretical insights have given rise to tools that outperform WACC in the three basic types of valuation problems managers face. Timothy Luehrman presents an overview of the three tools, explaining how they work and when to use them. For valuing operations, the DCF methodology of adjusted present value allows managers to break a problem into pieces that make managerial sense. For valuing opportunities, option pricing captures the contingent nature of investments in areas such as R&D and marketing. And for valuing ownership claims, the tool of equity cash flows helps managers value their company's stake in a joint venture, a strategic alliance, or an investment that uses project financing.
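The WACC-based DCF that the article calls the long-standing standard reduces to two textbook formulas: a blended discount rate and a discounted sum of cash flows. A minimal sketch follows; the capital-structure and cash-flow numbers in the usage lines are invented for illustration.

```python
def wacc(e, d, re, rd, tax):
    """Weighted-average cost of capital:
    E/(E+D) * r_e + D/(E+D) * r_d * (1 - tax)."""
    v = e + d
    return e / v * re + d / v * rd * (1 - tax)

def npv(rate, cashflows):
    """Discount cash flows (indexed t = 0, 1, 2, ...) at a single rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Illustrative firm: 60/40 equity/debt, 12% cost of equity,
# 6% pre-tax cost of debt, 30% tax rate
r = wacc(e=600.0, d=400.0, re=0.12, rd=0.06, tax=0.30)   # 0.0888
value = npv(r, [-1000.0, 300.0, 400.0, 500.0])
```

The article's critique is visible even in this sketch: a single blended `rate` assumes one capital structure and one risk level for every cash flow, which is exactly the rigidity that adjusted present value (valuing the pieces separately) relaxes.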
Cancer systems biology: signal processing for cancer research
Yli-Harja, Olli; Ylipää, Antti; Nykter, Matti; Zhang, Wei
2011-01-01
In this editorial we introduce the research paradigms of signal processing in the era of systems biology. Signal processing is a field of science traditionally focused on modeling electronic and communications systems, but recently it has turned to biological applications with astounding results. The essence of signal processing is to describe the natural world by mathematical models and then, based on these models, develop efficient computational tools for solving engineering problems. Here, we underline, with examples, the endless possibilities which arise when the battle-hardened tools of engineering are applied to solve the problems that have tormented cancer researchers. Based on this approach, a new field has emerged, called cancer systems biology. Despite its short history, cancer systems biology has already produced several success stories tackling previously impracticable problems. Perhaps most importantly, it has been accepted as an integral part of the major endeavors of cancer research, such as analyzing the genomic and epigenomic data produced by The Cancer Genome Atlas (TCGA) project. Finally, we show that signal processing and cancer research, two fields that are seemingly distant from each other, have merged into a field that is indeed more than the sum of its parts. PMID:21439242
NASA Technical Reports Server (NTRS)
Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola
2005-01-01
Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.
Cooperative problem solving with personal mobile information tools in hospitals.
Buchauer, A; Werner, R; Haux, R
1998-01-01
Health-care professionals have a broad range of needs for information and cooperation while working at different points of care (e.g., outpatient departments, wards, and functional units such as operating theaters). Patient-related data and medical knowledge have to be widely available to support high-quality patient care. Furthermore, due to the increased specialization of health-care professionals, efficient collaboration is required. Personal mobile information tools have a considerable potential to realize almost ubiquitous information and collaborative support. They make it possible to unite the functionality of conventional tools such as paper forms, dictating machines, and pagers into one tool. Moreover, they can extend the support already provided by clinical workstations. An approach is described for the integration of mobile information tools with heterogeneous hospital information systems. This approach includes identification of functions which should be provided on mobile tools. Major functions are the presentation of medical records and reports, electronic mailing to support interpersonal communication, and the provision of editors for structured clinical documentation. To realize those functions on mobile tools, we propose a document-based client-server architecture that enables mobile information tools to interoperate with existing computer-based application systems. Open application systems and powerful, partially wireless, hospital-wide networks are the prerequisites for the introduction of mobile information tools.
An overview of recent applications of computational modelling in neonatology
Wrobel, Luiz C.; Ginalski, Maciej K.; Nowak, Andrzej J.; Ingham, Derek B.; Fic, Anna M.
2010-01-01
This paper reviews some of our recent applications of computational fluid dynamics (CFD) to model heat and mass transfer problems in neonatology and investigates the major heat and mass-transfer mechanisms taking place in medical devices, such as incubators, radiant warmers and oxygen hoods. It is shown that CFD simulations are very flexible tools that can take into account all modes of heat transfer in assisting neonatal care and improving the design of medical devices. PMID:20439275
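As a rough, back-of-the-envelope illustration of two of the heat-transfer modes the review models, convection and radiation from an infant to the incubator air and walls: the heat-transfer coefficient, emissivity, area, and temperatures below are invented, order-of-magnitude values, not figures from the paper.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def heat_loss_watts(skin_c, air_c, wall_c, area_m2, h=4.0, emissivity=0.95):
    """Convective loss via Newton's law of cooling plus radiative
    grey-body exchange with the incubator walls. h (W/(m^2 K)) and
    emissivity are assumed, order-of-magnitude values."""
    t_skin = skin_c + 273.15                      # convert to kelvin
    t_wall = wall_c + 273.15
    q_conv = h * area_m2 * (skin_c - air_c)       # Newton's law of cooling
    q_rad = emissivity * SIGMA * area_m2 * (t_skin**4 - t_wall**4)
    return q_conv, q_rad

# Hypothetical scenario: 36.5 C skin, 34 C air, 32 C walls, 0.2 m^2 area.
q_conv, q_rad = heat_loss_watts(skin_c=36.5, air_c=34.0, wall_c=32.0,
                                area_m2=0.2)
```

A CFD model of the kind reviewed resolves these fluxes locally over the body surface and adds evaporative and conductive terms; this lumped estimate only shows the balance being computed.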
NASA Astrophysics Data System (ADS)
Fucugauchi, J. U.
2013-05-01
In the coming decades, a changing climate and natural hazards will likely increase the vulnerability of agricultural and other food production infrastructures, posing increasing threats to industrialized and developing economies. While food security concerns affect us globally, the huge differences among countries in stocks, population size, poverty levels, economy, technological development, transportation, health care systems and basic infrastructure will place a much larger burden on populations in the developing and less developed world. In these economies, increases in the magnitude, duration and frequency of droughts, floods, hurricanes, rising sea levels, heat waves, thunderstorms, freezing events and other phenomena will impose severe costs on the population. For this presentation, we concentrate on a geophysical perspective of the problems, the tools available, the challenges, and short- and long-term perspectives. In many instances, a range of natural hazards are treated as unforeseen catastrophes that strike suddenly without warning, resulting in major losses. Although the forecasting capacity for the different situations arising from climate change and natural hazards is still limited, there is a range of tools available to assess scenarios and forecast models for developing and implementing better mitigation strategies and prevention programs. Earth observation systems, geophysical instrumental networks, satellite observatories, improved understanding of phenomena, expanded global and regional databases, geographic information systems, and higher capacity for computer modeling and numerical simulations provide a scientific-technical framework for developing such strategies.
Hazard prevention and mitigation programs will incur high costs globally; however, major costs and challenges concentrate on the less developed economies already affected by poverty, famines, health problems, social inequalities, poor infrastructure, low life expectancy, high population growth, inadequate education systems, immigration, economic crises, conflicts and other issues. Case history analyses and proposals for collaboration programs, know-how transfer and better use of geophysical tools, data, observatories and monitoring networks will be discussed.
Research on automated disassembly technology for waste LCD
NASA Astrophysics Data System (ADS)
Qin, Qin; Zhu, Dongdong; Wang, Jingwei; Dou, Jianfang; Wang, Sujuan; Tu, Zimei
2017-11-01
In the field of waste LCD disassembly and recycling, there are two major problems: 1) disassembling waste LCDs still depends mainly on manual mechanical crushing; 2) the level of resource recovery is low. To address these problems, in this paper we develop an efficient, safe, and automated assembly-line technology for disassembling waste LCDs. This technology can disassemble and sort mainstream LCDs into four components: liquid crystal display panels, housings, metal shields, and PCB assemblies. It can also disassemble many kinds of waste LCDs. Compared with the traditional method of manual labor assisted by electric tools, the proposed technology significantly improves disassembly efficiency and demonstrates good prospects and promotional value.
Twelve tips for getting started using mixed methods in medical education research.
Lavelle, Ellen; Vuk, Jasna; Barber, Carolyn
2013-04-01
Mixed methods research, which is gaining popularity in medical education, provides a new and comprehensive approach for addressing teaching, learning, and evaluation issues in the field. The aim of this article is to provide medical education researchers with 12 tips, based on consideration of current literature in the health professions and in educational research, for conducting and disseminating mixed methods research. Engaging in mixed methods research requires consideration of several major components: the mixed methods paradigm, types of problems, mixed methods designs, collaboration, and developing or extending theory. Mixed methods research is an ideal tool for addressing a full range of problems in medical education, including the development of theory and the improvement of practice.
Organizational/Memory Tools: A Technique for Improving Problem Solving Skills.
ERIC Educational Resources Information Center
Steinberg, Esther R.; And Others
1986-01-01
This study was conducted to determine whether students would use a computer-presented organizational/memory tool as an aid in problem solving, and whether and how locus of control would affect tool use and problem-solving performance. Learners did use the tools, which were most effective in the learner control with feedback condition. (MBR)
Orangutans (Pongo spp.) may prefer tools with rigid properties to flimsy tools.
Walkup, Kristina R; Shumaker, Robert W; Pruetz, Jill D
2010-11-01
Preference for tools with either rigid or flexible properties was explored in orangutans (Pongo spp.) through an extension of D. J. Povinelli, J. E. Reaux, and L. A. Theall's (2000) flimsy-tool problem. Three captive orangutans were presented with three unfamiliar pairs of tools to solve a novel problem. Although each orangutan has spontaneously used tools in the past, the tools presented in this study were novel to the apes. Each pair of tools contained one tool with rigid properties (functional) and one tool with flimsy properties (nonfunctional). Solving the problem required selection of a rigid tool to retrieve a food reward. The functional tool was selected in nearly all trials. Moreover, two of the orangutans demonstrated this within the first test trials with each of the three tool types. Although further research is required to test this statistically, it suggests either a preexisting preference for rigid tools or comprehension of the relevant features required in a tool to solve the task. The results of this study demonstrate that orangutans can recognize, or learn to recognize, relevant tool properties and can choose an appropriate tool to solve a problem. (PsycINFO Database Record (c) 2010 APA, all rights reserved).
Theory and applications survey of decentralized control methods
NASA Technical Reports Server (NTRS)
Athans, M.
1975-01-01
A nonmathematical overview is presented of trends in the general area of decentralized control strategies which are suitable for hierarchical systems. Advances in decentralized system theory are closely related to advances in the so-called stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools pertaining to the classical stochastic control problem are outlined. Particular attention is devoted to pitfalls in the mathematical problem formulation for decentralized control. Major conclusions are that any purely deterministic approach to multilevel hierarchical dynamic systems is unlikely to lead to realistic theories or designs, that the flow of measurements and decisions in a decentralized system should not be instantaneous and error-free, and that delays in information exchange in a decentralized system lead to reasonable approaches to decentralized control. A mathematically precise notion of aggregating information is not yet available.
NASA'S Water Resources Element Within the Applied Sciences Program
NASA Technical Reports Server (NTRS)
Toll, David; Doorn, Bradley; Engman, Edwin
2011-01-01
The NASA Earth Systems Division has the primary responsibility for the Applied Science Program and the objective to accelerate the use of NASA science results in applications to help solve problems important to society and the economy. The primary goal of the NASA Applied Science Program is to improve future and current operational systems by infusing them with scientific knowledge of the Earth system gained through space-based observation, assimilation of new observations, and development and deployment of enabling technologies, systems, and capabilities. This paper discusses major problems facing water resources managers, including the need for timely and accurate data to drive their decision support tools. It then describes how NASA's science and space-based satellites may be used to overcome this problem. Opportunities for the water resources community to participate in NASA's Water Resources Applications Program are described.
Cause-and-effect mapping of critical events.
Graves, Krisanne; Simmons, Debora; Galley, Mark D
2010-06-01
Health care errors are routinely reported in the scientific and public press and have become a major concern for most Americans. In learning to identify and analyze errors, health care can develop some of the skills of a learning organization, including the concept of systems thinking. Modern experts in improving quality have been working in other high-risk industries since the 1920s, making structured organizational changes through various frameworks for quality methods, including continuous quality improvement and total quality management. When using these tools, it is important to understand systems thinking and the concept of processes within organizations. Within these frameworks of improvement, several tools can be used in the analysis of errors. This article introduces a robust tool with a broad analytical view consistent with systems thinking, called CauseMapping (ThinkReliability, Houston, TX, USA), which can be used to systematically analyze the process and the problem at the same time. Copyright 2010 Elsevier Inc. All rights reserved.
[The breast of the adolescent girl].
Bruant-Rodier, C; Dissaux, C; Baratte, A; Francois Fiquet, C; Bodin, F
2016-10-01
During adolescence, psychological and physical changes occur and the breast takes a major place in the young woman's body image. Apart from rare malignant tumors, breast pathologies at this age are mainly benign or malformative. Malformative issues are revealed during breast growth, as isolated asymmetry or in association with other regional anomalies, with abnormal shape or volume of the breast, or even supernumerary breasts. The therapeutic solutions do not differ from those used for adults. Breast lipofilling, recently accepted by the plastic surgery community, is an interesting tool that can be used in young women. Choosing the right technique depends on the initial problem: surgery comes at an early stage to offset hypoplasia resulting in a problem of asymmetry, but waits for breast stability in cases of hypertrophy and for legal majority in cases of breast augmentation using implants. Psychological impact remains, however, a central issue and forces the surgeon to adapt to the individual and to her body's change over time. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Rokaya, Dinesh; Suttagul, Kanokwan; Joshi, Shraddha; Bhattarai, Bishwa Prakash; Shah, Pravin Kumar; Dixit, Shantanu
2018-02-01
Temporomandibular disorder (TMD) represents a subgroup of painful orofacial disorders involving pain in the temporomandibular joint (TMJ) region, fatigue of the cranio-cervico-facial muscles (especially the masticatory muscles), limitation of mandibular movement, and the presence of a clicking sound in the TMJ. TMD is associated with multiple factors and systemic diseases. This study aimed to assess the prevalence of TMD in Nepalese subjects for the first time. A total of 500 medical and dental students (127 men and 373 women) participated in this study from May 2016 to September 2016. The Fonseca questionnaire was used as a tool to evaluate the prevalence of TMD, and Fonseca's Anamnestic Index (FAI) was used to classify the severity of TMD. The majority of the participants with TMD had a history of head trauma, psychological stress, and dental treatment or dental problems. The prevalence of TMD in Nepalese subjects was mild to moderate. The majority of the study subjects had eyesight problems, a history of head trauma, psychological stress, and alcohol consumption, and had received dental treatment.
An application of machine learning to the organization of institutional software repositories
NASA Technical Reports Server (NTRS)
Bailin, Sidney; Henderson, Scott; Truszkowski, Walt
1993-01-01
Software reuse has become a major goal in the development of space systems, as a recent NASA-wide workshop on the subject made clear. The Data Systems Technology Division of Goddard Space Flight Center has been working on tools and techniques for promoting reuse, in particular in the development of satellite ground support software. One of these tools is the Experiment in Libraries via Incremental Schemata and Cobweb (ElvisC). ElvisC applies machine learning to the problem of organizing a reusable software component library for efficient and reliable retrieval. In this paper we describe the background factors that have motivated this work, present the design of the system, and evaluate the results of its application.
Biodiversity in environmental assessment-current practice and tools for prediction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gontier, Mikael; Balfors, Berit; Moertberg, Ulla
Habitat loss and fragmentation are major threats to biodiversity. Environmental impact assessment and strategic environmental assessment are essential instruments used in physical planning to address such problems. Yet there are no well-developed methods for quantifying and predicting impacts of fragmentation on biodiversity. In this study, a literature review was conducted on GIS-based ecological models that have potential as prediction tools for biodiversity assessment. Further, a review of environmental impact statements for road and railway projects from four European countries was performed, to study how impact prediction concerning biodiversity issues was addressed. The results of the study showed the existing gap between research in GIS-based ecological modelling and current practice in biodiversity assessment within environmental assessment.
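The kind of GIS-based fragmentation metric surveyed here can be illustrated with a minimal raster sketch: counting habitat patches (4-connected components) before and after a linear barrier such as a road. The grid and barrier are invented for illustration.

```python
def count_patches(grid):
    """Count 4-connected patches of habitat cells (value 1) in a raster."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    patches = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and (r, c) not in seen:
                patches += 1
                stack = [(r, c)]          # flood-fill one patch
                while stack:
                    i, j = stack.pop()
                    if not (0 <= i < rows and 0 <= j < cols):
                        continue
                    if (i, j) in seen or grid[i][j] != 1:
                        continue
                    seen.add((i, j))
                    stack += [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
    return patches

habitat = [[1] * 5 for _ in range(5)]     # one contiguous habitat block
before = count_patches(habitat)           # -> 1
for c in range(5):                        # a road bisects the habitat
    habitat[2][c] = 0
after = count_patches(habitat)            # -> 2
```

Real assessments operate on GIS rasters with richer metrics (patch area, isolation, connectivity), but the before/after patch count captures the fragmentation effect of a linear barrier.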
Westgate, Philip M; Gomez-Pomar, Enrique
2017-01-01
In the face of the current Neonatal Abstinence Syndrome (NAS) epidemic, there is considerable variability in the assessment and management of infants with NAS. In this manuscript, we particularly focus on NAS assessment, with special attention given to the popular Finnegan Neonatal Abstinence Score (FNAS). A major instigator of the problem of variable practices is that multiple modified versions of the FNAS exist and continue to be proposed, including shortened versions. Furthermore, the validity of such assessment tools has been questioned, and as a result, the need for better tools has been suggested. The ultimate purpose of this manuscript, therefore, is to increase researchers' and clinicians' understanding of how to judge the usefulness of NAS assessment tools in order to guide future tool development and to reduce variable practices. In short, we suggest that judgment of NAS assessment tools should be made from a clinimetric viewpoint as opposed to a psychometric one. We provide examples, address multiple issues that must be considered, and discuss future tool development. Furthermore, we urge researchers and clinicians to come together, utilizing their knowledge and experience, to assess the utility and practicality of existing assessment tools and to determine if one or more new or modified tools are needed with the goal of increased agreement on the assessment of NAS in practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witzel, Wayne; Rudinger, Kenneth Michael; Sarovar, Mohan
Using a novel formal methods approach, we have generated computer-verified proofs of major theorems pertinent to the quantum phase estimation algorithm. This was accomplished using our Prove-It software package in Python. While many formal methods tools are available, their practical utility is limited. Translating a problem of interest into these systems and working through the steps of a proof is an art form that requires much expertise. One must surrender to the preferences and restrictions of the tool regarding how mathematical notions are expressed and what deductions are allowed. Automation is a major driver that forces restrictions. Our focus, on the other hand, is to produce a tool that allows users the ability to confirm proofs that are essentially known already. This goal is valuable in itself. We demonstrate the viability of our approach that allows the user great flexibility in expressing statements and composing derivations. There were no major obstacles in following a textbook proof of the quantum phase estimation algorithm. There were tedious details of algebraic manipulations that we needed to implement (and a few that we did not have time to enter into our system) and some basic components that we needed to rethink, but there were no serious roadblocks. In the process, we made a number of convenient additions to our Prove-It package that will make certain algebraic manipulations easier to perform in the future. In fact, our intent is for our system to build upon itself in this manner.
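The workflow described, confirming each step of an essentially known proof, can be caricatured in a few lines of Python. This is not Prove-It's actual API (the report does not show it); `check_proof` and its modus-ponens-only rule set are invented for illustration.

```python
def check_proof(premises, steps):
    """Confirm a proof sketch: each step must already be known (a premise
    or earlier step) or follow from known statements by modus ponens.
    Plain statements are strings; implications are ('->', P, Q) tuples."""
    known = set(premises)
    for stmt in steps:
        if stmt in known:
            continue
        derived = any(
            imp[0] == '->' and imp[1] in known and imp[2] == stmt
            for imp in known if isinstance(imp, tuple)
        )
        if not derived:
            return False          # step does not follow: reject the proof
        known.add(stmt)
    return True

premises = [('->', 'P', 'Q'), ('->', 'Q', 'R'), 'P']
ok = check_proof(premises, ['Q', 'R'])    # valid derivation
bad = check_proof(premises, ['R'])        # skips the needed step 'Q'
```

A system like Prove-It generalizes this idea with a rich library of inference rules and user-composable derivations, rather than the single hard-coded rule above.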
Chronic condition self-management support for Aboriginal people: Adapting tools and training.
Battersby, Malcolm; Lawn, Sharon; Kowanko, Inge; Bertossa, Sue; Trowbridge, Coral; Liddicoat, Raylene
2018-04-22
Chronic conditions are major health problems for Australian Aboriginal people. Self-management programs can improve health outcomes. However, few health workers are skilled in self-management support and existing programs are not always appropriate in Australian Aboriginal contexts. The goal was to increase the capacity of the Australian health workforce to support Australian Aboriginal people to self-manage their chronic conditions by adapting the Flinders Program of chronic condition self-management support for Australian Aboriginal clients and to develop and deliver training for health professionals to implement the program. Feedback from health professionals highlighted that the Flinders Program assessment and care planning tools needed to be adapted to suit Australian Aboriginal contexts. Through consultation with Australian Aboriginal Elders and other experts, the tools were condensed into an illustrated booklet called 'My Health Story'. Associated training courses and resources focusing on cultural safety and effective engagement were developed. A total of 825 health professionals across Australia were trained and 61 people qualified as accredited trainers in the program, ensuring sustainability. The capacity and skills of the Australian health workforce to engage with and support Australian Aboriginal people to self-manage their chronic health problems significantly increased as a result of this project. The adapted tools and training were popular and appreciated by the health care organisations, health professionals and clients involved. The adapted tools have widespread appeal for cultures that do not have Western models of health care and where there are health literacy challenges. My Health Story has already been used internationally. © 2018 National Rural Health Alliance Ltd.
The Current Status of the Philosophy of Biology
NASA Astrophysics Data System (ADS)
Takacs, Peter; Ruse, Michael
2013-01-01
The philosophy of biology today is one of the most exciting areas of philosophy. It looks critically across the life sciences, teasing out conceptual issues and difficulties and bringing to bear the tools of philosophical analysis to achieve clarification and understanding. This essay surveys work in all of the major directions of research: evolutionary theory and the units/levels of selection; evolutionary developmental biology; reductionism; ecology; the species problem; teleology; evolutionary epistemology; evolutionary ethics; and progress. There is a comprehensive bibliography.
Guidelines on What Constitutes Plagiarism and Electronic Tools to Detect it.
Luksanapruksa, Panya; Millhouse, Paul W
2016-04-01
Plagiarism is a serious ethical problem among scientific publications. There are various definitions of plagiarism, and the major categories include unintentional (unsuitable paraphrasing or improper citations) and intentional. Intentional plagiarism includes mosaic plagiarism, plagiarism of ideas, plagiarism of text, and self-plagiarism. There are many Web sites and software packages that claim to detect plagiarism effectively. A violation of plagiarism laws can lead to serious consequences including author banning, loss of professional reputation, termination of a position, and even legal action.
Abat, F; Alfredson, H; Cucchiarini, M; Madry, H; Marmotti, A; Mouton, C; Oliveira, J M; Pereira, H; Peretti, G M; Romero-Rodriguez, D; Spang, C; Stephen, J; van Bergen, C J A; de Girolamo, L
2017-12-01
Chronic tendinopathies represent a major problem in the clinical practice of sports orthopaedic surgeons, sports doctors and other health professionals involved in the treatment of athletes and patients that perform repetitive actions. The lack of consensus regarding diagnostic tools and treatment modalities represents a management dilemma for these professionals. With this review, the purpose of the ESSKA Basic Science Committee is to establish guidelines for understanding, diagnosing and treating this complex pathology.
Human exposure assessment: a graduate level course
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lioy, P.J.
1991-07-01
The course has been offered three times. The content and the approach to each lecture have evolved after each offering. This is not unexpected, since the field has been undergoing major transformations, and new approaches to measurement and modeling are being applied to current problems. The most recent student evaluation, in 1990, indicates a difficulty rating of 'just right' (70%) to 'difficult' (30%). Most felt the course stimulated their interest in the topic (72%) and that the examinations were learning experiences as well as a grading exercise. The major need for the discipline is an adequate textbook. The GRAPE program has excellent potential as an educational tool, but it needs to allow more interaction and permit the introduction of activities and data. The major strengths of the course are the problems provided to the students for homework. These give the student a quantitative perspective on the concepts, ranges in values, variables, and uncertainties necessary to complete an assessment. In addition, the development of the mathematical and conceptual continuum for placing exposure assessment in the context of toxicology, environmental science, epidemiology, and clinical intervention provides a basic framework for the discipline.
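The kind of quantitative homework problem highlighted here often comes down to the standard average-daily-dose equation, ADD = (C x IR x EF x ED) / (BW x AT). The sketch below applies it to an invented drinking-water scenario; all parameter values are hypothetical.

```python
def average_daily_dose(conc, intake_rate, exp_freq, exp_dur, body_wt, avg_time):
    """Average daily dose (mg/kg-day) from the standard exposure equation
    ADD = (C * IR * EF * ED) / (BW * AT).
    conc: concentration in water (mg/L); intake_rate: L/day;
    exp_freq: days/year; exp_dur: years; body_wt: kg; avg_time: days."""
    return (conc * intake_rate * exp_freq * exp_dur) / (body_wt * avg_time)

# Hypothetical scenario: 0.005 mg/L contaminant, 2 L/day intake,
# 350 days/year for 30 years, 70 kg adult, averaged over the 30 years.
add = average_daily_dose(0.005, 2.0, 350, 30, 70.0, 30 * 365)
```

Working such a problem forces the student to confront exactly the ranges, variables, and uncertainties the author describes: each parameter has a defensible range, and the result shifts accordingly.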
Analysis and specification tools in relation to the APSE
NASA Technical Reports Server (NTRS)
Hendricks, John W.
1986-01-01
Ada and the Ada Programming Support Environment (APSE) specifically address the phases of the system/software life cycle which follow after the user's problem has been translated into system and software development specifications. The waterfall model of the life cycle identifies the analysis and requirements definition phases as preceding program design and coding. Since Ada is a programming language and the APSE is a programming support environment, they are primarily targeted to support program (code) development, testing, and maintenance. The use of Ada-based or Ada-related specification languages (SLs) and program design languages (PDLs) can extend the use of Ada back into the software design phases of the life cycle. Recall that the standardization of the APSE as a programming support environment is only now happening, after many years of evolutionary experience with diverse sets of programming support tools. Restricting consideration to one, or even a few, chosen specification and design tools could be a real mistake for an organization or a major project such as the Space Station, which will need to deal with an increasingly complex level of system problems. To require that everything be Ada-like, be implemented in Ada, run directly under the APSE, and fit into a rigid waterfall model of the life cycle would turn a promising support environment into a straitjacket for progress.
Recent Developments and Research Progress on Friction Stir Welding of Titanium Alloys: An Overview
NASA Astrophysics Data System (ADS)
Karna, Sivaji; Cheepu, Muralimohan; Venkateswarulu, D.; Srikanth, V.
2018-03-01
Titanium and its alloys are joined by various welding processes. However, fusion welding of titanium alloys results in solidification problems such as porosity, segregation and columnar grains. The problems that occur in conventional welding processes can be resolved using a solid-state process, namely friction stir welding. Aluminium and magnesium alloys have been welded successfully by friction stir welding; however, alloys used for high-temperature applications, such as titanium alloys and steels, are arduous to weld by friction stir welding because of tool limitations. The present paper summarises studies on the joining of titanium alloys using friction stir welding with different tool materials. The selection of tool material and the effect of welding conditions on the mechanical and microstructural properties of the weldments are also reported. A major advantage of friction stir welding is that the welding temperature can be controlled above or below the β-transus temperature by optimizing the process parameters. The stir zone in the below-β-transus condition consists of a bi-modal microstructure, whereas in the above-β-transus condition it contains large prior-β grains with α/β laths within the grains. Welds produced below the β-transus temperature have better mechanical properties than those produced above it. The hardness and tensile properties of the weldments are correlated with the stir-zone microstructure.
Assessment of alcohol problems using AUDIT in a prison setting: more than an 'aye or no' question
2011-01-01
Background: Alcohol problems are a major UK and international public health issue. The prevalence of alcohol problems is markedly higher among prisoners than the general population. However, studies suggest alcohol problems among prisoners are under-detected, under-recorded and under-treated. Identifying offenders with alcohol problems is fundamental to providing high quality healthcare. This paper reports use of the AUDIT screening tool to assess alcohol problems among prisoners.
Methods: Universal screening was undertaken over ten weeks with all entrants to one male Scottish prison using the AUDIT standardised screening tool and supplementary contextual questions. The questionnaire was administered by trained prison officers during routine admission procedures. Overall, 259 anonymised completed questionnaires were analysed.
Results: AUDIT scores showed a high prevalence of alcohol problems, with 73% of prisoner scores indicating an alcohol use disorder (8+), including 36% with scores indicating 'possible dependence' (20-40). AUDIT scores indicating 'possible dependence' were most apparent among 18-24 and 40-64 year-olds (40% and 56% respectively). However, individual questions showed important differences, with younger drinkers less likely to demonstrate habitual and addictive behaviours than the older age group. Disparity between high levels of harmful/hazardous/dependent drinking and low levels of 'treatment' emerged (only 27% of prisoners with scores indicating 'possible dependence' reported being 'in treatment'). Self-reported associations between drinking alcohol and the index crime were identified among two-fifths of respondents, rising to half of those reporting violent crimes.
Conclusions: To our knowledge, this is the first study to identify differing behaviours and needs among prisoners with high AUDIT score ranges, through additional analysis of individual questions.
The study has identified high prevalence of alcohol use, varied problem behaviours, and links across drinking, crime and recidivism, supporting the argument for more extensive provision of alcohol-focused interventions in prisons. These should be carefully targeted based on initial screening and assessment, responsive, and include care pathways linking prisoners to community services. Finally, findings confirm the value and feasibility of routine use of the AUDIT screening tool in prison settings, to considerably enhance practice in the detection and understanding of alcohol problems, improving on current more limited questioning (e.g. 'yes or no' questions). PMID:22082009
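The score bands used in the study (8+ indicating an alcohol use disorder, 20-40 indicating 'possible dependence') map naturally onto a small screening helper. The two intermediate labels follow the conventional AUDIT zones and are an assumption here, not taken from the paper.

```python
def audit_zone(score):
    """Map a total AUDIT score (0-40) to a risk band. The 8+ and 20-40
    cut-offs are those used in the study; the 'hazardous'/'harmful'
    intermediate labels are the conventional AUDIT zones (assumed)."""
    if not 0 <= score <= 40:
        raise ValueError("AUDIT scores range from 0 to 40")
    if score >= 20:
        return "possible dependence"
    if score >= 16:
        return "harmful"
    if score >= 8:
        return "hazardous"
    return "lower risk"

zones = [audit_zone(s) for s in (5, 10, 18, 25)]
```

As the study argues, the total-score band is only a starting point; the individual AUDIT questions carry additional information about habitual and addictive behaviours that a single cut-off misses.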
Assessment of SOAP note evaluation tools in colleges and schools of pharmacy.
Sando, Karen R; Skoy, Elizabeth; Bradley, Courtney; Frenzel, Jeanne; Kirwin, Jennifer; Urteaga, Elizabeth
2017-07-01
To describe current methods used to assess SOAP notes in colleges and schools of pharmacy. Members of the American Association of Colleges of Pharmacy Laboratory Instructors Special Interest Group were invited to share assessment tools for SOAP notes. Content of submissions was evaluated to characterize overall qualities and how the tools assessed subjective, objective, assessment, and plan information. Thirty-nine assessment tools from 25 schools were evaluated. Twenty-nine (74%) of the tools were rubrics and ten (26%) were checklists. All rubrics included analytic scoring elements, while two (7%) were mixed with holistic and analytic scoring elements. The most common rating scale, used by 35% of the rubrics, had four items. Substantial variability existed in how tools evaluated subjective and objective sections. All tools included problem identification in the assessment section. Other assessment items included goals (82%) and rationale (69%). Seventy-seven percent assessed drug therapy; however, only 33% assessed non-drug therapy. Other plan items included education (59%) and follow-up (90%). There is a great deal of variation in the specific elements used to evaluate SOAP notes in colleges and schools of pharmacy. Improved consistency in assessment methods to evaluate SOAP notes may better prepare students to produce standardized documentation when entering practice. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Tuminaro, Jonathan
Many introductory, algebra-based physics students perform poorly on mathematical problem solving tasks in physics. There are at least two possible, distinct reasons for this poor performance: (1) students simply lack the mathematical skills needed to solve problems in physics, or (2) students do not know how to apply the mathematical skills they have to particular problem situations in physics. While many students do lack the requisite mathematical skills, a major finding from this work is that the majority of students possess the requisite mathematical skills, yet fail to use or interpret them in the context of physics. In this thesis I propose a theoretical framework to analyze and describe students' mathematical thinking in physics. In particular, I attempt to answer two questions. What are the cognitive tools involved in formal mathematical thinking in physics? And, why do students make the kinds of mistakes they do when using mathematics in physics? According to the proposed theoretical framework there are three major theoretical constructs: mathematical resources, which are the knowledge elements that are activated in mathematical thinking and problem solving; epistemic games, which are patterns of activities that use particular kinds of knowledge to create new knowledge or solve a problem; and frames, which are structures of expectations that determine how individuals interpret situations or events. The empirical basis for this study comes from videotaped sessions of college students solving homework problems. The students are enrolled in an algebra-based introductory physics course. The videotapes were transcribed and analyzed using the aforementioned theoretical framework. 
Two important results from this work are: (1) the construction of a theoretical framework that offers researchers a vocabulary (ontological classification of cognitive structures) and grammar (relationship between the cognitive structures) for understanding the nature and origin of mathematical use in the context of physics, and (2) a detailed understanding, in terms of the proposed theoretical framework, of the errors that students make when using mathematics in the context of physics.
Papazoglou, T G
1995-04-01
A non-invasive diagnostic tool that can identify diseased tissue sites in situ and in real time could have a major impact on the detection and treatment of cancer and atherosclerosis. A review of the research performed on the utilization of laser induced fluorescence spectroscopy (LIFS) as a means of diseased tissue diagnosis is presented. Special emphasis is given to problems which were raised during clinical trials and recent experimental studies. The common origin and possible solution of these problems are shown to be related to, firstly, the identification of the fluorescent chemical species, secondly, the determination of the excitation/collection geometry and its effect on the method and, finally, further elaboration of the laser-tissue interaction.
NASA Astrophysics Data System (ADS)
Jia, Ningning; Y Lam, Edmund
2010-04-01
Inverse lithography technology (ILT) synthesizes photomasks by solving an inverse imaging problem through optimization of an appropriate functional. Much effort on ILT is dedicated to deriving superior masks at a nominal process condition. However, the lower k1 factor causes the mask to be more sensitive to process variations. Robustness to major process variations, such as focus and dose variations, is desired. In this paper, we consider the focus variation as a stochastic variable, and treat the mask design as a machine learning problem. The stochastic gradient descent approach, which is a useful tool in machine learning, is adopted to train the mask design. Compared with previous work, simulation shows that the proposed algorithm is effective in producing robust masks.
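The stochastic treatment of focus variation described above can be sketched in miniature. The following is an illustrative toy, not the paper's lithography simulator: the quadratic "imaging model," the blur function, the learning rate, and the focus distribution are all assumptions made for demonstration. Each iteration samples a focus value and takes one gradient step on the resulting pattern error, so the mask is trained against the distribution of process conditions rather than a single nominal condition.

```python
import numpy as np

# Toy sketch of stochastic-gradient mask training under focus variation.
# image(), loss(), and all constants are hypothetical stand-ins.

rng = np.random.default_rng(0)
target = np.array([0.0, 1.0, 1.0, 0.0])   # desired wafer pattern (toy)
mask = np.zeros(4)

def image(mask, focus):
    # Toy forward model: defocus attenuates the printed image.
    blur = 1.0 / (1.0 + abs(focus))
    return blur * mask

def loss(mask, focus):
    return np.sum((image(mask, focus) - target) ** 2)

lr = 0.2
for step in range(200):
    focus = rng.normal(0.0, 0.3)                 # sample a process condition
    blur = 1.0 / (1.0 + abs(focus))
    grad = 2 * blur * (blur * mask - target)     # analytic gradient of the toy loss
    mask -= lr * grad

# The trained mask compensates for the average defocus attenuation,
# so its nominal-condition loss is far below that of the initial mask.
```

The design point mirrored here is that the gradient is evaluated at a freshly sampled focus each step, which is what makes the result robust in expectation rather than only at the nominal condition.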
Helicopter-V/STOL dynamic wind and turbulence design methodology
NASA Technical Reports Server (NTRS)
Bailey, J. Earl
1987-01-01
Aircraft and helicopter accidents due to severe dynamic wind and turbulence continue to present challenging design problems. The development of the current set of design analysis tools for aircraft wind and turbulence design began in the 1940s and 1950s. The areas of helicopter dynamic wind and turbulence modeling and vehicle response to severe dynamic wind inputs (microburst type phenomena) during takeoff and landing remain major unsolved design problems due to a lack of both environmental data and computational methodology. The development of helicopter and V/STOL dynamic wind and turbulence response computation methodology is reviewed, the current state of the design art in industry is outlined, and comments on design methodology are made which may serve to improve future flight vehicle design.
Mobile internet and technology for optical teaching reform in higher education
NASA Astrophysics Data System (ADS)
Zhou, Muchun; Zhao, Qi; Chen, Yanru
2017-08-01
Optical education currently suffers from problems such as insufficient flexibility, individuality, and adaptability for students who need timely information and instruction. The development of mobile internet and related technology provides support for solving these problems. Basic characteristics, advantages, and developments of these techniques as used in education are presented in this paper. Mobile internet is introduced to reform the classroom teaching of optical courses. Mobile network tool selection, teaching resource construction, and reform of teaching methods are discussed. Academic records and sampling surveys are used to assess students' intention to adopt mobile internet and its effect on their learning; the results show that high-quality optical education can be offered by incorporating mobile internet and related technologies into traditional instruction.
Gupta, Vishal; Pandey, Pulak M
2016-11-01
Thermal necrosis is one of the major problems associated with the bone drilling process in orthopedic/trauma surgical operations. To overcome this problem, a new bone drilling method has recently been introduced. Studies have been carried out with rotary ultrasonic drilling (RUD) on pig bones using diamond-coated abrasive hollow tools. In the present work, the influence of process parameters (rotational speed, feed rate, drill diameter, and vibrational amplitude) on the change in temperature was studied using a design-of-experiments technique, i.e., response surface methodology (RSM), and data analysis was carried out using analysis of variance (ANOVA). Temperature was recorded using an embedded-thermocouple technique at distances of 0.5 mm, 1.0 mm, 1.5 mm, and 2.0 mm from the drill site. A statistical model was developed to predict the maximum temperature at the drill tool-bone interface. Temperature increased with rotational speed, feed rate, and drill diameter, and decreased with vibrational amplitude. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
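A second-order response surface of the kind RSM produces can be fit by ordinary least squares. The sketch below is purely illustrative and not the paper's model: the synthetic temperature data, coefficient values, and noise level are assumptions, and only two of the four factors (rotational speed and feed rate) are included to keep the example short.

```python
import numpy as np

# Hedged sketch: fitting a second-order response surface
#   T = b0 + b1*speed + b2*feed + b3*speed^2 + b4*feed^2 + b5*speed*feed
# to synthetic (assumed) data by ordinary least squares.

rng = np.random.default_rng(3)
speed = rng.uniform(500, 3000, 80)    # rotational speed (rpm), assumed range
feed = rng.uniform(10, 50, 80)        # feed rate (mm/min), assumed range
temp = (30 + 0.01 * speed + 0.2 * feed
        + 1e-6 * speed**2 + 2e-4 * speed * feed
        + rng.normal(0, 0.5, 80))     # synthetic response with noise

# Design matrix with intercept, linear, quadratic, and interaction terms.
X = np.column_stack([np.ones_like(speed), speed, feed,
                     speed**2, feed**2, speed * feed])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
pred = X @ beta

ss_res = np.sum((temp - pred) ** 2)
ss_tot = np.sum((temp - temp.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
```

The fitted `beta` plays the role of the prediction model; in an actual RSM/ANOVA workflow each coefficient would additionally be tested for significance before being retained.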
Wavelet methods in multi-conjugate adaptive optics
NASA Astrophysics Data System (ADS)
Helin, T.; Yudytskiy, M.
2013-08-01
The next generation of ground-based telescopes relies heavily on adaptive optics for overcoming the limitation of atmospheric turbulence. In future adaptive optics modalities, such as multi-conjugate adaptive optics (MCAO), atmospheric tomography is the major mathematical and computational challenge. In this severely ill-posed problem, a fast and stable reconstruction algorithm is needed that can take into account many real-life phenomena of telescope imaging. We introduce a novel reconstruction method for the atmospheric tomography problem and demonstrate its performance and flexibility in the context of MCAO. Our method is based on using locality properties of compactly supported wavelets, in both the spatial and frequency domains. The reconstruction in the atmospheric tomography problem is obtained by computing the Bayesian MAP estimate with a conjugate-gradient-based algorithm. An accelerated algorithm with preconditioning is also introduced. Numerical performance is demonstrated on OCTOPUS, the official end-to-end simulation tool of the European Southern Observatory.
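The core numerical step, a Bayesian MAP estimate computed by conjugate gradients, can be sketched for a generic linear ill-posed problem. This is a minimal illustration under assumed data, not the authors' wavelet-domain reconstructor: with a Gaussian prior, the MAP estimate of min ||Ax - b||^2 + lam ||x||^2 reduces to the normal equations (A^T A + lam I) x = A^T b, which a matrix-free CG iteration solves without ever forming A^T A.

```python
import numpy as np

# Hedged sketch: MAP estimation for a linear ill-posed problem via
# plain conjugate gradients. A, b, and lam are illustrative placeholders.

def conjugate_gradient(apply_M, rhs, iters=50, tol=1e-10):
    """Solve M x = rhs for symmetric positive definite M given as a callable."""
    x = np.zeros_like(rhs)
    r = rhs - apply_M(x)
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Mp = apply_M(p)
        alpha = rs / (p @ Mp)
        x += alpha * p
        r -= alpha * Mp
        rs_new = r @ r
        if rs_new < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(30, 10))           # forward operator (assumed)
x_true = rng.normal(size=10)
b = A @ x_true + 0.01 * rng.normal(size=30)
lam = 1e-3                              # prior/regularization weight (assumed)

# Matrix-free application of (A^T A + lam I), as a tomography code would
# apply its operator without assembling it.
x_map = conjugate_gradient(lambda v: A.T @ (A @ v) + lam * v, A.T @ b)
```

Preconditioning, as mentioned in the abstract, would replace the plain residual update with a preconditioned one to cut the iteration count on poorly conditioned atmospheric layers.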
NASA's Applied Sciences for Water Resources
NASA Technical Reports Server (NTRS)
Doorn, Bradley; Toll, David; Engman, Ted
2011-01-01
The Earth Systems Division within NASA has the primary responsibility for the Earth Science Applied Science Program and the objective to accelerate the use of NASA science results in applications to help solve problems important to society and the economy. The primary goal of the Earth Science Applied Science Program is to improve future and current operational systems by infusing them with scientific knowledge of the Earth system gained through space-based observation, assimilation of new observations, and development and deployment of enabling technologies, systems, and capabilities. This paper discusses one of the major problems facing water resources managers, that of having timely and accurate data to drive their decision support tools. It then describes how NASA's science and space-based satellites may be used to overcome this problem. Opportunities for the water resources community to participate in NASA's Water Resources Applications Program are described.
Hierarchical winner-take-all particle swarm optimization social network for neural model fitting.
Coventry, Brandon S; Parthasarathy, Aravindakshan; Sommer, Alexandra L; Bartlett, Edward L
2017-02-01
Particle swarm optimization (PSO) has gained widespread use as a general mathematical programming paradigm and has seen use in a wide variety of optimization and machine learning problems. In this work, we introduce a new variant of the PSO social network and apply this method to the inverse problem of input parameter selection from recorded auditory neuron tuning curves. The topology of a PSO social network is a major contributor to optimization success. Here we propose a new social network which draws influence from winner-take-all coding found in visual cortical neurons. We show that the winner-take-all network performs exceptionally well on optimization problems with more than 5 dimensions and converges in fewer iterations than other PSO topologies. Finally, we show that this variant of PSO is able to recreate auditory frequency tuning curves and modulation transfer functions, making it a potentially useful tool for computational neuroscience models.
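The flavor of a winner-influenced PSO can be sketched as follows. This is a bare-bones illustration, not the paper's algorithm: the exact winner-take-all topology and update rules in the paper may differ, and the sphere function, swarm size, and coefficients here are placeholder assumptions. Each iteration, the single best personal-best position (the "winner") attracts every particle, alongside each particle's own memory.

```python
import numpy as np

# Hedged sketch of PSO in which the iteration winner is the sole social
# influence. Objective, sizes, and coefficients are illustrative.

rng = np.random.default_rng(2)

def sphere(x):
    # Placeholder objective; minimum value 0 at the origin.
    return np.sum(x ** 2)

n_particles, dim = 20, 5
pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_val = np.array([sphere(p) for p in pos])

w, c1, c2 = 0.7, 1.5, 1.5                 # inertia and acceleration weights
for _ in range(200):
    winner = pbest[np.argmin(pbest_val)]  # winner-take-all social influence
    r1 = rng.random((n_particles, dim))
    r2 = rng.random((n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (winner - pos)
    pos = pos + vel
    vals = np.array([sphere(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved] = pos[improved]
    pbest_val[improved] = vals[improved]

best_val = pbest_val.min()
```

Swapping `sphere` for a neuron-model misfit function (distance between simulated and recorded tuning curves) is how such a swarm would be applied to the parameter-fitting problem the abstract describes.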
NASA Technical Reports Server (NTRS)
Rubbert, P. E.
1978-01-01
The commercial airplane builder's viewpoint on the important issues involved in the development of improved computational aerodynamics tools, such as powerful computers optimized for fluid flow problems, is presented. The primary user of computational aerodynamics in a commercial aircraft company is the design engineer, who is concerned with solving practical engineering problems. From his viewpoint, the development of program interfaces and pre- and post-processing capability for new computational methods is just as important as the algorithms and machine architecture. As more and more details of the entire flow field are computed, the visibility of the output data becomes a major problem, which is then doubled when a design capability is added. The user must be able to see, understand, and interpret the results calculated. Enormous costs are expended because of the need to work with programs having only primitive user interfaces.
A problem solving and decision making toolbox for approaching clinical problems and decisions.
Margolis, C; Jotkowitz, A; Sitter, H
2004-08-01
In this paper, we begin by presenting three real patients and then review all the practical conceptual tools that have been suggested for systematically analyzing clinical problems. Each of these conceptual tools (e.g. Evidence-Based Medicine, Clinical Practice Guidelines, Decision Analysis) deals mainly with a different type or aspect of clinical problems. We suggest that all of these conceptual tools can be thought of as belonging in the clinician's toolbox for solving clinical problems and making clinical decisions. A heuristic for guiding the clinician in using the tools is proposed. The heuristic is then used to analyze management of the three patients presented at the outset. Copyright 2004 Birkhäuser Verlag, Basel
A knowledge-based tool for multilevel decomposition of a complex design problem
NASA Technical Reports Server (NTRS)
Rogers, James L.
1989-01-01
Although much work has been done in applying artificial intelligence (AI) tools and techniques to problems in different engineering disciplines, only recently has the application of these tools begun to spread to the decomposition of complex design problems. A new tool based on AI techniques has been developed to implement a decomposition scheme suitable for multilevel optimization and display of data in an N x N matrix format.
Staeheli, Martha; Aseltine, Robert H; Schilling, Elizabeth; Anderson, Daren; Gould, Bruce
2017-01-01
Behavioral health disorders remain under-recognized and under-diagnosed among urban primary care patients. Screening patients for such problems is widely recommended, yet is challenging to do in a brief primary care encounter, particularly for this socially and medically complex patient population. In 2013, intervention patients at an urban Connecticut primary care clinic were screened for post-traumatic stress disorder, depression, and risky drinking (n = 146) using an electronic tablet-based screening tool. Screening data were compared to electronic health record data from control patients (n = 129) to assess differences in the prevalence of behavioral health problems, rates of follow-up care, and the rate of newly identified cases in the intervention group. Results from logistic regressions indicated that both groups had similar rates of disorder at baseline. Patients in the intervention group were five times more likely to be identified with depression (p < 0.05). Post-traumatic stress disorder was virtually unrecognized among controls but was observed in 23% of the intervention group (p < 0.001). The vast majority of behavioral health problems identified in the intervention group were new cases. Follow-up rates were significantly higher in the intervention group relative to controls, but were low overall. This tablet-based electronic screening tool identified significantly higher rates of behavioral health disorders than have been previously reported for this patient population. Electronic risk screening using patient-reported outcome measures offers an efficient approach to improving the identification of behavioral health problems and improving rates of follow-up care.
Older Cancer Patients’ User Experiences With Web-Based Health Information Tools: A Think-Aloud Study
Romijn, Geke; Smets, Ellen M A; Loos, Eugene F; Kunneman, Marleen; van Weert, Julia C M
2016-01-01
Background Health information is increasingly presented on the Internet. Several Web design guidelines for older Web users have been proposed; however, these guidelines are often not applied in website development. Furthermore, although we know that older individuals use the Internet to search for health information, we lack knowledge on how they use and evaluate Web-based health information. Objective This study evaluates user experiences with existing Web-based health information tools among older (≥ 65 years) cancer patients and survivors and their partners. The aim was to gain insight into usability issues and the perceived usefulness of cancer-related Web-based health information tools. Methods We conducted video-recorded think-aloud observations for 7 Web-based health information tools, specifically 3 websites providing cancer-related information, 3 Web-based question prompt lists (QPLs), and 1 values clarification tool, with colorectal cancer patients or survivors (n=15) and their partners (n=8) (median age: 73; interquartile range 70-79). Participants were asked to think aloud while performing search, evaluation, and application tasks using the Web-based health information tools. Results Overall, participants perceived Web-based health information tools as highly useful and indicated a willingness to use such tools. However, they experienced problems in terms of usability and perceived usefulness due to difficulties in using navigational elements, shortcomings in the layout, a lack of instructions on how to use the tools, difficulties with comprehensibility, and a large amount of variety in terms of the preferred amount of information. Although participants frequently commented that it was easy for them to find requested information, we observed that the large majority of the participants were not able to find it. Conclusions Overall, older cancer patients appreciate and are able to use cancer information websites. 
However, this study shows the importance of keeping age-related problems, such as cognitive and functional decline and navigation difficulties, in mind when designing for this target group. The results of this study can be used to design usable and useful Web-based health information tools for older (cancer) patients. PMID:27457709
Rodda, S N; Manning, V; Dowling, N A; Lee, S J; Lubman, D I
2018-03-01
Despite high rates of comorbidity between problem gambling and mental health disorders, few studies have examined barriers or facilitators to the implementation of screening for problem gambling in mental health services. This exploratory qualitative study identified key themes associated with screening in mental health services. Semi-structured interviews were undertaken with 30 clinicians and managers from 11 mental health services in Victoria, Australia. Major themes and subthemes were identified using qualitative content analysis. Six themes emerged including competing priorities, importance of routine screening, access to appropriate screening tools, resources, patient responsiveness and workforce development. Barriers to screening included a focus on immediate risk as well as gambling being often considered as a longer-term concern. Clinicians perceived problem gambling as a relatively rare condition, but did acknowledge the need for brief screening. Facilitators to screening were changes to system processes, such as identification of an appropriate brief screening instrument, mandating its use as part of routine screening, as well as funded workforce development activities in the identification and management of problem gambling.
NASA Astrophysics Data System (ADS)
Lim, Cristina P.; Matsuda, Yoshiaki; Shigemi, Yukio
1995-11-01
The Philippine fisheries accounted for 3.7% of the gross national product at current prices. The sector employed about 990,872 persons. Of the divisions comprising the industry, municipal fisheries continued to contribute the largest share of fish production. However, the sector is beset with problems, many of which are best exemplified by the case of San Miguel Bay (SMB). This paper presents the problems and constraints confronting SMB, a common property resource. This bay's open access condition has led to various problems, such as declining fishery resources, depressed socioeconomic conditions, illegal fishing, increasing population, and conflict among resource users. A poor marketing system, low level of fishing technology, fishermen's noncompliance and authorities' lax enforcement of rules and regulations, as well as lack of alternative sources of income further characterize the condition in SMB. Establishment of fishing rights, improvement of the marketing system, provision of alternative sources of income, and improvement of fishing technology were some of the solutions suggested. One major constraint, however, is financial. Co-management, complemented with other management tools, has been proposed to address the problems in SMB.
Latifi, Rifat; Ziemba, Michelle; Leppäniemi, Ari; Dasho, Erion; Dogjani, Agron; Shatri, Zhaneta; Kociraj, Agim; Oldashi, Fatos; Shosha, Lida
2014-08-01
Trauma continues to be a major health problem worldwide, particularly in the developing world, with high mortality and morbidity. Yet most developing countries lack an organized trauma system. Furthermore, developing countries do not have in place any accreditation process for trauma centers; thus, no accepted standard assessment tools exist to evaluate their trauma services. The aims of this study were to evaluate the trauma system in Albania, using the basic trauma criteria of the American College of Surgeons/Committee on Trauma (ACS/COT) as assessment tools, and to provide the Government with a situational analysis relative to these criteria. We used the ACS/COT basic criteria as assessment tools to evaluate the trauma system in Albania. We conducted a series of semi-structured interviews, unstructured interviews, and focus groups with all stakeholders at the Ministry of Health, at the University Trauma Hospital (UTH) based in Tirana (the capital city), and at ten regional hospitals across the country. Albania has a dedicated national trauma center that serves as the only tertiary center, plus ten regional hospitals that provide some trauma care. However, overall, its trauma system is in need of major reforms involving all essential elements in order to meet the basic requirements of a structured trauma system. The ACS/COT basic criteria can be used as assessment tools to evaluate trauma care in developing countries. Further studies are needed in other developing countries to validate the applicability of these criteria.
Interventional radiology: a half century of innovation.
Baum, Richard A; Baum, Stanley
2014-11-01
The evolution of modern interventional radiology began over half a century ago with a simple question. Was it possible to use the same diagnostic imaging tools that had revolutionized the practice of medicine to guide the real-time treatment of disease? This disruptive concept led to rapid treatment advances in every organ system of the body. It became clear that by utilizing imaging, some patients could undergo targeted procedures, eliminating the need for major surgery, while others could undergo procedures for previously unsolvable problems. The breadth of these changes now encompasses all of medicine and has forever changed the way we think about disease. In this brief review article, major advances in the field, as chronicled in the pages of Radiology, will be described.
Novel Problem Solving - The NASA Solution Mechanism Guide
NASA Technical Reports Server (NTRS)
Keeton, Kathryn E.; Richard, Elizabeth E.; Davis, Jeffrey R.
2014-01-01
Over the past five years, the Human Health and Performance (HH&P) Directorate at the NASA Johnson Space Center (JSC) has conducted a number of pilot and ongoing projects in collaboration and open innovation. These projects involved the use of novel open innovation competitions that sought solutions from "the crowd", non-traditional problem solvers. The projects expanded to include virtual collaboration centers such as the NASA Human Health and Performance Center (NHHPC) and more recently a collaborative research project between NASA and the National Science Foundation (NSF). These novel problem-solving tools produced effective results, and the HH&P wanted to capture the knowledge from these new tools, teach the results to the directorate, and implement new project management tools and coursework. To capture and teach the results of these novel problem-solving tools, the HH&P decided to create a web-based tool to capture best practices and case studies, to teach novice users how to use new problem-solving tools, and to change project management training. This web-based tool was developed with a small, multi-disciplinary group and named the Solution Mechanism Guide (SMG). An alpha version was developed and tested in several user-group sessions to get feedback on the SMG and determine a future course for development. The feedback was very positive and the HH&P decided to move to the beta phase of development. To develop the web-based tool, the HH&P utilized the NASA Tournament Lab (NTL) to develop the software with TopCoder under an existing contract. In this way, the HH&P is using one new tool (the NTL and TopCoder) to develop the next-generation tool, the SMG. The beta phase of the SMG is planned for release in the spring of 2014, and results of the beta-phase testing will be available for the IAC meeting in September. 
The SMG is intended to disrupt the way problem solvers and project managers approach problem solving and to increase the use of novel, more cost- and time-effective problem-solving tools such as open innovation, collaborative research, and virtual collaborative project centers. The HH&P envisions changing project management coursework by including the SMG in the teaching of project management problem-solving tools.
[Development and Use of Hidrosig
NASA Technical Reports Server (NTRS)
Gupta, Vijay K.; Milne, Bruce T.
2003-01-01
The NASA portion of this joint NSF-NASA grant consists of objective 2 and a part of objective 3. A major effort was made on objective 2, and it consisted of developing a numerical GIS environment called Hidrosig. This major research tool is being developed by the University of Colorado for conducting river-network-based scaling analyses of coupled water-energy-landform-vegetation interactions including water and energy balances, and floods and droughts, at multiple space-time scales. Objective 2: To analyze the relevant remotely sensed products from satellites, radars and ground measurements to compute the transported water mass for each complete Strahler stream using an 'assimilated water balance equation' at daily and other appropriate time scales. This objective requires analysis of concurrent data sets for Precipitation (PPT), Evapotranspiration (ET) and stream flows (Q) on river networks. To solve this major problem, our decision was to develop Hidrosig, a new open-source GIS software. A research group in Colombia, South America, developed the first version of Hidrosig, and Ricardo Mantilla was part of this effort as an undergraduate student before joining the graduate program at the University of Colorado in 2001. Hidrosig automatically extracts river networks from large DEMs and creates a "link-based" data structure, which is required to conduct a variety of analyses under objective 2. It is programmed in Java, which is a multi-platform programming language freely distributed by SUN under a GPL license. Some existing commercial tools, such as Arc-Info and RiverTools, are not suitable for our purpose, for two reasons. First, the source code needed to build on the network data structure is not available. Second, these tools use programming languages that are less versatile for our purposes; for example, RiverTools is built on an IDL platform that is not very efficient for organizing diverse data sets on river networks. 
Hidrosig establishes a clear data organization framework that allows a simultaneous analysis of spatial fields along river network structures involving the Horton-Strahler framework. Software tools for network extraction from DEMs and network-based analysis of geomorphologic and topologic variables were developed during the first year and part of the second year.
Supporting performance and configuration management of GTE cellular networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Ming; Lafond, C.; Jakobson, G.
GTE Laboratories, in cooperation with GTE Mobilnet, has developed and deployed PERFEX (PERFormance Expert), an intelligent system for performance and configuration management of cellular networks. PERFEX assists cellular network performance and radio engineers in the analysis of large volumes of cellular network performance and configuration data. It helps them locate and determine the probable causes of performance problems, and provides intelligent suggestions about how to correct them. The system combines an expert cellular network performance tuning capability with a map-based graphical user interface, data visualization programs, and a set of special cellular engineering tools. PERFEX is in daily use at more than 25 GTE Mobile Switching Centers. Since the first deployment of the system in late 1993, PERFEX has become a major GTE cellular network performance optimization tool.
Nerys-Junior, Arildo; Costa, Lendel C; Braga-Dias, Luciene P; Oliveira, Márcia; Rossi, Atila D; da Cunha, Rodrigo Delvecchio; Gonçalves, Gabriel S; Tanuri, Amilcar
2014-03-01
Engineered nucleases such as zinc finger nucleases (ZFN) and transcription activator-like effector nucleases (TALEN) are one of the most promising tools for modifying genomes. These site-specific enzymes cause double-strand breaks that allow gene disruption or gene insertion, thereby facilitating genetic manipulation. The major problem associated with this approach is the labor-intensive procedures required to screen and confirm the cellular modification by nucleases. In this work, we produced a TALEN that targets the human CCR5 gene and developed a heteroduplex mobility assay for HEK 293T cells to select positive colonies for sequencing. This approach provides a useful tool for the quick detection and easy assessment of nuclease activity.
Test drilling in basalts, Lalamilo area, South Kohala District, Hawaii
Teasdale, Warren E.
1980-01-01
Test drilling has determined that a downhole-percussion airhammer can be used effectively to drill basalts in Hawaii. When used in conjunction with a foam-type drilling fluid, the hammer-bit penetration rate was rapid. Continuous drill cuttings from the materials penetrated were obtained throughout the borehole except from extremely fractured or weathered basalt zones where circulation was lost or limited. Cementing of these zones as soon as encountered reduced problems of stuck tools, washouts, and loss of drill-cuttings. Supplies and logistics on the Hawaiian Islands, always a major concern, require that all anticipated drilling supplies, spare rig and tool parts, drilling muds and additives, foam, and miscellaneous hardware be on hand before starting to drill. If not, the resulting rig downtime is costly in both time and money. (USGS)
Designing Tools for Reflection on Problem-Based Instruction and Problem-Based Instructional Design
ERIC Educational Resources Information Center
Keefer, Matthew W.; Hui, Diane; RuffusDoerr, Amy Marie
2009-01-01
The objective of this research project into teacher education was to document the collaborative development of, and reflection on, teachers' tools in a problem-based learning (PBL) program. These results were then used to design materials and formats for the transmission of this teaching knowledge to less-experienced PBL teachers. The tools were…
Problem-solving tools for analyzing system problems. The affinity map and the relationship diagram.
Lepley, C J
1998-12-01
The author describes how to use two management tools, an affinity map and a relationship diagram, to define and analyze aspects of a complex problem in a system. The affinity map identifies the key influencing elements of the problem, whereas the relationship diagram helps to identify the area that is the most important element of the issue. Managers can use the tools to draw a map of problem drivers, graphically display the drivers in a diagram, and use the diagram to develop a cause-and-effect relationship.
Integrated modeling of advanced optical systems
NASA Astrophysics Data System (ADS)
Briggs, Hugh C.; Needels, Laura; Levine, B. Martin
1993-02-01
This poster session paper describes an integrated modeling and analysis capability being developed at JPL under funding provided by the JPL Director's Discretionary Fund and the JPL Control/Structure Interaction Program (CSI). The posters briefly summarize the program capabilities and illustrate them with an example problem. The computer programs developed under this effort will provide an unprecedented capability for integrated modeling and design of high performance optical spacecraft. The engineering disciplines supported include structural dynamics, controls, optics and thermodynamics. Such tools are needed in order to evaluate the end-to-end system performance of spacecraft such as OSI, POINTS, and SMMM. This paper illustrates the proof-of-concept tools that have been developed to establish the technology requirements and demonstrate the new features of integrated modeling and design. The current program also includes implementation of a prototype tool based upon the CAESY environment being developed under the NASA Guidance and Control Research and Technology Computational Controls Program. This prototype will be available late in FY-92. The development plan proposes a major software production effort to fabricate, deliver, support and maintain a national-class tool from FY-93 through FY-95.
On the next generation of reliability analysis tools
NASA Technical Reports Server (NTRS)
Babcock, Philip S., IV; Leong, Frank; Gai, Eli
1987-01-01
The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
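The kind of Markov reliability model such a tool would construct automatically can be sketched for a tiny redundant system. This is a hedged illustration with assumed failure rates, not any specific tool's output: a two-unit system with states "both up", "one up", and "failed" (absorbing), whose state probabilities evolve as dp/dt = pQ for a generator matrix Q.

```python
import numpy as np

# Hedged sketch: a 3-state Markov reliability model for a duplex system.
# State 0 = both units up, 1 = one unit up, 2 = system failed (absorbing).
# The per-unit failure rate lam is an assumed illustrative value.

lam = 1e-3                      # failures per hour (assumed)
Q = np.array([                  # generator: Q[i, j] = transition rate i -> j
    [-2 * lam, 2 * lam, 0.0],
    [0.0,      -lam,    lam],
    [0.0,      0.0,     0.0],
])

def state_probs(t, steps=20000):
    """Integrate dp/dt = p Q with forward Euler from p(0) = (1, 0, 0)."""
    p = np.array([1.0, 0.0, 0.0])
    dt = t / steps
    for _ in range(steps):
        p = p + dt * (p @ Q)
    return p

p = state_probs(1000.0)         # mission time of 1000 hours
reliability = 1.0 - p[2]        # probability the system has not failed
```

For this simple series of transitions the answer is also available in closed form, R(t) = 2e^(-lam t) - e^(-2 lam t), which is a useful check on the numerical integration; an automated model builder pays off precisely when the state space grows too large for such hand derivations.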
ERIC Educational Resources Information Center
Friedman, Robert S.; Deek, Fadi P.
2002-01-01
Discusses how the design and implementation of problem-solving tools used in programming instruction are complementary with both the theories of problem-based learning (PBL), including constructivism, and the practices of distributed education environments. Examines how combining PBL, Web-based distributed education, and a problem-solving…
Major oil spill response coordination in the combat of spills in Bahrain waters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D.J.S.; James, J.D.
1985-01-01
On Aug. 25, 1980, without warning, a large oil slick invaded the north and west coasts of Bahrain. This serious incident was designated a national emergency, and a special high-level government committee was established immediately to handle the response procedure. This paper describes how the committee functioned with the private sector and the problems encountered and overcome. The lessons learned have stimulated interest in the proposed use of computers to aid combat operations as both a technical tool and an information service. In addition, this experience can guide those responsible for controlling response operations for future major oil spills in coastal areas. With proper planning, a highly satisfactory cooperative organization can be established between government and the private sector to respond effectively, economically, and with minimum environmental damage to a major oil spill affecting coastal waters and shorelines. Part of this planning is to collate and record the major available response equipment and material packages, thereby increasing the necessary state of readiness.
Integrating Visualizations into Modeling NEST Simulations
Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W.
2015-01-01
Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge of integrating these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus, which requires switching the visualization tool that assists in the current problem domain. Second, because simulations produce heterogeneous data, researchers need to relate different data modalities in order to investigate them effectively. Since a monolithic application that processes and visualizes all data modalities and reflects all combinations of possible workflows in a holistic way is most likely impossible to develop and maintain, a more feasible approach is a software architecture that offers specialized visualization tools which run simultaneously and can be linked together to reflect the current workflow. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach with common use cases we encountered in our collaborative work. PMID:26733860
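A minimal sketch of how independently running visualization tools might be linked, in the spirit of the architecture described here: a publish/subscribe hub through which one tool's selection updates another. All names are illustrative, not the authors' actual implementation:

```python
from collections import defaultdict

# Minimal publish/subscribe hub linking independent visualization
# tools -- illustrative names, not the authors' actual API.
class LinkHub:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self.subscribers[topic]:
            callback(payload)

hub = LinkHub()
selected = []
# A second tool reacts when the first tool changes the neuron selection.
hub.subscribe("selection", selected.extend)
hub.publish("selection", [17, 42])
print(selected)  # [17, 42]
```

Because tools only share topic names and payloads, they can be swapped in and out as the analysis focus changes, which is the decoupling the abstract argues for.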
Design of a Cognitive Tool to Enhance Problem-Solving Performance
ERIC Educational Resources Information Center
Lee, Youngmin; Nelson, David
2005-01-01
The design of a cognitive tool to support problem-solving performance for external representation of knowledge is described. The limitations of conventional knowledge maps are analyzed in proposing the tool. The design principles and specifications are described. This tool is expected to enhance learners' problem-solving performance by allowing…
The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications
2011-01-01
Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. 
We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP/WSDL specification among and between various programming-language libraries; and iv) incompatibility between various bioinformatics data formats. Although it was still difficult to solve real world problems posed to the developers by the biological researchers in attendance because of these problems, we note the promise of addressing these issues within a semantic framework. PMID:21806842
The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications.
Katayama, Toshiaki; Wilkinson, Mark D; Vos, Rutger; Kawashima, Takeshi; Kawashima, Shuichi; Nakao, Mitsuteru; Yamamoto, Yasunori; Chun, Hong-Woo; Yamaguchi, Atsuko; Kawano, Shin; Aerts, Jan; Aoki-Kinoshita, Kiyoko F; Arakawa, Kazuharu; Aranda, Bruno; Bonnal, Raoul Jp; Fernández, José M; Fujisawa, Takatomo; Gordon, Paul Mk; Goto, Naohisa; Haider, Syed; Harris, Todd; Hatakeyama, Takashi; Ho, Isaac; Itoh, Masumi; Kasprzyk, Arek; Kido, Nobuhiro; Kim, Young-Joo; Kinjo, Akira R; Konishi, Fumikazu; Kovarskaya, Yulia; von Kuster, Greg; Labarga, Alberto; Limviphuvadh, Vachiranee; McCarthy, Luke; Nakamura, Yasukazu; Nam, Yunsun; Nishida, Kozo; Nishimura, Kunihiro; Nishizawa, Tatsuya; Ogishima, Soichi; Oinn, Tom; Okamoto, Shinobu; Okuda, Shujiro; Ono, Keiichiro; Oshita, Kazuki; Park, Keun-Joon; Putnam, Nicholas; Senger, Martin; Severin, Jessica; Shigemoto, Yasumasa; Sugawara, Hideaki; Taylor, James; Trelles, Oswaldo; Yamasaki, Chisato; Yamashita, Riu; Satoh, Noriyuki; Takagi, Toshihisa
2011-08-02
Selection and Biosensor Application of Aptamers for Small Molecules
Pfeiffer, Franziska; Mayer, Günter
2016-01-01
Small molecules play a major role in the human body and as drugs, toxins, and chemicals. Tools to detect and quantify them are therefore in high demand. This review will give an overview about aptamers interacting with small molecules and their selection. We discuss the current state of the field, including advantages as well as problems associated with their use and possible solutions to tackle these. We then discuss different kinds of small molecule aptamer-based sensors described in literature and their applications, ranging from detecting drinking water contaminations to RNA imaging. PMID:27379229
Strategic management process in hospitals.
Zovko, V
2001-01-01
Strategic management is concerned with strategic choices and strategic implementation; it provides the means by which organizations meet their objectives. In the case of hospitals it helps executives and all employees to understand the real purpose and long term goals of the hospital. Also, it helps the hospital find its place in the health care service provision chain, and enables the hospital to coordinate its activities with other organizations in the health care system. Strategic management is a tool, rather than a solution, that helps executives to identify root causes of major problems in the hospital.
NASA Technical Reports Server (NTRS)
Welge, R. T.
1972-01-01
A CH-54B Skycrane helicopter was fitted with boron/epoxy-reinforced stringers in the tail cone and boron/epoxy tubes in the tail skid. The tail cone was fabricated with conventional tooling and production shop personnel, with no major problems. The flight test program includes a stress and vibration survey using strain gages and vibration transducers located in critical areas. The program to inspect and monitor the reliability of the components is discussed.
Framework Support For Knowledge-Based Software Development
NASA Astrophysics Data System (ADS)
Huseth, Steve
1988-03-01
The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.
A new tool called DISSECT for analysing large genomic data sets using a Big Data approach
Canela-Xandri, Oriol; Law, Andy; Gray, Alan; Woolliams, John A.; Tenesa, Albert
2015-01-01
Large-scale genetic and genomic data are increasingly available, and the major bottleneck in their analysis is a lack of sufficiently scalable computational tools. To address this problem in the context of complex-trait analysis, we present DISSECT. DISSECT is new, freely available software that exploits the distributed-memory parallel computational architectures of compute clusters to perform a wide range of genomic and epidemiologic analyses which currently can only be carried out on reduced sample sizes or under restricted conditions. We demonstrate the usefulness of our new tool by addressing the challenge of predicting phenotypes from genotype data in human populations using mixed-linear model analysis. We analyse simulated traits from 470,000 individuals genotyped for 590,004 SNPs in ∼4 h using the combined computational power of 8,400 processor cores. We find that prediction accuracies in excess of 80% of the theoretical maximum could be achieved with large sample sizes. PMID:26657010
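The mixed-linear-model phenotype prediction the abstract benchmarks can be approximated on toy data by ridge regression, which is closely related to GBLUP under simple assumptions. The sample sizes and shrinkage parameter below are invented and far smaller than the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for mixed-linear-model prediction: ridge regression
# (closely related to GBLUP). Sizes and shrinkage are invented.
n_train, n_test, n_snps = 400, 100, 1000
X = rng.binomial(2, 0.3, size=(n_train + n_test, n_snps)).astype(float)
beta = rng.normal(0.0, 0.05, n_snps)                 # simulated SNP effects
y = X @ beta + rng.normal(0.0, 1.0, n_train + n_test)

X_tr, y_tr = X[:n_train], y[:n_train]
X_te, y_te = X[n_train:], y[n_train:]

lam = 500.0                                          # shrinkage (assumed)
# Ridge normal equations: (X'X + lam I) b = X'y
b_hat = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(n_snps), X_tr.T @ y_tr)
r = np.corrcoef(X_te @ b_hat, y_te)[0, 1]            # held-out accuracy
print(f"held-out correlation: {r:.2f}")
```

The scalability problem DISSECT tackles is precisely that this linear solve becomes infeasible on a single machine at biobank scale (hundreds of thousands of individuals).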
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pin, Francois G.; Love, Lonnie L.; Jung, David L.
2004-03-29
Contrary to the repetitive tasks performed by industrial robots, the tasks in most DOE missions such as environmental restoration or Decontamination and Decommissioning (D&D) can be characterized as ''batches-of-one'', in which robots must be capable of adapting to changes in constraints, tools, environment, criteria and configuration. No commercially available robot control code is suitable for use with such widely varying conditions. In this talk we present our development of a ''generic code'' to allow real time (at loop rate) robot behavior adaptation to changes in task objectives, tools, number and type of constraints, modes of controls or kinematics configuration. We present the analytical framework underlying our approach and detail the design of its two major modules for the automatic generation of the kinematics equations when the robot configuration or tools change and for the motion planning under time-varying constraints. Sample problems illustrating the capabilities of the developed system are presented.
CRISPR/Cas9-based tools for targeted genome editing and replication control of HBV.
Peng, Cheng; Lu, Mengji; Yang, Dongliang
2015-10-01
Hepatitis B virus (HBV) infection remains a major global health problem because current therapies rarely eliminate HBV infections to achieve a complete cure. A different treatment paradigm to effectively clear HBV infection and eradicate latent viral reservoirs is urgently required. In recent years, the development of a new RNA-guided gene-editing tool, the CRISPR/Cas9 (clustered regularly interspaced short palindromic repeats/CRISPR-associated nuclease 9) system, has greatly facilitated site-specific mutagenesis and represents a very promising potential therapeutic tool for diseases, including for eradication of invasive pathogens such as HBV. Here, we review recent advances in the use of CRISPR/Cas9, which is designed to target HBV specific DNA sequences to inhibit HBV replication and to induce viral genome mutation, in cell lines or animal models. Advantages, limitations and possible solutions, and proposed directions for future research are discussed to highlight the opportunities and challenges of CRISPR/Cas9 as a new, potentially curative therapy for chronic hepatitis B infection.
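Designing a CRISPR/Cas9 guide of the kind reviewed here starts with locating protospacers adjacent to a PAM motif (NGG for SpCas9). A minimal forward-strand scan, using a toy sequence rather than real HBV DNA, might look like:

```python
import re

def find_cas9_sites(seq):
    """Forward-strand SpCas9 sites: a 20-nt protospacer immediately
    followed by an NGG PAM. Overlapping hits are reported."""
    return [(m.start(), m.group(1), m.group(2))
            for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq.upper())]

# Toy sequence -- NOT a real HBV fragment.
demo = "ATGCTGCTAGGTACCGATTACAGGCTTGGACTTCTCTCAATTTTCTAGGGGG"
sites = find_cas9_sites(demo)
for start, protospacer, pam in sites:
    print(start, protospacer, pam)
```

A real design pipeline would also scan the reverse strand and score candidates for off-target matches; this sketch shows only the PAM-anchored search step.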
RHydro - Hydrological models and tools to represent and analyze hydrological data in R
NASA Astrophysics Data System (ADS)
Reusser, Dominik; Buytaert, Wouter
2010-05-01
In hydrology, basic equations and procedures keep being reimplemented from scratch by scientists, with the potential for errors and inefficiency. The use of shared libraries can overcome these problems. Other scientific disciplines such as mathematics and physics have benefited significantly from such an approach, with freely available implementations of many routines. As an example, hydrological libraries could contain: major representations of hydrological processes such as infiltration, sub-surface runoff, and routing algorithms; scaling functions, for instance to combine remote sensing precipitation fields with rain gauge data; data consistency checks; and performance measures. Here we present a beginning for such a library, implemented in the high-level data programming language R. Currently, TOPMODEL, data import routines for WaSiM-ETH, and basic visualization and evaluation tools are implemented. The design is such that a definition of import scripts for additional models is sufficient to gain access to the full set of evaluation and visualization tools.
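As a sketch of the "performance measures" such a library would bundle, here is the widely used Nash-Sutcliffe efficiency; it is written in Python for illustration, although RHydro itself is an R package:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 = perfect fit, 0 = no better than
    predicting the mean of the observations, negative = worse."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

obs = [1.0, 2.0, 4.0, 3.0, 2.0]
assert nash_sutcliffe(obs, obs) == 1.0   # perfect simulation
print(nash_sutcliffe(obs, [2.4] * 5))    # constant at the observation mean
```

Having one vetted implementation of measures like this is exactly the duplication-of-effort argument the abstract makes.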
Griffiths, Mark D; Wood, Richard T A; Parke, Jonathan
2009-08-01
To date, little empirical research has focused on social responsibility in gambling. This study examined players' attitudes and behavior toward using the social responsibility tool PlayScan designed by the Swedish gaming company Svenska Spel. Via PlayScan, players have the option to utilize various social responsibility control tools (e.g., personal gaming budgets, self-diagnostic tests of gambling habits, self-exclusion options). A total of 2,348 participants took part in an online questionnaire study. Participants were clientele of the Svenska Spel online gambling Web site. Results showed that just over a quarter of players (26%) had used PlayScan. The vast majority of those who had activated PlayScan (almost 9 in 10 users) said that PlayScan was easy to use. Over half of PlayScan users (52%) said it was useful; 19% said it was not. Many features were seen as useful by online gamblers, including limit setting (70%), viewing their gambling profile (49%), self-exclusion facilities (42%), self-diagnostic problem gambling tests (46%), information and support for gambling issues (40%), and gambling profile predictions (36%). In terms of actual (as opposed to theoretical) use, over half of PlayScan users (56%) had set spending limits, 40% had taken a self-diagnostic problem gambling test, and 17% had used a self-exclusion feature.
2014-01-01
Background With over 50 different disorders and a combined incidence of up to 1/3000 births, lysosomal storage diseases (LSDs) constitute a major public health problem and place an enormous burden on affected individuals and their families. Many factors make LSD diagnosis difficult, including phenotype and penetrance variability, shared signs and symptoms, and problems inherent to biochemical diagnosis. Developing a powerful diagnostic tool could mitigate the protracted diagnostic process for these families, lead to better outcomes for current and proposed therapies, and provide the basis for more appropriate genetic counseling. Methods We have designed a targeted resequencing assay for the simultaneous testing of 57 lysosomal genes, using in-solution capture as the enrichment method and two different sequencing platforms. A total of 84 patients with high to moderate or low suspicion index for LSD were enrolled in different centers in Spain and Portugal, including 18 positive controls. Results We correctly diagnosed 18 positive blinded controls, provided genetic diagnosis to 25 potential LSD patients, and ended 18 diagnostic odysseys. Conclusion We report the assessment of a next-generation-sequencing-based approach as an accessory tool in the diagnosis of LSDs, a group of disorders which have overlapping clinical profiles and genetic heterogeneity. We have also identified and quantified the strengths and limitations of next generation sequencing (NGS) technology applied to diagnosis. PMID:24767253
Family-based therapy for dementia caregivers: clinical observations
MITRANI, V. B.; CZAJA, S. J.
2008-01-01
Family caregiving for dementia patients is a major social and clinical problem. Family caregivers face major stressful emotional, social and economic burdens, and the negative consequences associated with caregiving are well documented. Given the projected increase in the number of people with dementia, there is a need to identify approaches that will help families manage the challenges of caregiving. Social support from friends and family members has consistently been found to mediate caregiver outcomes, yet many caregivers face problems with isolation and estrangement from family members. In this regard, family-based therapy is a promising intervention for increasing social support for caregivers, and enhancing their quality of life and ability to provide care. This paper will discuss how family-based therapy can be applied as an intervention for family caregivers of dementia patients. The clinical implications of specific interactional patterns will be presented via case examples from an ongoing clinical trial with white American and Cuban American caregivers of dementia patients. The intent is to demonstrate how identification of interactional patterns is a valuable tool for implementing family-based interventions. PMID:18548132
The development of tool manufacture in humans: what helps young children make innovative tools?
Chappell, Jackie; Cutting, Nicola; Apperly, Ian A; Beck, Sarah R
2013-11-19
We know that even young children are proficient tool users, but until recently, little was known about how they make tools. Here, we will explore the concepts underlying tool making, and the kinds of information and putative cognitive abilities required for children to manufacture novel tools. We will review the evidence for novel tool manufacture from the comparative literature and present a growing body of data from children suggesting that innovation of the solution to a problem by making a tool is a much more challenging task than previously thought. Children's difficulty with these kinds of tasks does not seem to be explained by perseveration with unmodified tools, difficulty with switching to alternative strategies, task pragmatics or issues with permission. Rather, making novel tools (without having seen an example of the required tool within the context of the task) appears to be hard, because it is an example of an 'ill-structured problem'. In this type of ill-structured problem, the starting conditions and end goal are known, but the transformations and/or actions required to get from one to the other are not specified. We will discuss the implications of these findings for understanding the development of problem-solving in humans and other animals.
LEGO Robotics: An Authentic Problem Solving Tool?
ERIC Educational Resources Information Center
Castledine, Alanah-Rei; Chalmers, Chris
2011-01-01
With the current curriculum focus on correlating classroom problem solving lessons to real-world contexts, are LEGO robotics an effective problem solving tool? This present study was designed to investigate this question and to ascertain what problem solving strategies primary students engaged with when working with LEGO robotics and whether the…
Data management in an object-oriented distributed aircraft conceptual design environment
NASA Astrophysics Data System (ADS)
Lu, Zhijie
In the competitive global marketplace, aerospace companies are forced to deliver the right products to the right market, with the right cost, and at the right time. However, the rapid development of technologies and new business opportunities, such as mergers, acquisitions, supply chain management, etc., have dramatically increased the complexity of designing an aircraft. Therefore, the pressure to reduce design cycle time and cost is enormous. One way to solve such a dilemma is to develop and apply advanced engineering environments (AEEs), which are distributed collaborative virtual design environments linking researchers, technologists, designers, etc., together by incorporating application tools and advanced computational, communications, and networking facilities. Aircraft conceptual design, as the first design stage, provides a major opportunity to compress design cycle time and is the cheapest stage at which to make design changes. However, traditional aircraft conceptual design programs, which are monolithic programs, cannot provide satisfactory functionality to meet new design requirements, due to their lack of domain flexibility and analysis scalability. Therefore, we are in need of the next-generation aircraft conceptual design environment (NextADE). To build the NextADE, the framework and the data management problem are two major problems that need to be addressed at the forefront. Solving these two problems, particularly the data management problem, is the focus of this research. In this dissertation, in light of AEEs, a distributed object-oriented framework is first formulated and tested for the NextADE. In order to improve interoperability and simplify the integration of heterogeneous application tools, data management is one of the major problems that need to be tackled.
To solve this problem, taking into account the characteristics of aircraft conceptual design data, a robust, extensible object-oriented data model is then proposed according to the distributed object-oriented framework. By overcoming the shortcomings of the traditional approach of modeling aircraft conceptual design data, this data model makes it possible to capture specific detailed information of aircraft conceptual design without sacrificing generality, which is one of the most desired features of a data model for aircraft conceptual design. Based upon this data model, a prototype of the data management system, which is one of the fundamental building blocks of the NextADE, is implemented utilizing the state of the art information technologies. Using a general-purpose integration software package to demonstrate the efficacy of the proposed framework and the data management system, the NextADE is initially implemented by integrating the prototype of the data management system with other building blocks of the design environment, such as disciplinary analyses programs and mission analyses programs. As experiments, two case studies are conducted in the integrated design environments. One is based upon a simplified conceptual design of a notional conventional aircraft; the other is a simplified conceptual design of an unconventional aircraft. As a result of the experiments, the proposed framework and the data management approach are shown to be feasible solutions to the research problems.
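The extensible object-oriented data model argued for above might, in miniature, look like a recursive component hierarchy. The class and attribute names here are invented, not the dissertation's actual schema:

```python
from dataclasses import dataclass, field

# Miniature extensible object-oriented design-data model; the class
# and attribute names are invented, not the dissertation's schema.
@dataclass
class Component:
    name: str
    mass_kg: float = 0.0
    children: list = field(default_factory=list)

    def total_mass(self):
        """Roll the mass up through the component hierarchy."""
        return self.mass_kg + sum(c.total_mass() for c in self.children)

wing = Component("wing", 1200.0)
fuselage = Component("fuselage", 2500.0, [Component("cabin", 400.0)])
aircraft = Component("aircraft", 0.0, [wing, fuselage])
print(aircraft.total_mass())  # 4100.0
```

The point of such a model is that new component types and attributes can be added without disturbing existing tools, which is the generality-with-detail trade-off the abstract describes.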
TNA4OptFlux – a software tool for the analysis of strain optimization strategies
2013-01-01
Background Rational approaches for Metabolic Engineering (ME) deal with the identification of modifications that improve the microbes’ production capabilities of target compounds. One of the major challenges created by strain optimization algorithms used in these ME problems is the interpretation of the changes that lead to a given overproduction. Often, a single gene knockout induces changes in the fluxes of several reactions, as compared with the wild-type, and it is therefore difficult to evaluate the physiological differences of the in silico mutant. This is aggravated by the fact that genome-scale models per se are difficult to visualize, given the high number of reactions and metabolites involved. Findings We introduce a software tool, the Topological Network Analysis for OptFlux (TNA4OptFlux), a plug-in which adds to the open-source ME platform OptFlux the capability of creating and performing topological analysis over metabolic networks. One of the tool’s major advantages is the possibility of using these tools in the analysis and comparison of simulated phenotypes, namely those coming from the results of strain optimization algorithms. We illustrate the capabilities of the tool by using it to aid the interpretation of two E. coli strains designed in OptFlux for the overproduction of succinate and glycine. Conclusions Besides adding new functionalities to the OptFlux software tool regarding topological analysis, TNA4OptFlux methods greatly facilitate the interpretation of non-intuitive ME strategies by automating the comparison between perturbed and non-perturbed metabolic networks. The plug-in is available on the web site http://www.optflux.org, together with extensive documentation. PMID:23641878
NASA Astrophysics Data System (ADS)
Adams, Wendy Kristine
The purpose of my research was to produce a problem solving evaluation tool for physics. To do this it was necessary to gain a thorough understanding of how students solve problems. Although physics educators highly value problem solving and have put extensive effort into understanding successful problem solving, there is currently no efficient way to evaluate problem solving skill. Attempts have been made in the past; however, knowledge of the principles required to solve the subject problem is so critical that it completely overshadows any other skills students may use when solving a problem. The work presented here is unique because the evaluation tool removes the requirement that the student already have a grasp of physics concepts. It is also unique because I chose both a wide range of people and a wide range of tasks for evaluation; this is an important design feature that helps patterns emerge more clearly. This dissertation includes an extensive literature review of problem solving in physics, math, education and cognitive science, as well as descriptions of studies involving student use of interactive computer simulations, the design and validation of a beliefs-about-physics survey, and finally the design of the problem solving evaluation tool. I have successfully developed and validated a problem solving evaluation tool that identifies 44 separate assets (skills) necessary for solving problems. Rigorous validation studies, including work with an independent interviewer, show that the assets identified by this content-free evaluation tool are the same assets that students use to solve problems in mechanics and quantum mechanics. Understanding this set of component assets will help teachers and researchers address problem solving within the classroom.
Building a strong geoscience department by emphasizing curriculum and pedagogy
NASA Astrophysics Data System (ADS)
Lea, P. D.; Beane, R. J.; Laine, E. P.
2005-12-01
About a decade ago the Bowdoin College Geology Department recognized a need for a new curriculum that more fully engaged majors and non-majors as active learners. To accomplish this curricular change the faculty have adopted differing pedagogies that all engage students in real projects. Research project-based learning, community-based learning, and problem-based service-learning form the core of our teaching efforts. The emphasis on problem-solving and inquiry in our courses has greatly strengthened our department's contributions to research, education, and service at the college. These courses have an added benefit of acquainting students with various aspects of their local and global environment. Geology majors leave Bowdoin equipped with the tools and experiences they need for employment or graduate school, as well as for life-long learning. To support the integration of research into our teaching we have successfully sought funding from NSF's CCLI and MRI programs. As a consequence, even first-year students work with an SEM/EDAX/EBSD, with instrumented watersheds, and soon with an ocean observatory adjacent to our Coastal Studies Center, as well as taking greater advantage of local field opportunities. Our intense focus on improving curriculum and pedagogy organized and energized us within the department and helped us to present ourselves and our goals to the college.
Indoor Air Quality Problem Solving Tool
Use the IAQ Problem Solving Tool to learn about the connection between health complaints and common solutions in schools. This resource provides an easy, step-by-step process to start identifying and resolving IAQ problems found at your school.
Dataflow Design Tool: User's Manual
NASA Technical Reports Server (NTRS)
Jones, Robert L., III
1996-01-01
The Dataflow Design Tool is a software tool for selecting a multiprocessor scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. The software tool implements graph-search algorithms and analysis techniques based on the dataflow paradigm. Dataflow analyses provided by the software are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool provides performance optimization through the inclusion of artificial precedence constraints among the schedulable tasks. The user interface and tool capabilities are described. Examples are provided to demonstrate the analysis, scheduling, and optimization functions facilitated by the tool.
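The performance-bound analyses mentioned above can be illustrated with a toy sketch: the critical path of a dataflow DAG bounds any schedule's length, and total work divided by processor count bounds throughput. This is an illustrative reconstruction under assumed data, not the tool's actual algorithms; the graph and latencies are made up.

```python
# Sketch: two classic bounds for repetitive execution of a dataflow graph
# on identical processors (hypothetical graph, not the Dataflow Design Tool).
import functools

def critical_path(graph, latency):
    """Longest path through a DAG of tasks; a lower bound on schedule length."""
    @functools.lru_cache(maxsize=None)
    def finish(node):
        preds = [finish(p) for p, succs in graph.items() if node in succs]
        return (max(preds) if preds else 0) + latency[node]
    return max(finish(n) for n in latency)

# Toy signal-processing graph: a -> b -> d and a -> c -> d.
graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
latency = {"a": 2, "b": 3, "c": 5, "d": 1}

cp = critical_path(graph, latency)   # longest chain: a -> c -> d = 2 + 5 + 1
work = sum(latency.values())         # total work across all tasks
processor_bound = -(-work // 2)      # ceil(work / 2) for two processors
print(cp, max(cp, processor_bound))
```

Any feasible two-processor schedule must take at least `max(cp, processor_bound)` time units; the tool's artificial precedence constraints tighten schedules toward such bounds.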
NASA's Biomedical Research Program
NASA Technical Reports Server (NTRS)
Ahn, Chung-Hae
1981-01-01
The biomedical research program has been established to investigate the major physiological and psychological problems encountered by man when he undertakes spaceflight. The program seeks to obtain a better definition of each problem, an understanding of its underlying mechanism, and ultimately a means of prevention. In pursuing these goals the program also includes a major effort to develop the research tools and procedures it needs where these are not being developed elsewhere. After almost twenty years of manned spaceflight activities and after a much longer period of space related ground-based research, the program now recognizes two characteristics of spaceflight which are truly unique to space. These are weightlessness and one specific form of radiation. In its present stage of maturity much of the research focuses on mechanisms underlying the basic responses of man and animals to weightlessness. The program consists of nine elements. Eight of these are referable to specific physiological problems that have either been encountered in previous manned spaceflight or which are anticipated to occur as spaceflights last longer, traverse steeper orbital inclinations, or are otherwise different from previous missions. The ninth addresses problems that have neither arisen nor can be reasonably predicted but are suspected on the basis of theoretical models, ground-based animal research, or for other reasons. The program's current emphasis is directed toward the motion sickness problem because of its relevance to Space Shuttle operations. Increased awareness and understanding of the radiation hazard has resulted in more emphasis being placed on the biological effects of high energy, high mass number particulate radiation and upon radiation protection.
Cardiovascular and musculoskeletal studies are pursued in recognition of the considerable fundamental knowledge that must be acquired in these areas before effective countermeasures to the effects of repetitive or long-term flight can be devised. Major new avenues of research will deal with the psychological accompaniments of spaceflight and with mathematical modelling of physiological systems.
Corah, Louise; Mossop, Liz; Cobb, Kate; Dean, Rachel
2018-03-15
Consultations are complex interactions that are central to achieving optimal outcomes for all stakeholders, yet what constitutes a successful consultation has not been defined. The aim of this systematic review was to describe the scope of the literature available on specific health problem consultations and appraise their identified success measures. Searches of CAB Abstracts and MEDLINE were performed in May 2016 using species and consultation terms. Systematic sorting of the results allowed identification of consultation 'success factors' cited in peer-reviewed veterinary literature, which were appraised using an appropriate critical appraisal tool (AXIS). Searches returned 11 330 results, with a total of 17 publications meeting the inclusion criteria, of which four measured consultation success. Journal of the American Veterinary Medical Association was the most common journal of publication (9 of 17), and the majority of included papers had been published since 2010 (12 of 17). The success factors measured were compliance, client satisfaction and veterinary surgeon satisfaction, and publications primarily used communication analysis tools to measure success. The review highlights the paucity of peer-reviewed literature examining small animal, health problem veterinary consultations. The available evidence is of variable quality and provides weak evidence as to which factors contribute to a successful consultation. © British Veterinary Association (unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
The complex spine: the multidimensional system of causal pathways for low-back disorders.
Marras, William S
2012-12-01
The aim of this study was to examine the logic behind our knowledge of low-back problem causal pathways. Low-back pain and low-back disorders (LBDs) continue to represent the major musculoskeletal risk problem in the workplace, with the prevalence and costs of such disorders increasing over time. In recent years, there has been much criticism of the ability of ergonomics methods to control the risk of LBDs. The method was a logical assessment of the systems logic associated with our understanding and prevention of LBDs. Current spine loading and spine tolerance research efforts are bringing the field to the point where there is a better systems understanding of the inextricable link between the musculoskeletal system and the cognitive system. Loading is influenced by both physical environment factors and mental demands, whereas tolerances are defined by both physical tissue tolerance and biochemically based tissue sensitivities to pain. However, the logic used in many low-back risk assessment tools may be overly simplistic given what is understood about causal pathways; current tools typically assess only load or position in a very cursory manner. Efforts must work toward satisfying both the physical environment and the cognitive environment of the worker if one is to reliably lower the risk of low-back problems. This systems representation of LBD development may serve as a guide to identifying gaps in our understanding of LBDs.
Generation of Look-Up Tables for Dynamic Job Shop Scheduling Decision Support Tool
NASA Astrophysics Data System (ADS)
Oktaviandri, Muchamad; Hassan, Adnan; Mohd Shaharoun, Awaluddin
2016-02-01
The majority of existing scheduling techniques are based on static demand and deterministic processing time, while most job shop scheduling problems involve dynamic demand and stochastic processing time. As a consequence, the solutions obtained from traditional scheduling techniques become ineffective whenever changes occur in the system. Therefore, this research intends to develop a decision support tool (DST) based on promising artificial intelligence techniques that is able to accommodate the dynamics that regularly occur in job shop scheduling problems. The DST was designed through three phases, i.e. (i) look-up table generation, (ii) inverse model development and (iii) integration of DST components. This paper reports the generation of look-up tables for various scenarios as part of the development of the DST. A discrete event simulation model was used to compare the performance of the SPT, EDD, FCFS, S/OPN and Slack rules; the best performance measures (mean flow time, mean tardiness and mean lateness) and the job order requirements (inter-arrival time, due-date tightness and setup time ratio) were compiled into look-up tables. The well-known 6/6/J/Cmax problem from Muth and Thompson (1963) was used as a case study. In the future, the performance measures of various scheduling scenarios and the job order requirements will be mapped using an ANN inverse model.
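Two of the dispatching rules compared above can be sketched in miniature: SPT versus FCFS on a single machine with all jobs released at time zero. This is an illustrative simplification; the paper's simulation covers full dynamic job shops with stochastic arrivals.

```python
# Sketch: mean flow time under SPT (shortest processing time first) vs.
# FCFS (first come, first served) on one machine, all jobs released at t=0.
# Job processing times below are made up for illustration.

def mean_flow_time(processing_times, rule):
    order = sorted(processing_times) if rule == "SPT" else list(processing_times)
    t, flows = 0, []
    for p in order:
        t += p
        flows.append(t)   # with release at time 0, completion time = flow time
    return sum(flows) / len(flows)

jobs = [6, 2, 9, 4, 3]
print(mean_flow_time(jobs, "SPT"))    # SPT is optimal for mean flow time
print(mean_flow_time(jobs, "FCFS"))
```

On this instance SPT yields a mean flow time of 11.0 versus 15.2 for FCFS, illustrating why a look-up table of rule performance by scenario is useful.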
School Attendance Problems: Using the TQM Tools To Identify Root Causes.
ERIC Educational Resources Information Center
Weller, L. David
2000-01-01
Deming's principles and TQM problem-solving tools and techniques can be used to solve noninstructional problems such as vandalism, dropouts, and student absenteeism. This case study presents a model for principals to apply to identify root causes, resolve problems, and provide quality outcomes (at reduced cost) in noninstructional areas. (Contains…
The Microcomputer--A Problem Solving Tool.
ERIC Educational Resources Information Center
Hoelscher, Karen J.
Designed to assist teachers in using the microcomputer as a tool to teach problem solving strategies, this document is divided into two sections: the first introduces the concept of problem solving as a thinking process, and suggests means by which a teacher can become an effective guide for the learning of problem solving skills; the second…
Using Digital Mapping Tool in Ill-Structured Problem Solving
ERIC Educational Resources Information Center
Bai, Hua
2013-01-01
Scaffolding students' problem solving and helping them to improve problem solving skills are critical in instructional design courses. This study investigated the effects of students' uses of a digital mapping tool on their problem solving performance in a design case study. It was found that the students who used the digital mapping tool…
Stiles, Brandie M; Fish, Anne F; Vandermause, Roxanne; Malik, Azfar M
2018-06-01
Up to 40% of patients with bipolar disorder are misdiagnosed, usually with major depressive disorder. The purpose was to describe the current state of the science of the misdiagnosis of bipolar disorder, with the ultimate goal of improving psychiatric diagnostic workups including screening. An integrative review was conducted using standard criteria for evaluating research articles. Forty-nine articles met the eligibility criteria. Articles explored patient-related and health care provider-related factors contributing to the misdiagnosis of bipolar disorder as well as consequences of misdiagnosis. Clinically oriented, reliable, and valid screening tools for bipolar disorder also were reviewed. Awareness of multiple, challenging patient-related factors and more comprehensive assessment and screening by health care providers may reduce misdiagnosis.
The upside down world of diabetes care medical economics and what we might do to improve it.
Harlan, David M; Hirsch, Irl B
2017-04-01
Increasingly over the past generation, the American healthcare delivery system has received consistently poor marks with regard to public health outcomes and costs. This review by two seasoned diabetes care providers is intended to shed light on the fundamental flaws we believe to underlie that poor performance, and suggest options for better outcomes and cost efficiencies. Despite major advances in diabetes management medications and tools, overall public health with regard to diabetes outcomes remains poor. Efforts focused on controlling costs appear to be exacerbating the problem. For chronic diseases like diabetes, fee-for-service care models are fundamentally flawed and predictably fail. We suggest that a major overhaul of the medical economics underlying diabetes care can improve patient outcomes and decrease costs.
Nerys-Junior, Arildo; Costa, Lendel C.; Braga-Dias, Luciene P.; Oliveira, Márcia; Rossi, Átila D.; da Cunha, Rodrigo Delvecchio; Gonçalves, Gabriel S.; Tanuri, Amilcar
2014-01-01
Engineered nucleases such as zinc finger nucleases (ZFN) and transcription activator-like effector nucleases (TALEN) are one of the most promising tools for modifying genomes. These site-specific enzymes cause double-strand breaks that allow gene disruption or gene insertion, thereby facilitating genetic manipulation. The major problem associated with this approach is the labor-intensive procedures required to screen and confirm the cellular modification by nucleases. In this work, we produced a TALEN that targets the human CCR5 gene and developed a heteroduplex mobility assay for HEK 293T cells to select positive colonies for sequencing. This approach provides a useful tool for the quick detection and easy assessment of nuclease activity. PMID:24688299
The efficiency of geophysical adjoint codes generated by automatic differentiation tools
NASA Astrophysics Data System (ADS)
Vlasenko, A. V.; Köhl, A.; Stammer, D.
2016-02-01
The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing even large geophysical data assimilation problems to be handled. However, they can only be applied to numerical models written in earlier generations of programming languages.
Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continuous use of AD tools for solving geophysical problems on modern computer architectures.
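The operator-overloading flavor of AD discussed above can be sketched with forward-mode dual numbers. This toy illustrates only the principle; the tools named in the abstract (TAF, Tapenade, Open_AD, NAGWare) generate far more sophisticated adjoint code, typically in reverse mode.

```python
# Sketch: forward-mode AD via operator overloading with dual numbers.
# Each Dual carries a value and the derivative propagated alongside it.

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u v' + u' v
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)

    __rmul__ = __mul__

def derivative(f, x):
    """Seed the input derivative with 1.0 and read the propagated derivative."""
    return f(Dual(x, 1.0)).deriv

# d/dx (x^2 + 3x) at x = 2 is 2*2 + 3 = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))
```

Source transformation tools instead emit new derivative code at compile time, which is why they can be orders of magnitude faster than overloading-based evaluation like this.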
Why do children lack the flexibility to innovate tools?
Cutting, Nicola; Apperly, Ian A; Beck, Sarah R
2011-08-01
Despite being proficient tool users, young children have surprising difficulty in innovating tools (making novel tools to solve problems). Two experiments found that 4- to 7-year-olds had difficulty on two tool innovation problems and explored reasons for this inflexibility. Experiment 1 (N=51) showed that children's performance was unaffected by the need to switch away from previously correct strategies. Experiment 2 (N=92) suggested that children's difficulty could not easily be explained by task pragmatics or permission issues. Both experiments found evidence that some children perseverated on a single incorrect strategy, but such perseveration was insufficient to explain children's tendency not to innovate tools. We suggest that children's difficulty lies not with switching, task pragmatics, or behavioral perseveration but rather with solving the fundamentally "ill-structured" nature of tool innovation problems. Copyright © 2011 Elsevier Inc. All rights reserved.
Mechanical problem-solving strategies in left-brain damaged patients and apraxia of tool use.
Osiurak, François; Jarry, Christophe; Lesourd, Mathieu; Baumard, Josselin; Le Gall, Didier
2013-08-01
Left brain damage (LBD) can impair the ability to use familiar tools (apraxia of tool use) as well as novel tools to solve mechanical problems. Thus far, the emphasis has been placed on quantitative analyses of patients' performance. Nevertheless, the question still to be answered is, what are the strategies employed by those patients when confronted with tool use situations? To answer it, we asked 16 LBD patients and 43 healthy controls to solve mechanical problems by means of several potential tools. To specify the strategies, we recorded the time spent in performing four kinds of action (no manipulation, tool manipulation, box manipulation, and tool-box manipulation) as well as the number of relevant and irrelevant tools grasped. We compared LBD patients' performance with that of controls who encountered difficulties with the task (controls-) or not (controls+). Our results indicated that LBD patients grasped a higher number of irrelevant tools than controls+ and controls-. Concerning time allocation, controls+ and controls- spent significantly more time in performing tool-box manipulation than LBD patients. These results are inconsistent with the possibility that LBD patients could engage in trial-and-error strategies and, rather, suggest that they tend to be perplexed. These findings seem to indicate that the inability to reason about the objects' physical properties might prevent LBD patients from following any problem-solving strategy. Copyright © 2013 Elsevier Ltd. All rights reserved.
Mechanical problem-solving strategies in Alzheimer's disease and semantic dementia.
Lesourd, Mathieu; Baumard, Josselin; Jarry, Christophe; Etcharry-Bouyx, Frédérique; Belliard, Serge; Moreaud, Olivier; Croisile, Bernard; Chauviré, Valérie; Granjon, Marine; Le Gall, Didier; Osiurak, François
2016-07-01
The goal of this study was to explore whether the tool-use disorders observed in Alzheimer's disease (AD) and semantic dementia (SD) are of the same nature as those observed in left brain-damaged (LBD) patients. Recent evidence indicates that LBD patients with apraxia of tool use encounter difficulties in solving mechanical problems, characterized by the absence of specific strategies. This pattern may show the presence of impaired mechanical knowledge, critical for both familiar and novel tool use. So, we explored the strategies followed by AD and SD patients in mechanical problem-solving tasks in order to determine whether mechanical knowledge is also impaired in these patients. We used a mechanical problem-solving task in both choice (i.e., several tools were proposed) and no-choice (i.e., only 1 tool was proposed) conditions. We analyzed quantitative data and strategy profiles. AD patients but not SD patients met difficulties in solving mechanical problem-solving tasks. However, the key finding is that AD patients, despite their difficulties, showed strategy profiles that are similar to that of SD patients or controls. Moreover, AD patients exhibited a strategy profile distinct from the one previously observed in LBD patients. Those observations lead us to consider that difficulties met by AD patients to solve mechanical problems or even to use familiar tools may not be caused by mechanical knowledge impairment per se. In broad terms, what we call apraxia of tool use in AD is certainly not the same as apraxia of tool use observed in LBD patients. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Durga Prasada Rao, V.; Harsha, N.; Raghu Ram, N. S.; Navya Geethika, V.
2018-02-01
In this work, turning was performed to optimize the surface finish or roughness (Ra) of stainless steel 304 with uncoated and coated carbide tools under dry conditions. The carbide tools were coated with a Titanium Aluminium Nitride (TiAlN) nano-coating applied by the Physical Vapour Deposition (PVD) method. The machining parameters, viz. cutting speed, depth of cut and feed rate, which have a major impact on Ra, are considered during turning. The experiments are designed as per a Taguchi orthogonal array and the machining process is carried out accordingly. Second-order regression equations are then developed for Ra in terms of the machining parameters on the basis of the experimental results. Regarding the effect of the machining parameters, an upward trend is observed in Ra with respect to feed rate, and as cutting speed increases the Ra value increases slightly due to chatter and vibration. The adequacy of the response variable (Ra) is tested by conducting additional experiments. The predicted Ra values are found to be a close match to the corresponding experimental values for uncoated and coated tools, and the corresponding average % errors are within acceptable limits. The surface roughness equations of the uncoated and coated tools are then set as the objectives of an optimization problem, which is solved using the Differential Evolution (DE) algorithm. The tool lives of the uncoated and coated tools are also predicted using Taylor's tool life equation.
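The Differential Evolution step can be sketched against a hypothetical quadratic stand-in for the fitted Ra regression; the paper's actual second-order coefficients are not reproduced here, and all parameters below are illustrative.

```python
# Sketch: Differential Evolution minimizing a made-up surface-roughness
# surrogate Ra(v, f) over normalized cutting speed v and feed rate f.
import random

def ra(x):
    v, f = x   # hypothetical normalized machining parameters
    return (v - 0.3) ** 2 + 2 * (f - 0.6) ** 2 + 0.8

def differential_evolution(obj, dim=2, pop_size=20, F=0.7, CR=0.9, gens=200):
    random.seed(1)
    pop = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: a + F*(b - c) from three distinct other members.
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            # Binomial crossover between mutant and target vector.
            trial = [a[d] + F * (b[d] - c[d]) if random.random() < CR else pop[i][d]
                     for d in range(dim)]
            if obj(trial) < obj(pop[i]):   # greedy selection
                pop[i] = trial
    return min(pop, key=obj)

best = differential_evolution(ra)
print(best, ra(best))   # should approach (0.3, 0.6) where Ra is minimal
```

DE needs only objective evaluations, no gradients, which is why it pairs naturally with fitted regression surfaces like the Ra equations described above.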
Review of computational fluid dynamics applications in biotechnology processes.
Sharma, C; Malhotra, D; Rathore, A S
2011-01-01
Computational fluid dynamics (CFD) is well established as a tool of choice for solving problems that involve one or more of the following phenomena: flow of fluids, heat transfer, mass transfer, and chemical reaction. Unit operations that are commonly utilized in biotechnology processes are often complex and as such would greatly benefit from application of CFD. The thirst for deeper process and product understanding that has arisen out of initiatives such as quality by design provides further impetus toward usefulness of CFD for problems that may otherwise require extensive experimentation. Not surprisingly, there has been increasing interest in applying CFD toward a variety of applications in biotechnology processing in the last decade. In this article, we will review applications in the major unit operations involved with processing of biotechnology products. These include fermentation, centrifugation, chromatography, ultrafiltration, microfiltration, and freeze drying. We feel that the future applications of CFD in biotechnology processing will focus on establishing CFD as a tool of choice for providing process understanding that can be then used to guide more efficient and effective experimentation. This article puts special emphasis on the work done in the last 10 years. © 2011 American Institute of Chemical Engineers
Students' Problem Solving as Mediated by Their Cognitive Tool Use: A Study of Tool Use Patterns
ERIC Educational Resources Information Center
Liu, M.; Horton, L. R.; Corliss, S. B.; Svinicki, M. D.; Bogard, T.; Kim, J.; Chang, M.
2009-01-01
The purpose of this study was to use multiple data sources, both objective and subjective, to capture students' thinking processes as they were engaged in problem solving, examine the cognitive tool use patterns, and understand what tools were used and why they were used. The findings of this study confirmed previous research and provided clear…
Software Process Automation: Experiences from the Trenches.
1996-07-01
(Fragment of a process/tool-integration table: Weaver used to integrate a problem database and other tools with WordPerfect, All-in-One, Oracle and CM; FrameMaker with CM; … handle change requests and problem reports.)
* Autoplan, a project management tool
* FrameMaker, a document processing system
* Worldview, a document …
Other tools mentioned: Cadre TeamWork, FrameMaker, something for requirements traceability, their own homegrown scheduling tool, and their own homegrown tool integrator.
3D first-arrival traveltime tomography with modified total variation regularization
NASA Astrophysics Data System (ADS)
Jiang, Wenbin; Zhang, Jie
2018-02-01
Three-dimensional (3D) seismic surveys have become a major tool in the exploration and exploitation of hydrocarbons. 3D seismic first-arrival traveltime tomography is a robust method for near-surface velocity estimation. A common approach for stabilizing the ill-posed inverse problem is to apply Tikhonov regularization to the inversion. However, the Tikhonov regularization method recovers smooth local structures while blurring the sharp features in the model solution. We present a 3D first-arrival traveltime tomography method with modified total variation (MTV) regularization to preserve sharp velocity contrasts and improve the accuracy of velocity inversion. To solve the minimization problem of the new traveltime tomography method, we decouple the original optimization problem into the following two subproblems: a standard traveltime tomography problem with traditional Tikhonov regularization, and an L2 total variation problem. We apply the conjugate gradient method and the split-Bregman iterative method to solve these two subproblems, respectively. Our synthetic examples show that the new method produces higher resolution models than conventional traveltime tomography with Tikhonov regularization. We apply the technique to field data. The stacking section shows significant improvements with static corrections from the MTV traveltime tomography.
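The two decoupled ingredients described above can be sketched in miniature: a Tikhonov-regularized least-squares solve, and the soft-thresholding ("shrink") operator that drives split-Bregman iterations for the total variation subproblem. The matrix and data below are toy values, not a traveltime kernel.

```python
# Sketch: the two subproblem building blocks of a decoupled Tikhonov + TV
# inversion, on toy-sized data.
import numpy as np

def tikhonov_solve(A, b, alpha):
    """argmin_x ||Ax - b||^2 + alpha*||x||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

def shrink(v, t):
    """Soft threshold: elementwise argmin_d t*|d| + 0.5*(d - v)^2."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

A = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
b = np.array([1.0, 2.0, 1.0])
x = tikhonov_solve(A, b, alpha=0.1)
print(x)                                           # close to [1, 1]
print(shrink(np.array([-2.0, 0.05, 3.0]), 0.5))    # [-1.5, 0, 2.5]
```

In the full method, the Tikhonov-style solve updates the velocity model and the shrink step promotes sparse model gradients, preserving sharp contrasts; here a direct solve stands in for the paper's conjugate gradient iteration.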
Complex fuzzy soft expert sets
NASA Astrophysics Data System (ADS)
Selvachandran, Ganeshsree; Hafeed, Nisren A.; Salleh, Abdul Razak
2017-04-01
Complex fuzzy sets and their accompanying theory, although still in their infancy, have proven to be superior to classical type-1 fuzzy sets, due to their ability to represent time-periodic problem parameters and to capture the seasonality of the fuzziness that exists in the elements of a set. These are important characteristics that are pervasive in most real-world problems. However, two major problems are inherent in complex fuzzy sets: they lack a sufficient parameterization tool, and they have no mechanism to validate the values assigned to the membership functions of the elements in a set. To overcome these problems, we propose the notion of complex fuzzy soft expert sets, a hybrid model of complex fuzzy sets and soft expert sets. This model incorporates the advantages of complex fuzzy sets and soft sets, and has the added advantage of allowing users to know the opinions of all the experts in a single model without the need for any additional cumbersome operations. As such, this model effectively improves the accuracy of representation of problem parameters that are periodic in nature, while offering a higher level of computational efficiency compared to similar models in the literature.
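A complex membership grade r·e^{iθ} can be sketched as follows: amplitude r carries the degree of membership and phase θ the periodic (e.g. seasonal) component. The union rule shown (max amplitude, max phase) is one convention from the complex fuzzy set literature, chosen here only for illustration.

```python
# Sketch: complex-valued membership grades r*e^{i*theta} and a union
# under one (of several) phase-combination conventions.
import cmath

def grade(r, theta):
    """Membership grade with amplitude r in [0, 1] and phase theta."""
    return r * cmath.exp(1j * theta)

def union(g1, g2):
    """Union: max of amplitudes, max of phases (illustrative convention)."""
    r = max(abs(g1), abs(g2))
    theta = max(cmath.phase(g1), cmath.phase(g2))
    return grade(r, theta)

a = grade(0.4, 0.5)   # weak membership, late-season phase
b = grade(0.9, 0.2)   # strong membership, early-season phase
u = union(a, b)
print(abs(u), cmath.phase(u))   # amplitude 0.9, phase 0.5
```

The soft expert extension proposed in the abstract would attach such grades to (parameter, expert, opinion) triples rather than to elements alone.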
Hierarchical Winner-Take-All Particle Swarm Optimization Social Network for Neural Model Fitting
Coventry, Brandon S.; Parthasarathy, Aravindakshan; Sommer, Alexandra L.; Bartlett, Edward L.
2016-01-01
Particle swarm optimization (PSO) has gained widespread use as a general mathematical programming paradigm and seen use in a wide variety of optimization and machine learning problems. In this work, we introduce a new variant on the PSO social network and apply this method to the inverse problem of input parameter selection from recorded auditory neuron tuning curves. The topology of a PSO social network is a major contributor to optimization success. Here we propose a new social network which draws influence from winner-take-all coding found in visual cortical neurons. We show that the winner-take-all network performs exceptionally well on optimization problems with greater than 5 dimensions and runs at a lower iteration count as compared to other PSO topologies. Finally we show that this variant of PSO is able to recreate auditory frequency tuning curves and modulation transfer functions, making it a potentially useful tool for computational neuroscience models. PMID:27726048
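For context, a plain global-best PSO, the baseline that the winner-take-all topology variant modifies, can be sketched as follows. Swarm parameters and the test function are illustrative, not those used in the paper.

```python
# Sketch: global-best particle swarm optimization on a sphere function.
import random

def pso(obj, dim=3, swarm=15, iters=300, w=0.7, c1=1.5, c2=1.5):
    random.seed(0)
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]            # each particle's best position
    gbest = min(pbest, key=obj)            # swarm-wide best position
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                # Inertia + cognitive pull (pbest) + social pull (gbest).
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if obj(pos[i]) < obj(pbest[i]):
                pbest[i] = pos[i][:]
                if obj(pbest[i]) < obj(gbest):
                    gbest = pbest[i][:]
    return gbest

sphere = lambda x: sum(v * v for v in x)
best = pso(sphere)
print(sphere(best))   # near zero
```

The social term is where topology matters: here every particle sees one global best, whereas the winner-take-all variant restricts which particles' bests exert influence.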
Nelson, Carl A; Miller, David J; Oleynikov, Dmitry
2008-01-01
As modular systems come into the forefront of robotic telesurgery, streamlining the process of selecting surgical tools becomes an important consideration. This paper presents a method for optimal queuing of tools in modular surgical tool systems, based on patterns in tool-use sequences, in order to minimize time spent changing tools. The solution approach is to model the set of tools as a graph, with tool-change frequency expressed as edge weights in the graph, and to solve the Traveling Salesman Problem for the graph. In a set of simulations, this method has shown superior performance at optimizing tool arrangements for streamlining surgical procedures.
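The tour-finding idea can be sketched by brute force on a toy instance. The tool names and change frequencies below are hypothetical, and this sketch maximizes total adjacent-change frequency so that frequently alternated tools sit next to each other; the paper's actual graph weights and solver are not reproduced.

```python
# Sketch: tools as graph nodes, tool-change frequency as edge weights,
# and an exhaustive search for the best cyclic ordering (toy TSP).
from itertools import permutations

freq = {("grasper", "scissors"): 9, ("grasper", "cautery"): 4,
        ("scissors", "cautery"): 2, ("grasper", "stapler"): 1,
        ("scissors", "stapler"): 3, ("cautery", "stapler"): 6}

def w(a, b):
    return freq.get((a, b), freq.get((b, a), 0))

tools = ["grasper", "scissors", "cautery", "stapler"]

def best_order(tools):
    # Fix the first tool to remove the rotational symmetry of the tour.
    first, rest = tools[0], tools[1:]
    def score(order):
        cycle = [first] + list(order)
        return sum(w(cycle[i], cycle[(i + 1) % len(cycle)])
                   for i in range(len(cycle)))
    return max(permutations(rest), key=score)

print((tools[0],) + best_order(tools))
```

Brute force is fine for a handful of tool slots; for larger modular systems a heuristic TSP solver would be substituted, since the search space grows factorially.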
NASA Astrophysics Data System (ADS)
Abayan, Kenneth Munoz
Stoichiometry is a fundamental topic in chemistry that quantifies the relationships between atoms, molecules, etc. Stoichiometry is usually taught using expository teaching methods: students are passively given information in the hope that they will retain it and be able to solve stoichiometry problems masterfully. Cognitive science research has shown that this kind of instructional method is not very effective for meaningful learning. Instead, students must take ownership of their learning; they need to actively construct their own knowledge by receiving, interpreting, integrating and reorganizing information into their own mental schemas. In the absence of active learning practices, tools must be created that scaffold difficult problems by encoding the opportunities necessary to make the construction of knowledge memorable, thereby creating a usable knowledge base. An online e-learning tool, with its potential to create a dynamic and interactive learning environment, may facilitate the learning of stoichiometry. The study entailed requests for volunteer students, an IRB consent form, a baseline questionnaire, random assignment to treatment, pre- and post-test assessment, and a post-assessment survey, all administered online. A stoichiometry-based assessment was given in a proctored examination at the University of Texas at Arlington (UTA) campus. The volunteer students who took part in these studies were at least 18 years of age and were enrolled in General Chemistry 1441 at the University of Texas at Arlington. Each participant gave informed consent for the use of their data in this study. Students were randomly assigned to one of four treatment groups: three based on teaching methodology (Dimensional Analysis, Operational Method, Ratios and Proportions) and a control group that received instruction through lecture only.
In this study, an e-learning tool was created to demonstrate several methodologies for solving stoichiometry problems, all supported by chemical education research. Comparisons of student performance on the pre- and post-test assessments and on a stoichiometry-based examination were made to determine whether the information provided within the e-learning tool yielded greater learning outcomes compared with students who lacked the scaffolded learning material. The e-learning tool was created to help scaffold the problem-solving process necessary to help students (N=394) solve stoichiometry problems. The study investigated possible predictors of success on a stoichiometry-based examination, students' conceptual understanding of solving stoichiometry problems, and their explanations of their reasoning. It was found that the way a student answered a given stoichiometry question (i.e., whether the student used dimensional analysis, the operational method or any other process) was not statistically significant (p=0.05). More importantly, students who were able to describe their thought process clearly scored significantly higher on the stoichiometry test (mean 84, p<0.05). This finding has major implications for teaching the topic, as lecturers tend to stress and focus on the method rather than the process of solving stoichiometry problems.
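The dimensional-analysis method named among the treatments can be written as an explicit conversion chain. The reaction and numbers below are a hypothetical worked example, not content taken from the e-learning tool.

```python
# Sketch: dimensional analysis for 2 H2 + O2 -> 2 H2O,
# grams given -> moles given -> moles product -> grams product.

MOLAR_MASS = {"H2": 2.016, "O2": 32.00, "H2O": 18.02}  # g/mol

def grams_product(grams_given, given, product, mole_ratio):
    """Chain of unit conversions; mole_ratio comes from the balanced equation."""
    mol_given = grams_given / MOLAR_MASS[given]     # g -> mol of given species
    mol_product = mol_given * mole_ratio            # mol given -> mol product
    return mol_product * MOLAR_MASS[product]        # mol -> g of product

# How many grams of water form from 8.0 g of H2 (excess O2)?
# Mole ratio H2O : H2 = 2 : 2 = 1.
print(round(grams_product(8.0, "H2", "H2O", 1.0), 1))
```

Each line corresponds to one "rail" of the dimensional-analysis setup students write on paper, which is what makes the method amenable to step-by-step scaffolding.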
GlycoWorkbench: a tool for the computer-assisted annotation of mass spectra of glycans.
Ceroni, Alessio; Maass, Kai; Geyer, Hildegard; Geyer, Rudolf; Dell, Anne; Haslam, Stuart M
2008-04-01
Mass spectrometry is the main analytical technique currently used to address the challenges of glycomics as it offers unrivalled levels of sensitivity and the ability to handle complex mixtures of different glycan variations. Determination of glycan structures from analysis of MS data is a major bottleneck in high-throughput glycomics projects, and robust solutions to this problem are of critical importance. However, all the approaches currently available have inherent restrictions on the types of glycans they can identify, and none of them has proved to be a definitive tool for glycomics. GlycoWorkbench is a software tool developed by the EUROCarbDB initiative to assist the manual interpretation of MS data. The main task of GlycoWorkbench is to evaluate a set of structures proposed by the user by matching the corresponding theoretical list of fragment masses against the list of peaks derived from the spectrum. The tool provides an easy-to-use graphical interface, a comprehensive and increasing set of structural constituents, an exhaustive collection of fragmentation types, and a broad list of annotation options. The aim of GlycoWorkbench is to offer complete support for the routine interpretation of MS data. The software is available for download from: http://www.eurocarbdb.org/applications/ms-tools.
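The core matching step described above, comparing a theoretical fragment-mass list against observed spectrum peaks within a mass tolerance, can be sketched as follows. This is a simplified illustration, not GlycoWorkbench's actual code or API; the fragment names, masses, and Dalton-based tolerance are invented for the example (real tools often express tolerance in ppm).

```python
# Simplified sketch of fragment-mass annotation: for each theoretical
# fragment mass of a proposed structure, report any observed peak whose
# m/z lies within the tolerance. All values below are illustrative.

def annotate(theoretical, peaks, tol=0.5):
    """Return (fragment_name, theoretical_mass, observed_mz) matches."""
    matches = []
    for name, mass in theoretical:
        for mz, intensity in peaks:
            if abs(mass - mz) <= tol:
                matches.append((name, mass, mz))
    return matches

theoretical = [("B2 ion", 528.19), ("Y1 ion", 389.15)]
peaks = [(389.2, 1200.0), (528.1, 900.0), (700.3, 150.0)]

# Two peaks are annotated; the 700.3 peak stays unassigned.
print(annotate(theoretical, peaks))
```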
Modeling languages for biochemical network simulation: reaction vs equation based approaches.
Wiechert, Wolfgang; Noack, Stephan; Elsheikh, Atya
2010-01-01
Biochemical network modeling and simulation is an essential task in any systems biology project. The systems biology markup language (SBML) was established as a standardized model exchange language for mechanistic models. A specific strength of SBML is that numerous tools for formulating, processing, simulation and analysis of models are freely available. Interestingly, in the field of multidisciplinary simulation, the problem of model exchange between different simulation tools occurred much earlier. Several general modeling languages like Modelica have been developed in the 1990s. Modelica enables an equation based modular specification of arbitrary hierarchical differential algebraic equation models. Moreover, libraries for special application domains can be rapidly developed. This contribution compares the reaction based approach of SBML with the equation based approach of Modelica and explains the specific strengths of both tools. Several biological examples illustrating essential SBML and Modelica concepts are given. The chosen criteria for tool comparison are flexibility for constraint specification, different modeling flavors, hierarchical, modular and multidisciplinary modeling. Additionally, support for spatially distributed systems, event handling and network analysis features is discussed. As a major result it is shown that the choice of the modeling tool has a strong impact on the expressivity of the specified models but also strongly depends on the requirements of the application context.
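The contrast drawn above can be sketched in a few lines: a reaction-based specification (the SBML view) lists reactions with stoichiometries and rate constants, from which the equation-based view (the ODE system a Modelica model would state directly) can be derived mechanically. The reaction, mass-action rate law, and explicit-Euler integrator below are illustrative assumptions, not actual SBML or Modelica syntax.

```python
# Reaction-based specification: A + B -> C with mass-action rate k*[A]*[B].
# Each entry is (reactant stoichiometries, product stoichiometries, k).
reactions = [({"A": 1, "B": 1}, {"C": 1}, 0.5)]

def rhs(conc):
    """Derive d[species]/dt from the reaction list (the 'equation' view)."""
    d = {s: 0.0 for s in conc}
    for reactants, products, k in reactions:
        rate = k
        for s, n in reactants.items():
            rate *= conc[s] ** n          # mass-action kinetics
        for s, n in reactants.items():
            d[s] -= n * rate              # reactants are consumed
        for s, n in products.items():
            d[s] += n * rate              # products are formed
    return d

# Explicit-Euler integration of the derived ODEs to t = 10
conc = {"A": 1.0, "B": 1.0, "C": 0.0}
dt = 0.01
for _ in range(1000):
    d = rhs(conc)
    conc = {s: conc[s] + dt * d[s] for s in conc}
print({s: round(v, 3) for s, v in conc.items()})
```

The analytic solution for equal initial concentrations is [A](t) = 1/(1 + 0.5t), so [C] approaches roughly 0.83 at t = 10, which the Euler sketch reproduces approximately.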
Frick, Aurélien; Clément, Fabrice; Gruber, Thibaud
2017-12-01
Children are skilful at acquiring tool-using skills by faithfully copying relevant and irrelevant actions performed by others, but poor at innovating tools to solve problems. Five- to twelve-year-old urban French and rural Serbian children (N = 208) were exposed to a Hook task; a jar containing a reward in a bucket and a pipe cleaner as potential recovering tool material. In both countries, few children under the age of 10 made a hook from the pipe cleaner to retrieve the reward on their own. However, from five onward, the majority of unsuccessful children succeeded after seeing an adult model manufacturing a hook without completing the task. Additionally, a third of the children who observed a similar demonstration including an irrelevant action performed with a second object, a string, replicated this meaningless action. Children's difficulty with innovation and early capacity for overimitation thus do not depend on socio-economic background. Strikingly, we document a sex difference in overimitation across cultures, with boys engaging more in overimitation than girls, a finding that may result from differences regarding explorative tool-related behaviour. This male-biased sex effect sheds new light on our understanding of overimitation, and more generally, on how human tool culture evolved.
Suzer, Ozge
2015-05-01
The prioritization of environmental concerns within globally used green building rating systems is a fundamental issue, since it determines how the performance of a structure or development is reflected. Certain nationally developed certification systems are used globally without being adjusted for local geographical, cultural, economic and social parameters. This may lead to a situation where the results of an evaluation do not reflect the reality of the region and/or the site of construction. The main objective of this paper is to examine and underline the problems regarding the weighting of environmental concerns in the Leadership in Energy and Environmental Design (LEED) certification system, a US-originated but globally used assessment tool. The methodology of this study consists of: (i) an analysis of the approach of LEED in the New Construction and Major Renovations scheme in version 3 (LEED NC, v.3) and the Building Design and Construction scheme in version 4 (LEED BD + C, v.4); (ii) case studies in which regional priority credits (RPCs) set by LEED for four countries (Canada, Turkey, China and Egypt) are critiqued with respect to the countries' own local conditions; and (iii) an analysis of the approaches of the major environmental assessment tools, namely BREEAM, SBTool, CASBEE and Green Star, in comparison to that of LEED. This work shows that, even in its latest version (v.4), LEED still displays some inadequacies and inconsistencies in environmental concern prioritization and has not yet incorporated a system that is more sensitive to this issue. This paper further outlines the differences and similarities between the approaches of the aforementioned assessment tools with respect to the issue of concern, and the factors that should be integrated into future versions of LEED.
MemAxes: Visualization and Analytics for Characterizing Complex Memory Performance Behaviors.
Gimenez, Alfredo; Gamblin, Todd; Jusufi, Ilir; Bhatele, Abhinav; Schulz, Martin; Bremer, Peer-Timo; Hamann, Bernd
2018-07-01
Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Existing tools target only specific factors of memory performance, such as hardware layout, allocations, or access instructions. However, today's tools do not suffice to characterize the complex relationships between these factors. Further, they require advanced expertise to be used effectively. We present MemAxes, a tool based on a novel approach for analytic-driven visualization of memory performance data. MemAxes uniquely allows users to analyze the different aspects related to memory performance by providing multiple visual contexts for a centralized dataset. We define mappings of sampled memory access data to new and existing visual metaphors, each of which enables a user to perform different analysis tasks. We present methods to guide user interaction by scoring subsets of the data based on known performance problems. This scoring is used to provide visual cues and automatically extract clusters of interest. We designed MemAxes in collaboration with experts in HPC and demonstrate its effectiveness in case studies.
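The scoring idea described above can be sketched roughly as follows; the sample fields, latency thresholds, and scoring rule are invented for illustration and are not MemAxes' actual data model or API.

```python
# Hypothetical sketch of scoring memory-access samples against a known
# performance problem: long access latencies suggesting cache misses.
# Samples with scores above a cutoff form the extracted cluster of interest.

samples = [
    {"addr": 0x1000, "latency_cycles": 4,   "cpu": 0},   # L1-like hit
    {"addr": 0x2000, "latency_cycles": 310, "cpu": 1},   # DRAM-like access
    {"addr": 0x2040, "latency_cycles": 290, "cpu": 1},   # DRAM-like access
]

def score(sample, l1_hit=10, dram=300):
    """0.0 for an L1-like hit, ~1.0 at DRAM-like latency (clamped)."""
    s = (sample["latency_cycles"] - l1_hit) / (dram - l1_hit)
    return max(0.0, min(1.0, s))

hotspots = [s for s in samples if score(s) > 0.5]
print(len(hotspots))  # 2
```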
Problems in Choosing Tools and Methods for Teaching Programming
ERIC Educational Resources Information Center
Vitkute-Adžgauskiene, Davia; Vidžiunas, Antanas
2012-01-01
The paper analyses the problems in selecting and integrating tools for delivering basic programming knowledge at the university level. Discussion and analysis of teaching the programming disciplines, the main principles of study programme design, requirements for teaching tools, methods and corresponding languages is presented, based on literature…
Writing in Groups as a Tool for Non-Routine Problem Solving in First Year University Mathematics
ERIC Educational Resources Information Center
Taylor, J. A.; McDonald, C.
2007-01-01
Development of mathematical problem solving skills is an age old problem in mathematics. This paper details the design of a component of a first year university mathematics course in which group work and mathematical communication skills, especially writing skills, are used as a tool to develop non-routine problem solving skills. In this design…
NASA Astrophysics Data System (ADS)
Esperon-Miguez, Manuel; John, Philip; Jennions, Ian K.
2013-01-01
Integrated Vehicle Health Management (IVHM) comprises a set of tools, technologies and techniques for automated detection, diagnosis and prognosis of faults in order to support platforms more efficiently. Specific challenges are faced when IVHM tools are retrofitted into legacy vehicles, since major modifications are much more difficult than with platforms whose design can still be changed. The topics covered in this review paper include the state of the art of IVHM tools and how their characteristics match the requirements of legacy aircraft, a summary of problems faced in the past when trying to retrofit IVHM tools, from both a technical and an organisational perspective, and the current level of implementation of IVHM in industry. Although the technology has not reached the level necessary to implement IVHM to its full potential for every kind of component, significant progress has been achieved on rotating equipment, structures and electronics. Attempts to retrofit some of these tools in the past faced both technical difficulties and opposition from some stakeholders, the latter being responsible for the failure of technically sound projects on more than one occasion. Nevertheless, despite these difficulties, products and services based on IVHM technology have started to be offered by manufacturers and, more importantly, demanded by operators, providing guidance on what the industry will demand from IVHM on legacy aircraft.
Tool use and mechanical problem solving in apraxia.
Goldenberg, G; Hagmann, S
1998-07-01
Moorlaas (1928) proposed that apraxic patients can identify objects and can remember the purpose they have been made for but do not know the way in which they must be used to achieve that purpose. Knowledge about the use of objects and tools can have two sources: it can be based on retrieval of instructions of use from semantic memory or on a direct inference of function from structure. The ability to infer function from structure enables subjects to use unfamiliar tools and to detect alternative uses of familiar tools. It is the basis of mechanical problem solving. The purpose of the present study was to analyze retrieval of instructions of use, mechanical problem solving, and actual tool use in patients with apraxia due to circumscribed lesions of the left hemisphere. For assessing mechanical problem solving we developed a test of selection and application of novel tools. Access to instructions of use was tested by pantomime of tool use. Actual tool use was examined for the same familiar tools. Forty-two patients with left brain damage (LBD) and aphasia, 22 patients with right brain damage (RBD) and 22 controls were examined. Only LBD patients differed from controls on all tests. RBD patients had difficulties with the use but not with the selection of novel tools. In LBD patients there was a significant correlation between pantomime of tool use and novel tool selection, but there were individual cases who scored in the defective range on one of these tests and normally on the other. Analysis of LBD patients' lesions suggested that frontal lobe damage does not disturb novel tool selection. Only LBD patients who failed on pantomime of object use and on novel tool selection committed errors in actual use of familiar tools.
The finding that mechanical problem solving is invariably defective in apraxic patients who commit errors with familiar tools is in good accord with clinical observations, as the gravity of their errors goes beyond what one would expect as a mere sequel of loss of access to instruction of use.
Applications of large-scale density functional theory in biology
NASA Astrophysics Data System (ADS)
Cole, Daniel J.; Hine, Nicholas D. M.
2016-10-01
Density functional theory (DFT) has become a routine tool for the computation of electronic structure in the physics, materials and chemistry fields. Yet the application of traditional DFT to problems in the biological sciences is hindered, to a large extent, by the unfavourable scaling of the computational effort with system size. Here, we review some of the major software and functionality advances that enable insightful electronic structure calculations to be performed on systems comprising many thousands of atoms. We describe some of the early applications of large-scale DFT to the computation of the electronic properties and structure of biomolecules, as well as to paradigmatic problems in enzymology, metalloproteins, photosynthesis and computer-aided drug design. With this review, we hope to demonstrate that first-principles modelling of biological structure-function relationships is approaching reality.
NASA Astrophysics Data System (ADS)
Mues, Sarah; Lilge, Inga; Schönherr, Holger; Kemper, Björn; Schnekenburger, Jürgen
2017-02-01
The major problem of long-term live-cell imaging with Digital Holographic Microscopy (DHM) is that over time most of the tracked cells move out of the image area and other ones move in. Therefore, most of the cells are lost for the evaluation of individual cellular processes. Here, we present an effective solution to this crucial problem of long-term microscopic live-cell analysis. We have generated functionalized slides containing areas of 250 μm × 200 μm. These micropatterned biointerfaces consist of passivating polyacrylamide brushes (PAAm); the inner areas are backfilled with octadecanethiol (ODT), which allows cell attachment. The fouling properties of these surfaces are highly controllable, and the defined areas, matched to the size of our microscopic image area, were effective in keeping all cells inside the rectangles over the selected imaging period.
NASA Astrophysics Data System (ADS)
Cummings, Karen; Marx, Jeffrey D.
2010-10-01
We have developed an assessment of students' ability to solve standard textbook style problems and are currently engaged in the validation and revision process. The assessment covers the topics of force and motion, conservation of momentum and conservation of energy at a level consistent with most calculus-based, introductory physics courses. This tool is discussed in more detail in an accompanying paper by Marx and Cummings [1]. Here we present preliminary beta-test data collected at four schools during the 2009/2010 academic year. Data include both pre- and post-instruction results for introductory physics courses as well as results for physics majors in later years. In addition, we present evidence that right/wrong grading may well be a perfectly acceptable grading procedure for a course-level assessment of this type.
The rising tide of ocean diseases: Unsolved problems and research priorities
Harvell, Drew; Aronson, Richard; Baron, Nancy; Connell, Joseph; Dobson, Andrew P.; Ellner, Steve; Gerber, Leah R.; Kim, Kiho; Kuris, Armand M.; McCallum, Hamish; Lafferty, Kevin D.; McKay, Bruce; Porter, James; Pascual, Mercedes; Smith, Garriett; Sutherland, Katherine; Ward, Jessica
2004-01-01
New studies have detected a rising number of reports of diseases in marine organisms such as corals, molluscs, turtles, mammals, and echinoderms over the past three decades. Despite the increasing disease load, microbiological, molecular, and theoretical tools for managing disease in the world's oceans are under-developed. Review of the new developments in the study of these diseases identifies five major unsolved problems and priorities for future research: (1) detecting origins and reservoirs for marine diseases and tracing the flow of some new pathogens from land to sea; (2) documenting the longevity and host range of infectious stages; (3) evaluating the effect of greater taxonomic diversity of marine relative to terrestrial hosts and pathogens; (4) pinpointing the facilitating role of anthropogenic agents as incubators and conveyors of marine pathogens; (5) adapting epidemiological models to analysis of marine disease.
The Intersection of Physics and Biology
Liphardt, Jan
2017-12-22
In April 1953, Watson and Crick largely defined the program of 20th century biology: obtaining the blueprint of life encoded in the DNA. Fifty years later, in 2003, the sequencing of the human genome was completed. Like any major scientific breakthrough, the sequencing of the human genome raised many more questions than it answered. I'll brief you on some of the big open problems in cell and developmental biology, and I'll explain why approaches, tools, and ideas from the physical sciences are currently reshaping biological research. Super-resolution light microscopies are revealing the intricate spatial organization of cells, single-molecule methods show how molecular machines function, and new probes are clarifying the role of mechanical forces in cell and tissue function. At the same time, Physics stands to gain beautiful new problems in soft condensed matter, quantum mechanics, and non-equilibrium thermodynamics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lherbier, Louis, W.; Novotnak, David, J.; Herling, Darrell, R.
Hot forming processes such as forging, die casting and glass forming require tooling that is subjected to high temperatures during the manufacturing of components. Current tooling is adversely affected by prolonged exposure at high temperatures. Initial studies were conducted to determine the root cause of tool failures in a number of applications. Results show that tool failures vary and depend on the operating environment under which the tools are used. Major root-cause failures include (1) thermal softening, (2) fatigue and (3) tool erosion, all of which are affected by process boundary conditions such as lubrication, cooling, process speed, etc. While thermal management is a key to addressing tooling failures, it was clear that new tooling materials with superior high-temperature strength could provide improved manufacturing efficiencies. These efficiencies are based on the use of functionally graded materials (FGM), a new subset of hybrid tools with customizable properties that can be fabricated using advanced powder metallurgy manufacturing technologies. Modeling studies of the various hot forming processes helped identify the effect of key variables such as stress, temperature and cooling rate, and aided in the selection of tooling materials for specific applications. To address the problem of high-temperature strength, several advanced powder metallurgy nickel- and cobalt-based alloys were selected for evaluation. These materials were manufactured into tooling using two relatively new consolidation processes: laser powder deposition (LPD) and solid state dynamic powder consolidation (SSDPC). These processes made possible functionally graded materials that resulted in shaped tooling that was monolithic, bi-metallic or substrate coated. Manufacturing of tooling with these processes was determined to be robust and consistent for a variety of materials.
Prototype and production testing of FGM tooling showed the benefits of the nickel- and cobalt-based powder metallurgy alloys in a number of the applications evaluated. Improvements in tool life ranged from three to twenty or more times that of currently used tooling. Improvements were most dramatic where tool softening and deformation were the major causes of tool failure in hot/warm forging applications. Significant improvement was also noted in the erosion of aluminum die-casting tooling. Cost and energy savings can be realized as a result of increased tooling life, increased productivity and a reduction in scrap because of improved dimensional control. Although LPD and SSDPC tooling usually have higher acquisition costs, net tooling cost per component produced drops dramatically with superior tool performance. Less energy is used to manufacture the tooling because fewer tools are required and less recycling of used tools is needed for the hot forming process. Energy is saved during the component manufacturing cycle because more parts can be produced in shorter periods of time. Energy is also saved by minimizing heating-furnace idling time because of less downtime for tooling changes.
Trends and Issues in Fuzzy Control and Neuro-Fuzzy Modeling
NASA Technical Reports Server (NTRS)
Chiu, Stephen
1996-01-01
Everyday experience in building and repairing things around the home has taught us the importance of using the right tool for the right job. Although we tend to think of a 'job' in broad terms, such as 'build a bookcase,' we understand well that the 'right job' associated with each 'right tool' is typically a narrowly bounded subtask, such as 'tighten the screws.' Unfortunately, we often lose sight of this principle when solving engineering problems; we treat a broadly defined problem, such as controlling or modeling a system, as a narrow one that has a single 'right tool' (e.g., linear analysis, fuzzy logic, neural network). We need to recognize that a typical real-world problem contains a number of different sub-problems, and that a truly optimal solution (the best combination of cost, performance and features) is obtained by applying the right tool to the right sub-problem. Here I share some of my perspectives on what constitutes the 'right job' for fuzzy control and describe recent advances in neuro-fuzzy modeling to illustrate and to motivate the synergistic use of different tools.
APGEN Scheduling: 15 Years of Experience in Planning Automation
NASA Technical Reports Server (NTRS)
Maldague, Pierre F.; Wissler, Steve; Lenda, Matthew; Finnerty, Daniel
2014-01-01
In this paper, we discuss the scheduling capability of APGEN (Activity Plan Generator), a multi-mission planning application that is part of the NASA AMMOS (Advanced Multi- Mission Operations System), and how APGEN scheduling evolved over its applications to specific Space Missions. Our analysis identifies two major reasons for the successful application of APGEN scheduling to real problems: an expressive DSL (Domain-Specific Language) for formulating scheduling algorithms, and a well-defined process for enlisting the help of auxiliary modeling tools in providing high-fidelity, system-level simulations of the combined spacecraft and ground support system.
Management and development of local area network upgrade prototype
NASA Technical Reports Server (NTRS)
Fouser, T. J.
1981-01-01
Given management and development users who access a central computing facility, and given that these same users need local computation and storage, a commercially available networking system such as CP/NET from Digital Research provides the building blocks for connecting intelligent microsystems to file and print services. The major problems to be overcome in implementing such a network are the dearth of intelligent communication front-ends for the microcomputers and the lack of a rich set of management and software development tools.
Technology developments toward 30-year-life of photovoltaic modules
NASA Technical Reports Server (NTRS)
Ross, R. G., Jr.
1984-01-01
As part of the United States National Photovoltaics Program, the Jet Propulsion Laboratory's Flat-Plate Solar Array Project (FSA) has maintained a comprehensive reliability and engineering sciences activity addressed toward understanding the reliability attributes of terrestrial flat-plate photovoltaic arrays and to deriving analysis and design tools necessary to achieve module designs with a 30-year useful life. The considerable progress to date stemming from the ongoing reliability research is discussed, and the major areas requiring continued research are highlighted. The result is an overview of the total array reliability problem and of available means of achieving high reliability at minimum cost.
NASA Astrophysics Data System (ADS)
Authier-Martin, Monique
Dustiness of calcined alumina is a major concern, causing undesirable working conditions and serious alumina losses. These losses occur primarily during unloading and handling or pot loading and crust breaking. The handling side of the problem is first addressed. The Perra pulvimeter constitutes a simple and reproducible tool to quantify handling dustiness and yields results in agreement with plant experience. Attempts are made to correlate dustiness with bulk properties (particle size, attrition index, …) for a large number of diverse aluminas. The characterization of the dust generated with the Perra pulvimeter is most revealing. The effect of the addition of E.S.P. dust is also reported.
Experimental study on deep hole drilling of 17-4PH material
NASA Astrophysics Data System (ADS)
Uzhanfeng, LI; Uquantai, LI
2018-02-01
This paper takes 17-4PH material as its research object; based on the material characteristics of 17-4PH, a deep hole drilling test was designed and carried out. The purpose of the experiment is to study the three major problems in deep hole drilling of 17-4PH material: tool wear, chip shape, and axial deviation of the hole. Through the deep hole drilling test of 17-4PH material, the variation of chip shape and the deflection of the hole axis were obtained under different wear conditions.
Roux, C; Wyman, A; Hooven, F H; Gehlbach, S H; Adachi, J D; Chapurlat, R D; Compston, J E; Cooper, C; Díez-Pérez, A; Greenspan, S L; Lacroix, A Z; Netelenbos, J C; Pfeilschifter, J; Rossini, M; Saag, K G; Sambrook, P N; Silverman, S; Siris, E S; Watts, N B; Boonen, S
2012-12-01
Among 50,461 postmenopausal women, 1,822 fractures occurred (57% minor non-hip, non-vertebral [NHNV], 26% major NHNV, 10% spine, 7% hip) over 1 year. Spine fractures had the greatest detrimental effect on EQ-5D, followed by major NHNV and hip fractures. Decreases in physical function and health status were greatest for spine or hip fractures. There is growing evidence that NHNV fractures result in substantial morbidity and healthcare costs. The aim of this prospective study was to assess the effect of these NHNV fractures on quality of life. We analyzed the 1-year incidences of hip, spine, major NHNV (pelvis/leg, shoulder/arm) and minor NHNV (wrist/hand, ankle/foot, rib/clavicle) fractures among women from the Global Longitudinal study of Osteoporosis in Women (GLOW). Health-related quality of life (HRQL) was analyzed using the EuroQol EQ-5D tool and the SF-36 health survey. Among 50,461 women analyzed, there were 1,822 fractures (57% minor NHNV, 26% major NHNV, 10% spine, 7% hip) over 1 year. Spine fractures had the greatest detrimental effect on EQ-5D summary scores, followed by major NHNV and hip fractures. The number of women with mobility problems increased most for those with major NHNV and spine fractures (both +8%); spine fractures were associated with the largest increases in problems with self care (+11%), activities (+14%), and pain/discomfort (+12%). Decreases in physical function and health status were greatest for those with spine or hip fractures. Multivariable modeling found that EQ-5D reduction was greatest for spine fractures, followed by hip and major/minor NHNV. Statistically significant reductions in SF-36 physical function were found for spine fractures, and were borderline significant for major NHNV fractures. This prospective study shows that NHNV fractures have a detrimental effect on HRQL. Efforts to optimize the care of osteoporosis patients should include the prevention of NHNV fractures.
Students' Use of Technological Features while Solving a Mathematics Problem
ERIC Educational Resources Information Center
Lee, Hollylynne Stohl; Hollebrands, Karen F.
2006-01-01
The design of technology tools has the potential to dramatically influence how students interact with tools, and these interactions, in turn, may influence students' mathematical problem solving. To better understand these interactions, we analyzed eighth grade students' problem solving as they used a java applet designed to specifically accompany…
Use of an electronic problem list by primary care providers and specialists.
Wright, Adam; Feblowitz, Joshua; Maloney, Francine L; Henkin, Stanislav; Bates, David W
2012-08-01
Accurate patient problem lists are valuable tools for improving the quality of care, enabling clinical decision support, and facilitating research and quality measurement. However, problem lists are frequently inaccurate and out-of-date, and use varies widely across providers. Our goal was to assess provider use of an electronic problem list and identify differences in usage between medical specialties. Chart review of a random sample of 100,000 patients who had received care in the past two years at a Boston-based academic medical center. Counts were collected of all notes and problems added for each patient from 1/1/2002 to 4/30/2010. For each entry, the recording provider and the clinic in which the entry was recorded were collected. We used the Healthcare Provider Taxonomy Code Set to categorize each clinic by specialty. We analyzed problem list use across specialties, controlling for note volume as a proxy for visits. A total of 2,264,051 notes and 158,105 problems were recorded in the electronic medical record for this population during the study period. Primary care providers added 82.3% of all problems, despite writing only 40.4% of all notes. Of all patients, 49.1% had an assigned primary care provider (PCP) affiliated with the hospital; patients with a PCP had an average of 4.7 documented problems compared to 1.5 problems for patients without a PCP. Primary care providers were responsible for the majority of problem documentation; surgical and medical specialists and subspecialists recorded a disproportionately small number of problems on the problem list.
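The study's volume-controlled comparison can be illustrated with a toy recomputation: problems added per note written, by specialty. The counts below are invented (chosen only to be roughly consistent with the percentages reported above); only the method, normalizing problem counts by note volume, mirrors the analysis.

```python
# Illustrative sketch: normalize problem-list additions by note volume to
# compare documentation behavior across specialties. Numbers are made up.

entries = [
    # (specialty, notes_written, problems_added)
    ("primary_care", 915_000, 130_000),
    ("medical_specialty", 820_000, 18_000),
    ("surgery", 529_000, 10_000),
]

rates = {spec: probs / notes for spec, notes, probs in entries}
for spec, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{spec}: {rate:.3f} problems per note")
```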
Drenth-van Maanen, A Clara; Leendertse, Anne J; Jansen, Paul A F; Knol, Wilma; Keijsers, Carolina J P W; Meulendijk, Michiel C; van Marum, Rob J
2018-04-01
Inappropriate prescribing is a major health care issue, especially regarding older patients on polypharmacy. Multiple implicit and explicit prescribing tools have been developed to improve prescribing, but these have hardly ever been used in combination. The Systematic Tool to Reduce Inappropriate Prescribing (STRIP) combines implicit prescribing tools with the explicit Screening Tool to Alert physicians to the Right Treatment and Screening Tool of Older People's potentially inappropriate Prescriptions criteria and has shared decision-making with the patient as a critical step. This article describes the STRIP and its ability to identify potentially inappropriate prescribing. The STRIP improved general practitioners' and final-year medical students' medication review skills. The Web-application STRIP Assistant was developed to enable health care providers to use the STRIP in daily practice and will be incorporated in clinical decision support systems. It is currently being used in the European Optimizing thERapy to prevent Avoidable hospital admissions in the Multimorbid elderly (OPERAM) project, a multicentre randomized controlled trial involving patients aged 75 years and older using multiple medications for multiple medical conditions. In conclusion, the STRIP helps health care providers to systematically identify potentially inappropriate prescriptions and medication-related problems and to change the patient's medication regimen in accordance with the patient's needs and wishes. This article describes the STRIP and the available evidence so far. The OPERAM study is investigating the effect of STRIP use on clinical and economic outcomes. © 2017 John Wiley & Sons, Ltd.
Building Diagnostic Market Deployment - Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katipamula, S.; Gayeski, N.
2012-04-30
Operational faults are pervasive across the commercial buildings sector, wasting energy and increasing energy costs by up to about 30% (Mills 2009, Liu et al. 2003, Claridge et al. 2000, Katipamula and Brambley 2008, and Brambley and Katipamula 2009). Automated fault detection and diagnostic (AFDD) tools provide capabilities essential for detecting and correcting these problems and eliminating the associated energy waste and costs. The U.S. Department of Energy's (DOE) Building Technology Program (BTP) has previously invested in developing and testing such diagnostic tools for whole-building (and major system) energy use, air handlers, chillers, cooling towers, chilled-water distribution systems, and boilers. These diagnostic processes can be used to make commercial buildings more energy efficient. The work described in this report was done as part of a Cooperative Research and Development Agreement (CRADA) between the U.S. Department of Energy's Pacific Northwest National Laboratory (PNNL) and KGS Building LLC (KGS). PNNL and KGS both believe that the widespread adoption of AFDD tools will result in significant reductions in energy and peak energy consumption. The report provides an introduction and summary of the various tasks performed under the CRADA. The CRADA project had three major focus areas: (1) Technical Assistance for Whole Building Energy Diagnostician (WBE) Commercialization, (2) Market Transfer of the Outdoor Air/Economizer Diagnostician (OAE), and (3) Development and Deployment of Automated Diagnostics to Improve Large Commercial Building Operations. PNNL has previously developed two diagnostic tools: (1) the whole building energy (WBE) diagnostician and (2) the outdoor air/economizer (OAE) diagnostician. The WBE diagnostician is currently licensed non-exclusively to one company. As part of this CRADA, PNNL developed implementation documentation and provided technical support to KGS to implement the tool into their software suite, Clockworks.
PNNL also provided validation data sets and the WBE software tool to validate the KGS implementation. The OAE diagnostician automatically detects and diagnoses problems with outdoor air ventilation and economizer operation for air handling units (AHUs) in commercial buildings using data available from building automation systems (BASs). As part of this CRADA, PNNL developed implementation documentation and provided technical support to KGS to implement the tool into their software suite. PNNL also provided validation data sets and the OAE software tool to validate the KGS implementation. Finally, as part of this CRADA project, PNNL developed new processes to automate parts of the re-tuning process and transferred those processes to KGS for integration into their software product. The transfer of DOE-funded technologies will transform the commercial buildings sector by making buildings more energy efficient and reducing their carbon footprint. As part of the CRADA with PNNL, KGS implemented the whole building energy diagnostician, a portion of the outdoor air/economizer diagnostician, and a number of measures that automate the identification of re-tuning measures.
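An OAE-style ventilation check rests on a standard mixing-temperature energy balance across the AHU. The following is a minimal, hypothetical sketch of such a rule; the thresholds and function names are assumptions, not the PNNL tool's actual logic:

```python
def outdoor_air_fraction(t_mixed, t_return, t_outdoor):
    """Estimate the outdoor-air fraction of an AHU from three temperature
    sensors, using a simple mixing-energy balance."""
    if abs(t_outdoor - t_return) < 2.0:
        return None  # temperatures too close together; estimate is unreliable
    return (t_mixed - t_return) / (t_outdoor - t_return)

def check_economizer(t_mixed, t_return, t_outdoor, cooling_call, min_oaf=0.9):
    """Flag a possible economizer fault: when there is a call for cooling and
    outdoor air is cooler than return air, the damper should be nearly fully
    open, i.e. the outdoor-air fraction should be high."""
    oaf = outdoor_air_fraction(t_mixed, t_return, t_outdoor)
    if oaf is None or not cooling_call or t_outdoor >= t_return:
        return "no diagnosis"
    return "possible fault: economizer not fully open" if oaf < min_oaf else "ok"
```

For example, with return air at 72 F, outdoor air at 55 F, and mixed air at 60 F during a cooling call, the estimated outdoor-air fraction is about 0.71, which this toy rule would flag.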
ERIC Educational Resources Information Center
Kostousov, Sergei; Kudryavtsev, Dmitry
2017-01-01
Problem solving is a critical competency for the modern world and also an effective way of learning. Education should not only transfer domain-specific knowledge to students, but also prepare them to solve real-life problems--to apply knowledge from one or several domains within a specific situation. Problem solving as a teaching tool has been known for a long…
Major chest wall reconstruction after chest wall irradiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larson, D.L.; McMurtrey, M.J.; Howe, H.J.
1982-03-15
In the last year, 12 patients have undergone extensive chest wall resection. Eight patients had recurrent cancer after prior resection and irradiation, with an average defect of 160 square centimeters, usually including ribs and a portion of the sternum; four had radionecrosis of soft tissue and/or bone. Methods of reconstruction included latissimus dorsi musculocutaneous (MC) flap (five patients), pectoralis major MC flap (seven patients), and omental flap and skin graft (one patient). The donor site was usually closed primarily. All flaps survived, providing good wound coverage. The only complication was partial loss of a latissimus dorsi MC flap related to an infected wound; this reconstruction was salvaged with a pectoralis major MC flap. The hospital stay ranged from 10 to 25 days, with a median stay of 11 days. The MC flap is a valuable tool that can significantly decrease morbidity, hospital stay, and patient discomfort related to the difficult problem of chest wall reconstruction after radiation therapy.
Problem Solving in a Middle School Robotics Design Classroom
NASA Astrophysics Data System (ADS)
Norton, Stephen J.; McRobbie, Campbell J.; Ginns, Ian S.
2007-07-01
Little research has been conducted on how students work when they are required to plan, build and evaluate artefacts in technology-rich learning environments such as those supported by tools including flow charts, Labview programming and Lego construction. In this study, activity theory was used as an analytic tool to examine the social construction of meaning. There was a focus on the effect of teachers’ goals and the rules they enacted upon student use of the flow chart planning tool, and the tools of the programming language Labview and Lego construction. It was found that the articulation of a teacher’s goals via rules and divisions of labour helped to form distinct communities of learning and influenced the development of different problem solving strategies. The use of the flow-chart planning tool was associated with continuity of approach, integration of problem solutions including appreciation of the nexus between construction and programming, and greater educational transformation. Students who flow-charted defined problems in a more holistic way and demonstrated more methodical, insightful and integrated approaches to their use of tools. The findings have implications for teaching in design dominated learning environments.
Jane: a new tool for the cophylogeny reconstruction problem.
Conow, Chris; Fielder, Daniel; Ovadia, Yaniv; Libeskind-Hadas, Ran
2010-02-03
This paper describes the theory and implementation of a new software tool, called Jane, for the study of historical associations. This problem arises in parasitology (associations of hosts and parasites), molecular systematics (associations of organisms and genes), and biogeography (associations of regions and organisms). The underlying problem is that of reconciling pairs of trees subject to biologically plausible events and costs associated with these events. Existing software tools for this problem have strengths and limitations, and the new Jane tool described here provides functionality that complements existing tools. The Jane software tool uses a polynomial time dynamic programming algorithm in conjunction with a genetic algorithm to find very good, and often optimal, solutions even for relatively large pairs of trees. The tool allows the user to provide rich timing information on both the host and parasite trees. In addition the user can limit host switch distance and specify multiple host switch costs by specifying regions in the host tree and costs for host switches between pairs of regions. Jane also provides a graphical user interface that allows the user to interactively experiment with modifications to the solutions found by the program. Jane is shown to be a useful tool for cophylogenetic reconstruction. Its functionality complements existing tools and it is therefore likely to be of use to researchers in the areas of parasitology, molecular systematics, and biogeography.
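Jane's underlying task, reconciling a pair of trees subject to event costs, can be illustrated with the much simpler classic LCA reconciliation, which distinguishes only speciation from duplication events. This sketch is not Jane's algorithm (which also handles host switches, losses, timing constraints, and a genetic-algorithm search); it is a minimal example of mapping one tree onto another:

```python
def species_lca(a, b, parent):
    """Lowest common ancestor in the species/host tree
    (parent maps each child node to its parent)."""
    ancestors = {a}
    while a in parent:
        a = parent[a]
        ancestors.add(a)
    while b not in ancestors:
        b = parent[b]
    return b

def reconcile(gene_tree, parent):
    """Map a gene (or parasite) tree onto a species (or host) tree and count
    events in the classic LCA reconciliation model.
    gene_tree: nested 2-tuples with species names at the leaves."""
    events = {"speciation": 0, "duplication": 0}

    def walk(node):
        if isinstance(node, str):
            return node  # leaf: already a species name
        left, right = walk(node[0]), walk(node[1])
        m = species_lca(left, right, parent)
        # duplication: a gene node maps to the same species node as a child
        if m == left or m == right:
            events["duplication"] += 1
        else:
            events["speciation"] += 1
        return m

    walk(gene_tree)
    return events
```

A gene tree that is congruent with the species tree yields only speciations; a repeated sampling of the same lineage forces a duplication.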
Blackwell, Simon E.; Browning, Michael; Mathews, Andrew; Pictet, Arnaud; Welch, James; Davies, Jim; Watson, Peter; Geddes, John R.
2015-01-01
Depression is a global health problem requiring treatment innovation. Targeting neglected cognitive aspects may provide a useful route. We tested a cognitive-training paradigm using positive mental imagery (imagery cognitive bias modification, imagery CBM), developed via experimental psychopathology studies, in a randomized controlled trial. Training was delivered via the Internet to 150 individuals with current major depression. Unexpectedly, there was no significant advantage for imagery CBM compared with a closely matched control for depression symptoms as a whole in the full sample. In exploratory analyses, compared with the control, imagery CBM significantly improved anhedonia over the intervention and improved depression symptoms as a whole for those participants with fewer than five episodes of depression and those who engaged to a threshold level of imagery. Results suggest avenues for improving imagery CBM to inform low-intensity treatment tools for depression. Anhedonia may be a useful treatment target for future work. PMID:25984421
A tool for simulating parallel branch-and-bound methods
NASA Astrophysics Data System (ADS)
Golubeva, Yana; Orlov, Yury; Posypkin, Mikhail
2016-01-01
The Branch-and-Bound method is known as one of the most powerful but very resource-consuming global optimization methods. Parallel and distributed computing can efficiently cope with this issue. The major difficulty in the parallel B&B method is the need for dynamic load redistribution. Therefore the design and study of load balancing algorithms is a separate and very important research topic. This paper presents a tool for simulating the parallel Branch-and-Bound method. The simulator allows one to run load balancing algorithms with various numbers of processors, search-tree sizes, and supercomputer interconnect characteristics, thereby fostering deep study of load distribution strategies. The process of solving the optimization problem by the B&B method is replaced by a stochastic branching process. Data exchanges are modeled using the concept of logical time. A user-friendly graphical interface to the simulator provides efficient visualization and convenient performance analysis.
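The core idea, replacing real subproblem resolution with a stochastic branching process advanced in logical time, can be sketched as follows. This is a toy single-pool model with assumed names and parameters; the actual simulator additionally models interconnect characteristics and alternative load-balancing strategies:

```python
import random

def simulate_bnb(num_workers, branch_prob=0.7, max_depth=12, seed=0):
    """Toy simulation of parallel branch-and-bound: the search tree grows as a
    stochastic branching process, each worker expands one node per logical
    time step, and work is shared through a single global pool."""
    rng = random.Random(seed)
    pool = [0]  # one root subproblem, at depth 0
    steps = 0   # logical time
    while pool:
        steps += 1
        # each worker takes at most one node from the shared pool this step
        batch = [pool.pop() for _ in range(min(num_workers, len(pool)))]
        for depth in batch:
            if depth < max_depth and rng.random() < branch_prob:
                pool += [depth + 1, depth + 1]  # branch into two subproblems
    return steps
```

With `branch_prob=1.0` the tree is a full binary tree, so a single worker needs exactly one logical step per node; adding workers reduces the logical-time makespan.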
Space-based Remote Sensing: A Tool for Studying Bird Migration Across Multiple Scales
NASA Technical Reports Server (NTRS)
Smith, James A.
2005-01-01
The study of bird migration on a global scale is one of the compelling and challenging problems of modern biology, with major implications for human health and conservation biology. Migration and conservation efforts cross national boundaries and are subject to numerous international agreements and treaties. Space-based technology offers new opportunities to improve understanding of the distribution and migration of organisms on the planet and their sensitivity to human disturbances and environmental changes. Our working hypothesis is that individual-organism biophysical models of energy and water balance, driven by satellite measurements of spatio-temporal gradients in climate and habitat, will help us to explain the variability in avian species richness and distribution. Further, these models provide an ecological forecasting tool for science and application users to visualize the possible consequences of loss of wetlands, flooding, or other natural disasters such as hurricanes on avian biodiversity and bird migration.
PowderSim: Lagrangian Discrete and Mesh-Free Continuum Simulation Code for Cohesive Soils
NASA Technical Reports Server (NTRS)
Johnson, Scott; Walton, Otis; Settgast, Randolph
2013-01-01
PowderSim is a calculation tool that combines a discrete-element method (DEM) module, including calibrated interparticle-interaction relationships, with a mesh-free, continuum, SPH (smoothed-particle hydrodynamics) based module that utilizes enhanced, calibrated, constitutive models capable of mimicking both large deformations and the flow behavior of regolith simulants and lunar regolith under conditions anticipated during in situ resource utilization (ISRU) operations. The major innovation introduced in PowderSim is to use a mesh-free method (SPH-based) with a calibrated and slightly modified critical-state soil mechanics constitutive model to extend the ability of the simulation tool to also address full-scale engineering systems in the continuum sense. The PowderSim software maintains the ability to address particle-scale problems, like size segregation, in selected regions with a traditional DEM module, which has improved contact physics and electrostatic interaction models.
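A common starting point for the DEM side of such a code is a linear spring-dashpot normal contact force between overlapping particles. The sketch below is a generic textbook model with assumed stiffness and damping values, not PowderSim's calibrated interparticle-interaction relationships:

```python
import math

def contact_force(x1, x2, v1, v2, radius, k=1.0e4, c=5.0):
    """Normal contact force on sphere 1 from two equal spheres in a linear
    spring-dashpot (Kelvin-Voigt) model, a standard DEM building block.
    Positions and velocities are 2-D tuples; k is stiffness, c damping."""
    dx = (x2[0] - x1[0], x2[1] - x1[1])
    dist = math.hypot(dx[0], dx[1])
    overlap = 2 * radius - dist
    if overlap <= 0 or dist == 0:
        return (0.0, 0.0)  # spheres not in contact
    n = (dx[0] / dist, dx[1] / dist)      # unit normal from sphere 1 to 2
    rel_vn = (v2[0] - v1[0]) * n[0] + (v2[1] - v1[1]) * n[1]
    mag = k * overlap - c * rel_vn        # elastic repulsion + viscous damping
    # repulsive force on sphere 1 points away from sphere 2
    return (-mag * n[0], -mag * n[1])
```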
Cardiac magnetic resonance imaging and computed tomography in ischemic cardiomyopathy: an update*
Assunção, Fernanda Boldrini; de Oliveira, Diogo Costa Leandro; Souza, Vitor Frauches; Nacif, Marcelo Souto
2016-01-01
Ischemic cardiomyopathy is one of the major health problems worldwide, accounting for a significant share of mortality in the general population. Cardiac magnetic resonance imaging (CMRI) and cardiac computed tomography (CCT) are noninvasive imaging methods that serve as useful tools in the diagnosis of coronary artery disease and may also help in screening individuals with risk factors for developing this illness. Technological developments of CMRI and CCT have contributed to the rise of several clinical indications of these imaging methods complementarily to other investigation methods, particularly in cases where they are inconclusive. In terms of accuracy, CMRI and CCT are similar to the other imaging methods, with few absolute contraindications and minimal risks of adverse side-effects. This fact strengthens these methods as powerful and safe tools in the management of patients. The present study is aimed at describing the role played by CMRI and CCT in the diagnosis of ischemic cardiomyopathies. PMID:26929458
Eliminating the Neglected Tropical Diseases: Translational Science and New Technologies.
Hotez, Peter J; Pecoul, Bernard; Rijal, Suman; Boehme, Catharina; Aksoy, Serap; Malecela, Mwelecele; Tapia-Conyer, Roberto; Reeder, John C
2016-03-01
Today, the World Health Organization recognizes 17 major parasitic and related infections as the neglected tropical diseases (NTDs). Despite recent gains in the understanding of the nature and prevalence of NTDs, as well as successes in recent scaled-up preventive chemotherapy strategies and other health interventions, the NTDs continue to rank among the world's greatest global health problems. For virtually all of the NTDs (including those slated for elimination under the auspices of a 2012 London Declaration for NTDs and a 2013 World Health Assembly resolution [WHA 66.12]), additional control mechanisms and tools are needed, including new NTD drugs, vaccines, diagnostics, and vector control agents and strategies. Elimination will not be possible without these new tools. Here we summarize some of the key challenges in translational science to develop and introduce these new technologies in order to ensure success in global NTD elimination efforts.
Risk analysis and bovine tuberculosis, a re-emerging zoonosis.
Etter, Eric; Donado, Pilar; Jori, Ferran; Caron, Alexandre; Goutard, Flavie; Roger, François
2006-10-01
The spread of immunodeficiency associated with AIDS, together with the consequences of poverty for sanitary protection and health information at both the individual and state levels, has made control of tuberculosis (TB) one of the priorities of World Health Organization programs. The impact of bovine tuberculosis (BTB) on humans is poorly documented. However, BTB remains a major problem for livestock in developing countries, particularly in Africa, and wildlife is responsible for the failure of TB eradication programs. In Africa, the consumption of raw milk and raw meat, and the development of bushmeat consumption as a cheap source of protein, represent the principal routes of human contamination with BTB. The exploration of these different pathways using tools such as participatory epidemiology allows a risk analysis of the impact of BTB on human health in Africa. This analysis represents a management support and decision tool in the study and control of zoonotic BTB.
Design tool for multiprocessor scheduling and evaluation of iterative dataflow algorithms
NASA Technical Reports Server (NTRS)
Jones, Robert L., III
1995-01-01
A graph-theoretic design process and software tool is defined for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. Graph-search algorithms and analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool applies the design process to a given problem and includes performance optimization through the inclusion of additional precedence constraints among the schedulable tasks.
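Two of the classic performance bounds such graph analysis produces are the resource bound, ceil(total work / processors), and the critical-path bound, the longest weighted path through the precedence graph. A minimal sketch of computing both from a dataflow graph (task and edge names are illustrative, not the tool's interface):

```python
import math
from functools import lru_cache

def schedule_bounds(tasks, edges, num_procs):
    """Lower bound on the makespan of a dataflow (task) graph on identical
    processors: the larger of the resource bound and the critical-path bound.
    tasks: {name: execution_time}; edges: list of (pred, succ) pairs."""
    succs = {t: [] for t in tasks}
    for a, b in edges:
        succs[a].append(b)

    @lru_cache(maxsize=None)
    def longest_from(t):
        # longest weighted path starting at task t (graph must be acyclic)
        return tasks[t] + max((longest_from(s) for s in succs[t]), default=0)

    resource_bound = math.ceil(sum(tasks.values()) / num_procs)
    critical_path = max(longest_from(t) for t in tasks)
    return max(resource_bound, critical_path)
```

For a pure chain the critical path dominates regardless of processor count; for independent tasks the resource bound dominates.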
Garg, Ankur; Singh, Mongjam Meghachandra; Gupta, Vimal Kishore; Garg, Suneela; Daga, Mradul Kumar; Saha, Renuka
2012-10-01
Objective: to assess the prevalence and correlates of current smoking, awareness of its hazards, and quitting behavior among smokers aged 30 years and above. Study design: cross-sectional. Setting: Gokulpuri, a resettlement colony in East Delhi, India. Participants: 911 persons aged 30 years and above, selected using systematic random sampling. Study tools: semi-structured questionnaire. The prevalence of current smoking was found to be 24.6% (95% CI 21.90-27.49). A majority, 198 (88.4%), of current smokers smoked bidi exclusively, and on average 13.5 bidis/cigarettes were smoked per day. Multivariate analysis showed the factors associated with current smoking to be male sex, advancing age, illiteracy, skilled occupation, low socio-economic status, and low BMI (P < 0.001). 64.2% were aware of the hazards of smoking. 63 (21.9%) had quit smoking in the past, most often due to health problems. Low educational status was associated with poor hazard awareness and quitting behavior. Smoking is a significant problem among poor and illiterate males; it shows an increasing trend with advancing age and is directly associated with skilled occupation and low BMI. There are significant gaps in knowledge regarding the hazards of smoking.
Active Subspace Methods for Data-Intensive Inverse Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qiqi
2017-04-27
The project has developed theory and computational tools to exploit active subspaces to reduce the dimension in statistical calibration problems. This dimension reduction enables MCMC methods to calibrate otherwise intractable models. The same theoretical and computational tools can also reduce the measurement dimension for calibration problems that use large stores of data.
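The active-subspace construction itself is compact: one estimates the matrix C = E[∇f ∇fᵀ] from gradient samples and keeps the eigenvectors with dominant eigenvalues. A minimal numpy sketch under an assumed test function (this is an illustration of the standard construction, not the project's code):

```python
import numpy as np

def active_subspace(grad_samples, k):
    """Estimate a k-dimensional active subspace from gradient samples:
    dominant eigenvectors of C = E[grad f grad f^T]."""
    G = np.asarray(grad_samples)          # shape (n_samples, dim)
    C = G.T @ G / G.shape[0]              # Monte Carlo estimate of C
    eigvals, eigvecs = np.linalg.eigh(C)  # eigh returns ascending eigenvalues
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order[:k]], eigvals[order]

# f(x) = (a . x)^2 varies only along a, so its active subspace is span(a)
# and the inputs can be calibrated in one dimension instead of three.
rng = np.random.default_rng(0)
a = np.array([3.0, 4.0, 0.0]) / 5.0
X = rng.standard_normal((200, 3))
grads = (2 * (X @ a))[:, None] * a        # gradient of f at each sample
W, lam = active_subspace(grads, 1)
```

The recovered direction `W[:, 0]` aligns with `a` up to sign, and the remaining eigenvalues are numerically zero, which is the signal that one calibration dimension suffices.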
ERIC Educational Resources Information Center
Mills, Kathy A.; Chandra, Vinesh; Park, Ji Yong
2013-01-01
This paper demonstrates, following Vygotsky, that language and tool use has a critical role in the collaborative problem-solving behaviour of school-age children. It reports original ethnographic classroom research examining the convergence of speech and practical activity in children's collaborative problem solving with robotics programming…
Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations
NASA Astrophysics Data System (ADS)
Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.
2017-09-01
Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set, mimicking both background quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.
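The core of synthetic absorption-spectrum generation, summing per-line optical depths along a sightline and exponentiating, can be sketched in a few lines. This toy Gaussian-optical-depth model omits Voigt profiles, instrument response, redshifting, and everything else trident actually handles; line parameters below are illustrative:

```python
import numpy as np

def absorption_spectrum(wavelengths, lines, continuum=1.0):
    """Toy synthetic absorption spectrum: each line contributes a Gaussian
    optical depth tau, and the emergent flux is continuum * exp(-total tau).
    lines: list of (center, peak_tau, width) tuples in wavelength units."""
    tau = np.zeros_like(wavelengths, dtype=float)
    for center, peak_tau, width in lines:
        tau += peak_tau * np.exp(-0.5 * ((wavelengths - center) / width) ** 2)
    return continuum * np.exp(-tau)

wl = np.linspace(1200.0, 1250.0, 500)                  # wavelength grid (Angstrom)
flux = absorption_spectrum(wl, [(1215.67, 2.0, 0.5)])  # one Lyman-alpha-like line
```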
Optimization in Cardiovascular Modeling
NASA Astrophysics Data System (ADS)
Marsden, Alison L.
2014-01-01
Fluid mechanics plays a key role in the development, progression, and treatment of cardiovascular disease. Advances in imaging methods and patient-specific modeling now reveal increasingly detailed information about blood flow patterns in health and disease. Building on these tools, there is now an opportunity to couple blood flow simulation with optimization algorithms to improve the design of surgeries and devices, incorporating more information about the flow physics in the design process to augment current medical knowledge. In doing so, a major challenge is the need for efficient optimization tools that are appropriate for unsteady fluid mechanics problems, particularly for the optimization of complex patient-specific models in the presence of uncertainty. This article reviews the state of the art in optimization tools for virtual surgery, device design, and model parameter identification in cardiovascular flow and mechanobiology applications. In particular, it reviews trade-offs between traditional gradient-based methods and derivative-free approaches, as well as the need to incorporate uncertainties. Key future challenges are outlined, which extend to the incorporation of biological response and the customization of surgeries and devices for individual patients.
Saudi English-Major Undergraduates' Academic Writing Problems: A Taif University Perspective
ERIC Educational Resources Information Center
Al-Khairy, Mohamed Ali
2013-01-01
This study attempted to investigate Saudi English-major undergraduates studying at Taif University to identify a) the types of academic writing Saudi English-major undergraduates carry out at English departments, b) Saudi English-major undergraduates' writing problems, c) the reasons behind Saudi English-major undergraduates' writing problems and…
Leveraging Modeling Approaches: Reaction Networks and Rules
Blinov, Michael L.; Moraru, Ion I.
2012-01-01
We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high resolution and/or high throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatio-temporal distribution of molecules and complexes, their interactions kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks – the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks. PMID:22161349
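The combinatorial explosion that motivates rule-based specification is easy to demonstrate: a protein with n independent modification sites has 2^n distinct species, while a rule-based description needs only a pair of rules per site. A small illustrative sketch (plain Python, not any particular rule engine's syntax):

```python
from itertools import product

def explicit_species(n_sites):
    """Explicit-species modeling must enumerate every modification state of a
    protein with n independent sites: 2**n distinct species, each represented
    here as the set of occupied sites."""
    return [frozenset(i for i, on in enumerate(bits) if on)
            for bits in product([0, 1], repeat=n_sites)]

def rules(n_sites):
    """A rule-based specification needs only one phosphorylation and one
    dephosphorylation rule per site, regardless of the other sites' states:
    2*n rules in total."""
    return [f"site{i} {verb}" for i in range(n_sites)
            for verb in ("phosphorylated", "dephosphorylated")]

assert len(explicit_species(10)) == 1024  # species count grows exponentially...
assert len(rules(10)) == 20               # ...while the rule set stays linear
```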
Final report of the Conference on the eradicability of Onchocerciasis.
Dadzie, Yankum; Neira, Maria; Hopkins, Donald
2003-02-07
Sixty-four experts from a variety of disciplines attended a Conference on the Eradicability of Onchocerciasis at The Carter Center in Atlanta, GA, held January 22-24, 2002. The Conference, which was organized by The Carter Center and the World Health Organization, with funding from the Bill & Melinda Gates Foundation, addressed the question: "Is onchocerciasis (River Blindness) eradicable with current knowledge and tools?" Former US President Jimmy Carter attended part of the final plenary proceedings on January 24. The Conference consisted of a series of presentations by invited expert speakers (Appendix C) and further deliberations in four workgroups (Appendix D) followed by plenary discussion of major conclusions. The presentations underlined epidemiological and entomological differences between onchocerciasis in Africa and the Americas. Whilst onchocerciasis in Africa covers extensive areas and is associated with striking human and fly population migrations and remarkably efficient black fly vectors, in the Americas onchocerciasis is found in limited foci. Human and fly population migration are not major problems in the Americas, where most black fly species are inefficient, though some efficient black flies are also found there. Vector control has been effectively applied in the Onchocerciasis Control Program in West Africa (OCP) with remarkable results, interrupting transmission in most parts of the original Program area. The use of ivermectin has given variable results: while ivermectin treatment has been effective in all endemic areas in controlling onchocerciasis as a public health problem, its potential for interrupting transmission is more promising in hypo- and mesoendemic areas.
The African Program for Onchocerciasis Control (APOC), which supports onchocerciasis control in endemic African countries outside the OCP, applies ivermectin, its principal control tool, to communities in high-risk areas as determined by rapid epidemiological mapping of onchocerciasis (REMO) and Geographic Information Systems (GIS). In the Americas, through support of the Onchocerciasis Elimination Program in the Americas (OEPA), a strategy of bi-annual ivermectin treatment of at least 85% of the eligible populations in all endemic communities is showing very good results and promises to be effective in eliminating onchocerciasis in the region. The Conference concluded that onchocerciasis is not eradicable using current tools due to the major barriers to eradication in Africa. However, the Conference also concluded that in most if not all the Americas, and possibly Yemen and some sites in Africa, transmission of onchocerciasis can be eliminated using current tools. The Conference recommended that where interruption of transmission is feasible and cost effective, programs should aim for that goal using all appropriate and available interventions so that Onchocerca volvulus can eventually be eliminated and interventions halted. Although interruption of transmission of onchocerciasis cannot currently be achieved in most of Africa, the Conference recommended that efforts be made to preserve areas in West Africa made free of onchocerciasis transmission through the Onchocerciasis Control Program over the past 25 years. In the remaining hyper- and mesoendemic foci in Africa, continued annual distribution of ivermectin will keep onchocerciasis controlled to a point where it is no longer a public health problem or constraint to economic development.
Software Performs Complex Design Analysis
NASA Technical Reports Server (NTRS)
2008-01-01
Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help understand the structural response of components to loads, stresses, and strains, and to predict failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: (1) inadequate shape parameterization algorithms, and (2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to apply its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing CFD designers to freely create their own shape parameters, eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation, which eliminates the extremely costly process of remeshing the grid for every desired shape change. The program can perform a design change in a markedly reduced amount of time compared with the traditional process, in which the designer returns to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending on the size of the model.
Arbitrary Shape Deformation in CFD Design
NASA Technical Reports Server (NTRS)
Landon, Mark; Perry, Ernest
2014-01-01
Sculptor(R) is a commercially available software tool, based on Arbitrary Shape Deformation (ASD), that allows the user to perform shape optimization for computational fluid dynamics (CFD) design. The tool provides important advances in the state of the art of automatic CFD shape deformation and optimization software. CFD is an analysis tool used by engineering designers to gain a greater understanding of the fluid flow phenomena involved in the components being designed. The next step in the engineering design process is to modify the design to improve the components' performance, a step that has traditionally been performed manually via trial and error. Two major problems have, in the past, hindered the development of automated CFD shape optimization: (1) inadequate shape parameterization algorithms, and (2) inadequate algorithms for CFD grid modification. The ASD developed as part of the Sculptor(R) software tool is a major advancement in solving these two issues. First, the ASD allows the CFD designer to freely create their own shape parameters, thereby eliminating the restriction of only being able to use the CAD model parameters. Then, the software performs a smooth volumetric deformation, which eliminates the extremely costly process of remeshing the grid for every shape change (which is how this process had previously been achieved). Sculptor(R) can be used to optimize shapes for aerodynamic and structural design of spacecraft, aircraft, watercraft, ducts, and other objects that affect and are affected by flows of fluids and heat. Sculptor(R) makes it possible to perform, in real time, a design change that would manually take hours or days if remeshing were needed.
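The key idea in both abstracts above, a smooth volumetric deformation that moves mesh points while leaving connectivity untouched (so no remeshing is needed), can be illustrated with a minimal sketch. This is not Sculptor's actual algorithm; the Gaussian weighting, the `deform` function, and all parameters are illustrative assumptions.

```python
import numpy as np

def deform(points, control, displacement, radius=1.0):
    """Move mesh points smoothly according to one control point's displacement.

    Each point follows the displacement with a Gaussian weight that decays
    with distance from the control point, so nearby points move almost
    rigidly and distant points stay put.  Mesh connectivity is untouched,
    which is why no remeshing is required.
    """
    d2 = np.sum((points - control) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * radius ** 2))          # weights in (0, 1]
    return points + w[:, None] * displacement

# A tiny 2D "mesh": three points near the origin plus one far away.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
moved = deform(pts, control=np.array([0.0, 0.0]),
               displacement=np.array([0.5, 0.0]))
# The point at the control location shifts by the full 0.5;
# the far point is essentially unmoved.
```

Because the deformation is a smooth function of position, grid cells deform gradually rather than tearing, which is what lets an optimizer explore shape changes without regenerating the mesh.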
ERIC Educational Resources Information Center
Guerra, Norma S.
2009-01-01
Graphic organizers are powerful visual tools. The representation provides dimension and relationship to ideas and a framework for organization and elaboration. The LIBRE Stick Figure Tool is a graphic organizer for the problem-solving application of the LIBRE Model counseling approach. It resembles a "stick person" and offers the teacher and…
Ontology-Driven Search and Triage: Design of a Web-Based Visual Interface for MEDLINE.
Demelo, Jonathan; Parsons, Paul; Sedig, Kamran
2017-02-02
Diverse users need to search health and medical literature to satisfy open-ended goals such as making evidence-based decisions and updating their knowledge. However, doing so is challenging due to at least two major difficulties: (1) articulating information needs using accurate vocabulary and (2) dealing with large document sets returned from searches. Common search interfaces such as PubMed do not provide adequate support for exploratory search tasks. Our objective was to improve support for exploratory search tasks by combining two strategies in the design of an interactive visual interface by (1) using a formal ontology to help users build domain-specific knowledge and vocabulary and (2) providing multi-stage triaging support to help mitigate the information overload problem. We developed a Web-based tool, Ontology-Driven Visual Search and Triage Interface for MEDLINE (OVERT-MED), to test our design ideas. We implemented a custom searchable index of MEDLINE, which comprises approximately 25 million document citations. We chose a popular biomedical ontology, the Human Phenotype Ontology (HPO), to test our solution to the vocabulary problem. We implemented multistage triaging support in OVERT-MED, with the aid of interactive visualization techniques, to help users deal with large document sets returned from searches. Formative evaluation suggests that the design features in OVERT-MED are helpful in addressing the two major difficulties described above. Using a formal ontology seems to help users articulate their information needs with more accurate vocabulary. In addition, multistage triaging combined with interactive visualizations shows promise in mitigating the information overload problem. Our strategies appear to be valuable in addressing the two major problems in exploratory search. Although we tested OVERT-MED with a particular ontology and document collection, we anticipate that our strategies can be transferred successfully to other contexts. 
©Jonathan Demelo, Paul Parsons, Kamran Sedig. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 02.02.2017.
PMID:28153818
Molyneux, David H
2008-06-01
This paper suggests that the 'other diseases' of Millennium Development Goal 6 (MDG 6) are ignored by policy-makers and politicians, who focus excessively on unachievable objectives and targets around the 'big three' diseases of HIV, tuberculosis (TB) and malaria, as though these were the only diseases that existed on the planet. The diseases of the majority of the poor represent 'low-hanging fruit' for control and elimination, yet opportunities are ignored despite the availability of cheap or donated drugs and ample evidence that such interventions are effective and reduce incidence as well as mortality and morbidity. The time frame of some 7-8 years available to achieve the MDGs requires a re-evaluation of what can be done with the tools available now to address the problems faced by the majority of poor people afflicted by disabling conditions, which together represent a global burden greater than malaria or TB. The author also considers that the volume of research relevant to the MDGs and their achievement is distorted by the focus on high-tech research that cannot be delivered by 2015, and that in terms of the 90:10 gap in research relevant to the problems of the poorest, the real gap is 99:1. Donor funding for the diseases of MDG 6 is likewise distorted towards largely curative interventions that do not reduce incidence, and towards research addressing problems that cannot reach poor people in the time frame to 2015. New paradigms are required if any impact on MDG 6 is to be achieved, recognising the needs of the majority via an equitable distribution of funding.
Assessment of injury severity in patients with major trauma.
Stanford, Penelope; Booth, Nicola; Suckley, Janet; Twelvetree, Timothy; Thomas, Debbie
2016-08-03
Major trauma centres provide specialised care for patients who have experienced serious traumatic injury. This article provides information about major trauma centres and outlines the assessment tools used in this setting. Since patients in major trauma centres will be transferred to other settings, including inpatient wards and primary care, the article is relevant both to nurses working in major trauma centres and to those working in these other areas. Traumatic injuries require rapid assessment to ensure the patient receives prompt, adequate and appropriate treatment. A range of assessment tools is available to assist nurses in major trauma centres and emergency care in assessing the severity of a patient's injury. The most commonly used tools are triage, the Catastrophic Haemorrhage, Airway to Exposure assessment, pain assessment and the Glasgow Coma Scale. This article summarises the use of these assessment tools in these settings and discusses the use of the Injury Severity Score (ISS) to determine the severity of patient injuries.
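The ISS mentioned above has a simple, well-established definition: the body is divided into six regions, each scored on the Abbreviated Injury Scale (AIS, 0-6), and the ISS is the sum of the squares of the three highest regional scores, set to the maximum of 75 when any injury is unsurvivable (AIS 6). A minimal sketch, with illustrative region names and values:

```python
def injury_severity_score(region_ais):
    """Injury Severity Score from per-region AIS values.

    `region_ais` maps each of the six ISS body regions to its highest
    Abbreviated Injury Scale score (0-6).  The ISS is the sum of the
    squares of the three highest regional scores; by convention any
    unsurvivable injury (AIS 6) sets the ISS to its maximum of 75.
    """
    scores = sorted(region_ais.values(), reverse=True)
    if scores and scores[0] == 6:
        return 75
    return sum(s ** 2 for s in scores[:3])

# Illustrative patient: head AIS 4, chest AIS 3, extremities AIS 2.
iss = injury_severity_score({"head": 4, "face": 0, "chest": 3,
                             "abdomen": 0, "extremities": 2, "external": 0})
# 4**2 + 3**2 + 2**2 = 29
```

A score of 16 or more is commonly used to define major trauma, which is one reason the ISS is central in major trauma centres.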
Insightful problem solving and creative tool modification by captive nontool-using rooks.
Bird, Christopher D; Emery, Nathan J
2009-06-23
The ability to use tools has been suggested to indicate advanced physical cognition in animals. Here we show that rooks, members of the corvid family that do not appear to use tools in the wild, are capable of insightful problem solving related to sophisticated tool use, including spontaneously modifying and using a variety of tools, shaping hooks out of wire, and using a series of tools in a sequence to gain a reward. It is remarkable that a species that does not use tools in the wild appears to possess an understanding of tools rivalling that of habitual tool users such as New Caledonian crows and chimpanzees. Our findings suggest that the ability to represent tools may be a domain-general cognitive capacity rather than an adaptive specialization, and they call into question the relationship between physical intelligence and wild tool use.
Training generalized improvisation of tools by preschool children
Parsonson, Barry S.; Baer, Donald M.
1978-01-01
The development of new, “creative” behaviors was examined in a problem-solving context. One form of problem solving, improvisation, was defined as finding a substitute to replace the specifically designated, but currently unavailable, tool ordinarily used to solve the problem. The study examined whether preschool children spontaneously displayed generalized improvisation skills, and if not, whether they could be trained to do so within different classes of tools. Generalization across different tool classes was monitored but not specifically trained. Five preschool children participated in individual sessions that first probed their skill at improvising tools, and later trained and probed generalized improvisation in one or more of three tool classes (Hammers, Containers, and Shoelaces), using a multiple-baseline design. All five children were trained with Hammers, two were trained in two classes, and two were trained in all three tool classes. Four of the five children improvised little in Baseline. During Training, all five showed increased generalized improvisation within the trained class, but none across classes. Tools fabricated by item combinations were rare in Baseline, but common in Training. Followup probes showed that the training effects were durable. PMID:16795596
Computer-aided programming for message-passing system; Problems and a solution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, M.Y.; Gajski, D.D.
1989-12-01
As the number of processors and the complexity of problems to be solved increase, programming multiprocessing systems becomes more difficult and error-prone. Program development tools are necessary since programmers are not able to develop complex parallel programs efficiently. Parallel models of computation, parallelization problems, and tools for computer-aided programming (CAP) are discussed. As an example, a CAP tool that performs scheduling and inserts communication primitives automatically is described. It also generates the performance estimates and other program quality measures to help programmers in improving their algorithms and programs.
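The scheduling half of such a CAP tool can be hinted at with a hedged sketch. The paper's tool is not described in detail here, so this shows only generic greedy list scheduling (assign each task to the processor that frees up first); the task list, costs, and function name are illustrative assumptions, and a real tool would also handle task dependencies and insert communication primitives.

```python
def list_schedule(tasks, num_procs):
    """Greedy list scheduling: walk the task list in order and assign
    each (name, cost) task to the processor that finishes earliest."""
    finish = [0.0] * num_procs          # current finish time per processor
    assignment = {}
    for name, cost in tasks:
        p = min(range(num_procs), key=lambda i: finish[i])
        assignment[name] = p
        finish[p] += cost
    return assignment, max(finish)

# Five independent tasks with known costs, scheduled on two processors.
tasks = [("a", 4), ("b", 3), ("c", 2), ("d", 2), ("e", 1)]
assignment, makespan = list_schedule(tasks, num_procs=2)
# Total work is 12, so a makespan of 6 on two processors is optimal here.
```

Automating even this simple step relieves the programmer of error-prone manual processor assignment, which is the motivation the abstract describes.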
1986-10-31
(Appendix residue: reference card given to participants, listing Cognoter mouse-button and menu bindings.) …collaborative tools and their uses, the Colab system and the Cognoter presentation tool were implemented and used for both real and posed idea-organization tasks. To test the system design and its effect on structured problem-solving, many early Colab/Cognoter meetings were monitored and a series of…
Learning as a Problem Solving Tool. Technical Report CS74018-R.
ERIC Educational Resources Information Center
Claybrook, Billy G.
This paper explores the use of learning as a practical tool in problem solving. The idea that learning should and eventually will be a vital component of most Artificial Intelligence programs is pursued. Current techniques in learning systems are compared. A detailed discussion of the problems of representing, modifying, and creating heuristics is…
Drawing Dynamic Geometry Figures Online with Natural Language for Junior High School Geometry
ERIC Educational Resources Information Center
Wong, Wing-Kwong; Yin, Sheng-Kai; Yang, Chang-Zhe
2012-01-01
This paper presents a tool for drawing dynamic geometric figures by understanding the texts of geometry problems. With the tool, teachers and students can construct dynamic geometric figures on a web page by inputting a geometry problem in natural language. First we need to build the knowledge base for understanding geometry problems. With the…
Using the Wonder of Inequalities between Averages for Mathematics Problems Solving
ERIC Educational Resources Information Center
Shaanan, Rachel Mogilevsky; Gordon, Moshe Stupel
2016-01-01
The study presents an introductory idea of using mathematical averages as a tool for enriching mathematical problem solving. Throughout students' activities, research was conducted on their ability to solve mathematical problems and to cope with a variety of mathematical tasks, in a variety of ways, using the skills, tools and experiences…
Characteristics of a Cognitive Tool That Helps Students Learn Diagnostic Problem Solving
ERIC Educational Resources Information Center
Danielson, Jared A.; Mills, Eric M.; Vermeer, Pamela J.; Preast, Vanessa A.; Young, Karen M.; Christopher, Mary M.; George, Jeanne W.; Wood, R. Darren; Bender, Holly S.
2007-01-01
Three related studies replicated and extended previous work (J.A. Danielson et al. (2003), "Educational Technology Research and Development," 51(3), 63-81) involving the Diagnostic Pathfinder (dP) (previously Problem List Generator [PLG]), a cognitive tool for learning diagnostic problem solving. In studies 1 and 2, groups of 126 and 113…
The development of tool manufacture in humans: what helps young children make innovative tools?
Chappell, Jackie; Cutting, Nicola; Apperly, Ian A.; Beck, Sarah R.
2013-01-01
We know that even young children are proficient tool users, but until recently, little was known about how they make tools. Here, we will explore the concepts underlying tool making, and the kinds of information and putative cognitive abilities required for children to manufacture novel tools. We will review the evidence for novel tool manufacture from the comparative literature and present a growing body of data from children suggesting that innovation of the solution to a problem by making a tool is a much more challenging task than previously thought. Children's difficulty with these kinds of tasks does not seem to be explained by perseveration with unmodified tools, difficulty with switching to alternative strategies, task pragmatics or issues with permission. Rather, making novel tools (without having seen an example of the required tool within the context of the task) appears to be hard, because it is an example of an ‘ill-structured problem’. In this type of ill-structured problem, the starting conditions and end goal are known, but the transformations and/or actions required to get from one to the other are not specified. We will discuss the implications of these findings for understanding the development of problem-solving in humans and other animals. PMID:24101620
Comparing the effectiveness of TWEAK and T-ACE in determining problem drinkers in pregnancy.
Sarkar, M; Einarson, T; Koren, G
2010-01-01
The TWEAK and T-ACE screening tools are validated methods of identifying problem drinking in a pregnant population. The objective of this study was to compare the effectiveness of the TWEAK and T-ACE screening tools in identifying problem drinking using traditional cut-points (CPs). Study participants were women calling the Motherisk Alcohol Helpline for information regarding their alcohol use in pregnancy. In this cohort, concerns about underreporting are unlikely, as the women self-report their alcohol consumption. Each participant's self-identification, confirmed by her amount of alcohol use, determined whether she was a problem drinker. The TWEAK and T-ACE tools were administered to both groups, and subsequent analysis determined whether one tool was more effective in predicting problem drinking. The study comprised 75 problem and 100 non-problem drinkers. Using traditional CPs, the TWEAK and T-ACE performed similarly at identifying potentially at-risk women (positive predictive value = 0.54), with very high sensitivity (100-99% and 100-93%, respectively) but poor specificity (36-43% and 19-34%, respectively). There was no statistically significant difference between the two tests using either a CP of 2 (P = 0.66) or a CP of 3 (P = 0.38). Despite the lack of difference in performance, the better specificity of TWEAK suggests that it may be better suited to screening at-risk populations seeking advice from a helpline.
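The reported figures follow directly from the standard 2x2 screening-table definitions. A small sketch, with cell counts chosen to reproduce the reported TWEAK values at a cut-point of 2 (sensitivity 100%, specificity 36%, PPV 0.54) rather than taken from the paper's tables:

```python
def screen_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and positive predictive value (PPV)
    from the four cells of a 2x2 screening table."""
    return {
        "sensitivity": tp / (tp + fn),  # problem drinkers correctly flagged
        "specificity": tn / (tn + fp),  # non-problem drinkers correctly cleared
        "ppv": tp / (tp + fp),          # flagged women who truly have a problem
    }

# 75 problem and 100 non-problem drinkers, as in the study cohort;
# counts here are back-calculated for illustration, not the paper's data.
m = screen_metrics(tp=75, fn=0, tn=36, fp=64)
# m["sensitivity"] == 1.0, m["specificity"] == 0.36, m["ppv"] ~= 0.54
```

The sketch makes the trade-off concrete: catching every problem drinker (no false negatives) comes at the cost of flagging most non-problem drinkers as well.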
Sharing clinical information across care settings: the birth of an integrated assessment system
Gray, Leonard C; Berg, Katherine; Fries, Brant E; Henrard, Jean-Claude; Hirdes, John P; Steel, Knight; Morris, John N
2009-01-01
Background Population ageing, the emergence of chronic illness, and the shift away from institutional care challenge conventional approaches to assessment systems which traditionally are problem and setting specific. Methods From 2002, the interRAI research collaborative undertook development of a suite of assessment tools to support assessment and care planning of persons with chronic illness, frailty, disability, or mental health problems across care settings. The suite constitutes an early example of a "third generation" assessment system. Results The rationale and development strategy for the suite is described, together with a description of potential applications. To date, ten instruments comprise the suite, each comprising "core" items shared among the majority of instruments and "optional" items that are specific to particular care settings or situations. Conclusion This comprehensive suite offers the opportunity for integrated multi-domain assessment, enabling electronic clinical records, data transfer, ease of interpretation and streamlined training. PMID:19402891
Medical sociology as a vocation.
Bosk, Charles L
2014-12-01
This article extends Weber's discussion of science as a vocation by applying it to medical sociology. Having used qualitative methods for nearly 40 years to interpret problems of meaning as they arise in the context of health care, I describe how ethnography, in particular, and qualitative inquiry, more generally, may be used as a tool for understanding fundamental questions close to the heart but far from the mind of medical sociology. Such questions overlap with major policy questions such as how do we achieve a higher standard for quality of care and assure the safety of patients. Using my own research, I show how this engagement takes the form of showing how simple narratives of policy change fail to address the complexities of the problems that they are designed to remedy. I also attempt to explain how I balance objectivity with a commitment to creating a more equitable framework for health care. © American Sociological Association 2014.
Helene, L M; Rocha, M T
1998-10-01
The purpose of this study was to identify leprosy patients' psychosocial problems experienced after they were informed of their diagnosis. We focused on concerns and behavioral changes related to their families, friends, jobs and themselves. Data were obtained through interviews consisting of two open-ended questions and were analysed with the aid of artificial-intelligence techniques. These tools were used to discover the most frequent words, phrases and concepts in the interview reports. The results showed that, after being informed of their diagnosis, the majority of patients reported concerns and behavioral changes related to their families, friends, jobs and themselves. The main concerns of the population related to the disease itself (transmission, the duration of treatment, the possibility of hospitalization, uncertainty about cure). These facts induced some patients to avoid telling people about their disease.
Statistical analysis of effective singular values in matrix rank determination
NASA Technical Reports Server (NTRS)
Konstantinides, Konstantinos; Yao, Kung
1988-01-01
A major problem in using SVD (singular-value decomposition) as a tool for determining the effective rank of a perturbed matrix is distinguishing between significantly small and significantly large singular values. To this end, confidence regions are derived for the perturbed singular values of matrices with noisy observation data. The analysis is based on the theory of perturbations of singular values and on tests of statistical significance. Threshold bounds for perturbations due to finite precision and to i.i.d. random models are evaluated. In random models, the threshold bounds depend on the dimension of the matrix, the noise variance, and a predefined statistical level of significance. The results are applied to the problem of determining the effective order of a linear autoregressive system from the approximate rank of a sample autocorrelation matrix. Various numerical examples illustrate the usefulness of these bounds, with comparisons to previously known approaches.
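The thresholding idea can be sketched as follows. The simple noise-based threshold used here is an illustrative stand-in for the paper's statistically derived confidence bounds; the scaling constant and test matrix are assumptions.

```python
import numpy as np

def effective_rank(A, noise_std, factor=3.0):
    """Estimate effective rank by thresholding singular values.

    Singular values below `noise_std * factor * sqrt(max(m, n))` are
    treated as perturbations of zero.  This crude rule of thumb stands
    in for statistically derived confidence bounds on the singular
    values of a noisy matrix.
    """
    s = np.linalg.svd(A, compute_uv=False)
    tau = noise_std * factor * np.sqrt(max(A.shape))
    return int(np.sum(s > tau))

rng = np.random.default_rng(0)
u = rng.normal(size=(50, 2))                    # rank-2 signal ...
v = rng.normal(size=(2, 50))
A = u @ v + 0.01 * rng.normal(size=(50, 50))    # ... plus i.i.d. noise
r = effective_rank(A, noise_std=0.01)           # recovers the signal rank
```

Exactly this kind of decision, i.e. which singular values of a sample autocorrelation matrix are "really" zero, underlies estimating the order of an autoregressive model.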
NASA Technical Reports Server (NTRS)
Ridd, M. K.; Merola, J. A.; Jaynes, R. A.
1983-01-01
Conversion of agricultural land to a variety of urban uses is a major problem along the Wasatch Front, Utah. Although LANDSAT MSS data is a relatively coarse tool for discriminating categories of change in urban-size plots, its availability prompts a thorough test of its power to detect change. The procedures being applied to a test area in Salt Lake County, Utah, where the land-conversion problem is acute, are presented. The identity of land uses before and after conversion was determined, and digital procedures for doing so were compared. Several algorithms were compared, utilizing both raw and preprocessed data. Verification of results involved high-quality color-infrared photography and field observation. Two data sets were digitally registered, specific change categories were identified in the software, results were tabulated by computer, and change maps were printed at 1:24,000 scale.
Optimization techniques applied to passive measures for in-orbit spacecraft survivability
NASA Technical Reports Server (NTRS)
Mog, Robert A.; Price, D. Marvin
1991-01-01
Spacecraft designers have always been concerned about the effects of meteoroid impacts on mission safety. The engineering solution to this problem has generally been to erect a bumper or shield placed outboard from the spacecraft wall to disrupt/deflect the incoming projectiles. Spacecraft designers have a number of tools at their disposal to aid in the design process. These include hypervelocity impact testing, analytic impact predictors, and hydrodynamic codes. Analytic impact predictors generally provide the best quick-look estimate of design tradeoffs. The most complete way to determine the characteristics of an analytic impact predictor is through optimization of the protective structures design problem formulated with the predictor of interest. Space Station Freedom protective structures design insight is provided through the coupling of design/material requirements, hypervelocity impact phenomenology, meteoroid and space debris environment sensitivities, optimization techniques and operations research strategies, and mission scenarios. Major results are presented.
Sustainable biorefining in wastewater by engineered extreme alkaliphile Bacillus marmarensis.
Wernick, David G; Pontrelli, Sammy P; Pollock, Alexander W; Liao, James C
2016-02-01
Contamination susceptibility, water usage, and inability to utilize 5-carbon sugars and disaccharides are among the major obstacles in industrialization of sustainable biorefining. Extremophilic thermophiles and acidophiles are being researched to combat these problems, but organisms which answer all the above problems have yet to emerge. Here, we present engineering of the unexplored, extreme alkaliphile Bacillus marmarensis as a platform for new bioprocesses which meet all these challenges. With a newly developed transformation protocol and genetic tools, along with optimized RBSs and antisense RNA, we engineered B. marmarensis to produce ethanol at titers of 38 g/l and 65% yields from glucose in unsterilized media. Furthermore, ethanol titers and yields of 12 g/l and 50%, respectively, were produced from cellobiose and xylose in unsterilized seawater and algal-contaminated wastewater. As such, B. marmarensis presents a promising approach for the contamination-resistant biorefining of a wide range of carbohydrates in unsterilized, non-potable seawater.
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, M.K.
1994-06-01
The United States Department of Energy (DOE) faces the major task of cleaning up hundreds of waste sites across the nation, which will require completion of a large number of remedial investigation/feasibility studies (RI/FSs). The intent of each RI/FS is to characterize the waste problems and environmental conditions at the operable-unit level, segment the remediation problem into manageable medium-specific and contaminant-specific pieces, define corresponding remediation objectives, and identify remedial response actions to satisfy those objectives. The RI/FS team can then identify combinations of remediation technologies that will meet the remediation objectives. Finally, the team must evaluate these remedial alternatives in terms of effectiveness, implementability, cost, and acceptability. The Remedial Action Assessment System (RAAS) is being developed by Pacific Northwest Laboratory (PNL) to support DOE in this effort.
Interactive visualization tools for the structural biologist.
Porebski, Benjamin T; Ho, Bosco K; Buckle, Ashley M
2013-10-01
In structural biology, management of a large number of Protein Data Bank (PDB) files and raw X-ray diffraction images often presents a major organizational problem. Existing software packages that manipulate these file types were not designed for these kinds of file-management tasks. This is typically encountered when browsing through a folder of hundreds of X-ray images, with the aim of rapidly inspecting the diffraction quality of a data set. To solve this problem, a useful functionality of the Macintosh operating system (OSX) has been exploited that allows custom visualization plugins to be attached to certain file types. Software plugins have been developed for diffraction images and PDB files, which in many scenarios can save considerable time and effort. The direct visualization of diffraction images and PDB structures in the file browser can be used to identify key files of interest simply by scrolling through a list of files.
Plazzotta, Fernando; Otero, Carlos; Luna, Daniel; de Quiros, Fernan Gonzalez Bernaldo
2013-01-01
Physicians do not always keep the problem list accurate, complete and up to date. The objective was to analyze natural language processing (NLP) techniques and inference rules as strategies for maintaining the completeness and accuracy of the problem list in EHRs, via a non-systematic literature review in PubMed covering the last 10 years. Strategies for maintaining the EHR problem list were analyzed in two respects: inputting problems into, and removing problems from, the problem list. NLP and inference rules show acceptable performance for inputting problems into the problem list; no studies using these techniques for removing problems have been published. Conclusion: both tools, NLP and inference rules, have had acceptable results for maintaining the completeness and accuracy of the problem list.
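A minimal sketch of the inference-rule strategy for inputting problems: a rule maps evidence recorded elsewhere in the EHR (here, an active medication) to a candidate problem missing from the list. The rule table, drug-to-problem pairs, and function are illustrative assumptions, not drawn from the reviewed studies; real systems use curated terminologies rather than a hand-written table.

```python
# Illustrative drug-to-problem inference rules (hypothetical table).
RULES = {
    "metformin": "diabetes mellitus",
    "levothyroxine": "hypothyroidism",
    "albuterol": "asthma/copd",
}

def suggest_problems(medications, problem_list):
    """Return problems implied by active medications but missing
    from the current problem list (case-insensitive comparison)."""
    current = {p.lower() for p in problem_list}
    return sorted({RULES[m] for m in medications
                   if m in RULES and RULES[m] not in current})

suggested = suggest_problems(["metformin", "aspirin"], ["Hypertension"])
# -> ["diabetes mellitus"]
```

In practice such suggestions are surfaced to the physician for confirmation rather than written to the problem list automatically, which is consistent with the "acceptable performance" framing above.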
Novices and Experts in Geoinformatics: the Cognitive Gap.
NASA Astrophysics Data System (ADS)
Zhilin, M.
2012-04-01
Modern geoinformatics is an extremely powerful tool for problem analysis and decision making in various fields. Currently the general public uses geoinformatics predominantly for navigation (GPS) and for sharing information about particular places (GoogleMaps, Wikimapia). Communities also use geoinformatics for particular purposes: fans of history use it to align historical and current maps (www.retromap.ru), birdwatchers mark places where they have seen birds (geobirds.com/rangemaps), etc. However, the majority of stakeholders, such as local authorities, are not aware of the advantages and possibilities of geoinformatics. The same problem is observed for students. At the same time, many professional geoinformatic tools have been developed, but sometimes the experts cannot even explain their purpose to non-experts. So the question is how to shrink the gap between experts and non-experts in the understanding and application of geoinformatics. We think that this gap has a cognitive basis. According to modern cognitive theories (the Atkinson-Shiffrin model and its descendants), information first has to pass through a perceptual filter that cuts off whatever seems irrelevant. The mind estimates relevance implicitly (unconsciously), based on previous knowledge and judgments about what is important. The information then comes to working memory, which is used (a) for processing and (b) for problem solving. Working memory has limited capacity and can operate with only about seven objects simultaneously. Information then passes to long-term memory, which is of unlimited capacity. There it is stored as more or less complex structures with associative links. When necessary, it is extracted into working memory. If a great amount of information is linked ("chunked"), working memory operates with it as one object of the seven, thus overcoming the limitations of working-memory capacity.
To be adopted, information should (a) pass through the perceptual filter, (b) not overload working memory, and (c) be structured in long-term memory. Experts easily adopt domain-specific information because they (a) understand the terminology and consider the information important, so it passes the perceptual filter, and (b) possess many complex domain-specific chunks that working memory processes as single units, thus avoiding overload. Novices (students and the general public) have neither that understanding and sense of importance nor the necessary chunks. The following measures should be taken to bridge experts' and novices' understanding of geoinformatics. The expert community should popularize geoscientific problems, developing understandable language and accessible tools for solving them. This requires close collaboration with the educational system (especially secondary education). If students understand a problem, they can find and apply an appropriate tool for it. Geoscientific problems and models are extremely complex; in cognitive terms, they require a hierarchy of chunks. This hierarchy should develop coherently, beginning with simple chunks and later joining them into complex ones, which requires an appropriate sequence of learning tasks. Perfectly correct solutions are not necessary: students should understand how the problems are solved and recognize the limitations of the models. We think that tasks such as weather forecasting and global climate modelling are suitable. The first step in bridging experts and novices is the elaboration of a set and sequence of learning tasks, as well as tools for their solution. The tools should be easy for anyone who understands the task, and as versatile as possible; otherwise students will waste a lot of time mastering them. This development requires close collaboration between geoscientists and educators.
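The chunking mechanism described above can be caricatured as a toy simulation: working memory holds at most the classic seven objects, but a group of items recognized as a stored chunk occupies a single slot. The chunk contents here are invented for illustration:

```python
# Toy model of chunking: known groups of items collapse into one working
# memory slot; unchunked items occupy one slot each. The chunk dictionary
# is an invented example.
CAPACITY = 7
chunks = {"map projection": {"datum", "ellipsoid", "coordinate system"}}

def load(items, chunks):
    """Greedily replace known groups of items with a single chunk label."""
    items = set(items)
    loaded = []
    for label, members in chunks.items():
        if members <= items:          # the whole chunk is present
            items -= members
            loaded.append(label)      # the chunk occupies one slot
    loaded.extend(items)              # leftovers take one slot apiece
    return loaded if len(loaded) <= CAPACITY else None  # None = overload

print(load({"datum", "ellipsoid", "coordinate system", "scale"}, chunks))
# → ['map projection', 'scale']  (two slots instead of four)
```

An expert with the right chunks fits material into working memory that would overload a novice holding the same items one by one.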
Teaching medical professionals and trainees about adolescent suicide prevention: five key problems.
Sher, Leo
2012-01-01
Predicting and preventing suicide represent very difficult challenges for clinicians. The awareness of adolescent suicide as a major social and medical problem has increased over the past years. However, many health care professionals who have frequent contact with adolescents are not sufficiently trained in suicide evaluation techniques and approaches to adolescents with suicidal behavior. Suicide prevention efforts among adolescents are restricted by the fact that there are five key problems related to the evaluation and management of suicidality in adolescents: 1. Many clinicians underestimate the importance of the problem of adolescent suicidal behavior and underestimate its prevalence. 2. There is a misconception that direct questioning of adolescents about suicidality is sufficient to evaluate suicide risk. 3. Another misconception is that adolescents with non-psychiatric illnesses do not need to be evaluated for suicidality. 4. Many clinicians do not know about or underestimate the role of contagion in adolescent suicidal behavior. 5. There is a mistaken belief that adolescent males are at lower suicide risk than adolescent females. Educating medical professionals and trainees about the warning signs and symptoms of adolescent suicide and providing them with tools to recognize, evaluate, and manage suicidal patients represent a promising approach to adolescent suicide prevention.
A Silent Revolution: From Sketching to Coding--A Case Study on Code-Based Design Tool Learning
ERIC Educational Resources Information Center
Xu, Song; Fan, Kuo-Kuang
2017-01-01
Along with the rise of information technology, Computer Aided Design activities are becoming more modern and more complex. However, learning how to operate these new design tools has become the main problem facing every designer. This study aimed to identify problems encountered during the code-based design tool learning period of…
ERIC Educational Resources Information Center
Li, Rui; Liu, Min
2007-01-01
The purpose of this study is to examine the potential of using computer databases as cognitive tools to share learners' cognitive load and facilitate learning in a multimedia problem-based learning (PBL) environment designed for sixth graders. Two research questions were: (a) can the computer database tool share sixth-graders' cognitive load? and…
Brack, Werner; Altenburger, Rolf; Schüürmann, Gerrit; Krauss, Martin; López Herráez, David; van Gils, Jos; Slobodnik, Jaroslav; Munthe, John; Gawlik, Bernd Manfred; van Wezel, Annemarie; Schriks, Merijn; Hollender, Juliane; Tollefsen, Knut Erik; Mekenyan, Ovanes; Dimitrov, Saby; Bunke, Dirk; Cousins, Ian; Posthuma, Leo; van den Brink, Paul J; López de Alda, Miren; Barceló, Damià; Faust, Michael; Kortenkamp, Andreas; Scrimshaw, Mark; Ignatova, Svetlana; Engelen, Guy; Massmann, Gudrun; Lemkine, Gregory; Teodorovic, Ivana; Walz, Karl-Heinz; Dulio, Valeria; Jonker, Michiel T O; Jäger, Felix; Chipman, Kevin; Falciani, Francesco; Liska, Igor; Rooke, David; Zhang, Xiaowei; Hollert, Henner; Vrana, Branislav; Hilscherova, Klara; Kramer, Kees; Neumann, Steffen; Hammerbacher, Ruth; Backhaus, Thomas; Mack, Juliane; Segner, Helmut; Escher, Beate; de Aragão Umbuzeiro, Gisela
2015-01-15
SOLUTIONS (2013 to 2018) is a European Union Seventh Framework Programme Project (EU-FP7). The project aims to deliver a conceptual framework to support the evidence-based development of environmental policies with regard to water quality. SOLUTIONS will develop the tools for the identification, prioritisation and assessment of those water contaminants that may pose a risk to ecosystems and human health. To this end, a new generation of chemical and effect-based monitoring tools is developed and integrated with a full set of exposure, effect and risk assessment models. SOLUTIONS attempts to address legacy, present and future contamination by integrating monitoring and modelling based approaches with scenarios on future developments in society, economy and technology and thus in contamination. The project follows a solutions-oriented approach by addressing major problems of water and chemicals management and by assessing abatement options. SOLUTIONS takes advantage of the access to the infrastructure necessary to investigate the large basins of the Danube and Rhine as well as relevant Mediterranean basins as case studies, and puts major efforts on stakeholder dialogue and support. Particularly, the EU Water Framework Directive (WFD) Common Implementation Strategy (CIS) working groups, International River Commissions, and water works associations are directly supported with consistent guidance for the early detection, identification, prioritisation, and abatement of chemicals in the water cycle. SOLUTIONS will give a specific emphasis on concepts and tools for the impact and risk assessment of complex mixtures of emerging pollutants, their metabolites and transformation products. Analytical and effect-based screening tools will be applied together with ecological assessment tools for the identification of toxicants and their impacts. 
The SOLUTIONS approach is expected to provide transparent and evidence-based candidates or River Basin Specific Pollutants in the case study basins and to assist future review of priority pollutants under the WFD as well as potential abatement options. Copyright © 2014 Elsevier B.V. All rights reserved.
Practical global oceanic state estimation
NASA Astrophysics Data System (ADS)
Wunsch, Carl; Heimbach, Patrick
2007-06-01
The problem of oceanographic state estimation, by means of an ocean general circulation model (GCM) and a multitude of observations, is described and contrasted with the meteorological process of data assimilation. In practice, all such methods reduce, on the computer, to forms of least-squares. The global oceanographic problem is at the present time focussed primarily on smoothing, rather than forecasting, and the data types are unlike meteorological ones. As formulated in the consortium Estimating the Circulation and Climate of the Ocean (ECCO), an automatic differentiation tool is used to calculate the so-called adjoint code of the GCM, and the method of Lagrange multipliers is used to render the problem one of unconstrained least-squares minimization. Major problems today lie less with the numerical algorithms (least-squares problems can be solved by many means) than with the issues of data and model error. Results of ongoing calculations covering the period of the World Ocean Circulation Experiment, and including among other data, satellite altimetry from TOPEX/POSEIDON, Jason-1, ERS-1/2, ENVISAT, and GFO, a global array of profiling floats from the Argo program, and satellite gravity data from the GRACE mission, suggest that the solutions are now useful for scientific purposes. Both methodology and applications are developing in a number of different directions.
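The adjoint/Lagrange-multiplier formulation can be illustrated with a scalar stand-in (emphatically not the ECCO machinery): a one-variable linear model, a least-squares data misfit, and a reverse sweep that plays the role of the automatically differentiated GCM code:

```python
import random

# Toy scalar analogue of adjoint-based state estimation: dynamics
# x_k = a * x_{k-1} with unknown initial condition x0, and misfit
# J(x0) = sum_k (x_k - y_k)^2, whose gradient is computed by a reverse
# (adjoint) sweep. All numbers are synthetic.
a, n = 0.9, 20
random.seed(0)
y = [2.0 * a ** k + 0.1 * random.gauss(0, 1) for k in range(n)]  # noisy obs

def cost_and_grad(x0):
    r = [x0 * a ** k - y[k] for k in range(n)]   # forward sweep + misfit
    lam = 0.0                                    # adjoint (Lagrange multiplier)
    for k in range(n - 1, -1, -1):               # reverse sweep
        lam = 2.0 * r[k] + a * lam               # lam ends up equal to dJ/dx0
    return sum(rk * rk for rk in r), lam

x0 = 0.0                                         # unconstrained least-squares
for _ in range(200):                             # plain gradient descent
    _, g = cost_and_grad(x0)
    x0 -= 0.05 * g                               # converges near the true 2.0
```

The reverse sweep costs about the same as one forward model run, which is why adjoint methods scale to control vectors with millions of elements where finite differences cannot.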
Braun, M Miles
2013-10-01
Study of complementary and alternative medicine's mind and body interventions (CAM-MABI) is hindered not only by the inability to mask participants and their teachers to the study intervention but also by the major practical hurdles of long-term study of practices that can be lifelong. Two other important methodological problems are that study of newly trained practitioners cannot directly address long-term practice, and that long-term practitioners likely self-select in ways that make finding appropriate controls (or a comparison group) challenging. The temporary practice pause then resumption study design (TPPR) introduced here is a new tool that extends the withdrawal study design, established in the field of drug evaluation, to the field of CAM-MABI. With the exception of the inability to mask, TPPR can address the other methodological problems noted above. Of great interest to investigators will likely be measures in practitioners of CAM-MABI that change with temporary pausing of CAM-MABI practice, followed by return of the measures to pre-pause levels with resumption of practice; this would suggest a link of the practice to measured changes. Such findings using this tool may enhance our insight into fundamental biological processes, leading to beneficial practical applications.
Students' perceptions of the relevance of mathematics in engineering
NASA Astrophysics Data System (ADS)
Flegg, Jennifer; Mallet, Dann; Lupton, Mandy
2012-09-01
In this article, we report on the findings of an exploratory study into the experience of students as they learn first year engineering mathematics. Here we define engineering as the application of mathematics and sciences to the building and design of projects for the use of society [M. Kirschenman and B. Brenner, Education for Civil Engineering: A Profession of Practice, Leader. Manag. Eng. 10 (2010), p. 54]. Qualitative and quantitative data on students' views of the relevance of their mathematics study to their engineering studies and future careers in engineering were collected. The students described using a range of mathematics techniques (mathematics skills developed, mathematics concepts applied to engineering and skills developed relevant for engineering) in various ways (as a subject of study, as a tool for other subjects, or as a tool for real-world problems). We found that a number of themes relating to the design of the engineering mathematics curriculum emerged from the data. These included the relevance of mathematics within different engineering majors, the relevance of mathematics to future studies, the relevance of learning mathematical rigour, and the capacity of problem-solving tasks to convey the relevance of mathematics more effectively than other forms of assessment. We make recommendations for the design of engineering mathematics curriculum based on our findings.
Guindo, Gabriel; Dubourg, Dominique; Marchal, Bruno; Blaise, Pierre; De Brouwere, Vincent
2004-10-01
A national retrospective survey on the unmet need for major obstetric surgery using the Unmet Obstetric Need Approach was carried out in Mali in 1999. In Koutiala, the district health team decided to carry on the monitoring of the met need for several years in order to assess their progress over time. The first prospective study, for 1999, estimated that more than 100 women in need of obstetric care never reached the hospital and probably died as a consequence. This surprising result shocked the district health team and the resulting increased awareness of service deficits triggered operational measures to tackle the problem. The Unmet Obstetric Need study in Koutiala district was implemented without financial support and only limited external technical back-up. The appropriation of the study by the district team for solving local problems of access to obstetric care may have contributed to the success of the experience. Used as a health service management tool, the study and its results started a dialogue between the hospital staff and both health centre staff and community representatives. This had not only the effect of triggering consideration of coverage, but also of quality of obstetric care. Copyright 2004 Oxford University Press
Broadening the application of evolutionarily based genetic pest management.
Gould, Fred
2008-02-01
Insect- and tick-vectored diseases such as malaria, dengue fever, and Lyme disease cause human suffering, and current approaches for prevention are not adequate. Invasive plants and animals such as Scotch broom, zebra mussels, and gypsy moths continue to cause environmental damage and economic losses in agriculture and forestry. Rodents transmit diseases and cause major pre- and postharvest losses, especially in less affluent countries. Each of these problems might benefit from the developing field of Genetic Pest Management that is conceptually based on principles of evolutionary biology. This article briefly describes the history of this field, new molecular tools in this field, and potential applications of those tools. There will be a need for evolutionary biologists to interact with researchers and practitioners in a variety of other fields to determine the most appropriate targets for genetic pest management, the most appropriate methods for specific targets, and the potential of natural selection to diminish the effectiveness of genetic pest management. In addition to producing environmentally sustainable pest management solutions, research efforts in this area could lead to new insights about the evolution of selfish genetic elements in natural systems and will provide students with the opportunity to develop a more sophisticated understanding of the role of evolutionary biology in solving societal problems.
Skipworth, R J E; Terrace, J D; Fulton, L A; Anderson, D N
2008-11-01
Imposed reductions in working hours will impact significantly on the ability of surgical trainees to achieve competency. The objective of this study was to obtain the opinions of Scottish surgical trainees concerning the training they receive, in order to inform and guide the development of future, high-standard training programmes. An anonymous questionnaire was sent to basic surgical trainees on the Edinburgh, Aberdeen and Dundee Basic Surgical Rotations commencing after August 2002. Thirty-six questionnaire responses were analysed. Very few of the returned comments were complimentary to the existing training structure; indeed, most comments demonstrated significant trainee disappointment. Despite "regular" exposure to operative sessions, training tutorials and named consultant trainers, the most common concern was a perceived lack of high-quality, structured operative exposure and responsibility. Textbooks and journals remain the most frequently used learning tools, with high-tech systems such as teleconferencing, videos, CD-ROMs, and DVDs being poorly exploited. Current surgical training is not meeting the expectations of the majority of its trainees. Solving this problem will require extensive revision of attitudes and of the current educational format. A greater emphasis on the integration of 21st-century learning tools in the training programme may help bridge this gap.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
... Authorization Act for Fiscal Year 2009. Section 815 requires acquisition plans for major weapons systems to... hardware for major defense acquisition programs through the end of the service life of the related weapons... affects all contracts for major weapons that will require special tooling associated with the production...
Optimization problems in natural gas transportation systems. A state-of-the-art review
Ríos-Mercado, Roger Z.; Borraz-Sánchez, Conrado
2015-03-24
Our paper provides a review of the most relevant research conducted to solve natural gas transportation problems via pipeline systems. The literature reveals three major groups of gas pipeline systems, namely gathering, transmission, and distribution systems. In this work, we aim at presenting a detailed discussion of the efforts made in optimizing natural gas transmission lines. There is certainly a vast amount of research done over the past few years on many decision-making problems in the natural gas industry and, specifically, in pipeline network optimization. In this work, we present a state-of-the-art survey focusing on specific categories that include short-term storage (line-packing problems), gas quality satisfaction (pooling problems), and compressor station modeling (fuel cost minimization problems). We also discuss both steady-state and transient optimization models, highlighting the modeling aspects and the most relevant solution approaches known to date. Although the literature on natural gas transmission system problems is quite extensive, this is, to the best of our knowledge, the first comprehensive review or survey covering this specific research area on natural gas transmission from an operations research perspective. Furthermore, this paper includes a discussion of the most important and promising research areas in this field. Hence, our paper can serve as a useful tool to gain insight into the evolution of the many real-life applications and most recent advances in solution methodologies arising from this exciting and challenging research area of decision-making problems.
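The survey's fuel-cost minimization theme can be caricatured with a one-node merit-order dispatch: when costs are linear, filling the cheapest route first is optimal. The routes, capacities, and costs below are invented numbers; real transmission models are nonlinear and often transient:

```python
# Toy fuel-cost minimization for gas delivery to one demand node: dispatch
# flow over the cheapest routes first. All figures are invented; this is a
# sketch of the linear-cost special case, not a pipeline network model.
routes = [                  # (name, capacity, fuel cost per unit of flow)
    ("A", 100.0, 3.0),
    ("B", 80.0, 5.0),
]
demand = 120.0

def dispatch(routes, demand):
    """Merit-order dispatch: optimal here because costs are linear."""
    plan, remaining = {}, demand
    for name, cap, _cost in sorted(routes, key=lambda r: r[2]):
        plan[name] = min(cap, remaining)
        remaining -= plan[name]
    if remaining > 1e-9:
        raise ValueError("demand exceeds total capacity")
    return plan

plan = dispatch(routes, demand)
print(plan)  # → {'A': 100.0, 'B': 20.0}
```

Compressor fuel costs are actually nonlinear in pressure ratio and flow, which is why the surveyed literature turns to NLP and dynamic programming rather than this greedy shortcut.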
Global initiatives for improving hospital care for children: state of the art and future prospects.
Campbell, Harry; Duke, Trevor; Weber, Martin; English, Mike; Carai, Susanne; Tamburlini, Giorgio
2008-04-01
Deficiencies in the quality of health care are major limiting factors to the achievement of the Millennium Development Goals for child and maternal health. Quality of patient care in hospitals is firmly on the agendas of Western countries but has been slower to gain traction in developing countries, despite evidence that there is substantial scope for improvement, that hospitals have a major role in child survival, and that inequities in quality may be as important as inequities in access. There is now substantial global experience of strategies and interventions that improve the quality of care for children in hospitals with limited resources. The World Health Organization has developed a toolkit that contains adaptable instruments, including a framework for quality improvement, evidence-based clinical guidelines in the form of the Pocket Book of Hospital Care for Children, teaching material, assessment, and mortality audit tools. These tools have been field-tested by doctors, nurses, and other child health workers in many developing countries. This collective experience was brought together in a global World Health Organization meeting in Bali in 2007. This article describes how many countries are achieving improvements in quality of pediatric care, despite limited resources and other major obstacles, and how the evidence has progressed in recent years from documenting the nature and scope of the problems to describing the effectiveness of innovative interventions. The challenges remain to bring these and other strategies to scale and to support research into their use, impact, and sustainability in different environments.
A Python tool to set up relative free energy calculations in GROMACS
Klimovich, Pavel V.; Mobley, David L.
2015-01-01
Free energy calculations based on molecular dynamics (MD) simulations have seen a tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of the present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with Lead Optimization Mapper [14], recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge [16]. Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations. PMID:26487189
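To make the mapping problem concrete, here is a deliberately simplified sketch of the bookkeeping such a setup tool performs: given an atom mapping between two ligands, emit dual-state entries in which unmapped atoms become dummies at the other end state. The atom names, types, and row layout are invented for illustration; the real GROMACS topology format and alchemical-setup.py output differ:

```python
# Hypothetical sketch of hybrid-topology bookkeeping for a relative free
# energy calculation. Atom names/types and the row format are invented;
# this is not the actual GROMACS or alchemical-setup.py data layout.
lig_a = {"C1": "c3", "C2": "c3", "O1": "oh"}   # atom name -> type, state A
lig_b = {"C1": "c3", "C2": "c3", "F1": "f"}    # atom name -> type, state B
mapping = {"C1": "C1", "C2": "C2"}             # common substructure (e.g. from LOMAP)

def hybrid_topology(a, b, mapping):
    rows = []
    for name, type_a in a.items():             # atoms present in state A
        type_b = b.get(mapping.get(name), "DUM")  # dummy if it vanishes
        rows.append((name, type_a, type_b))
    for name, type_b in b.items():             # atoms appearing only in B
        if name not in mapping.values():
            rows.append((name, "DUM", type_b))
    return rows

for row in hybrid_topology(lig_a, lig_b, mapping):
    print(row)
```

The hard part the abstract alludes to, finding the common substructure and a chemically sensible mapping, happens upstream (in LOMAP); once a mapping exists, topology generation is mechanical bookkeeping of this kind.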
Insightful problem solving and creative tool modification by captive nontool-using rooks
Bird, Christopher D.; Emery, Nathan J.
2009-01-01
The ability to use tools has been suggested to indicate advanced physical cognition in animals. Here we show that rooks, members of the corvid family that do not appear to use tools in the wild, are capable of insightful problem solving related to sophisticated tool use, including spontaneously modifying and using a variety of tools, shaping hooks out of wire, and using a series of tools in a sequence to gain a reward. It is remarkable that a species that does not use tools in the wild appears to possess an understanding of tools rivaling that of habitual tool users such as New Caledonian crows and chimpanzees. Our findings suggest that the ability to represent tools may be a domain-general cognitive capacity rather than an adaptive specialization, and they call into question the relationship between physical intelligence and wild tool use. PMID:19478068
Software management tools: Lessons learned from use
NASA Technical Reports Server (NTRS)
Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.
1985-01-01
Experience in inserting software project planning tools into more than 100 projects producing mission critical software are discussed. The problems the software project manager faces are listed along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.
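The abstract does not describe SoftCost-R's internals, but parametric cost models of its era share the shape of Boehm's basic COCOMO, whose organic-mode constants are published and can stand in here:

```python
# Basic COCOMO, organic mode, as a stand-in for the parametric estimation
# style of packages like SoftCost-R (which uses its own calibration).
# Constants 2.4/1.05 and 2.5/0.38 are Boehm's published organic-mode values.
def cocomo_organic(kloc):
    effort_pm = 2.4 * kloc ** 1.05        # effort in person-months
    schedule_m = 2.5 * effort_pm ** 0.38  # schedule in calendar months
    return effort_pm, schedule_m

# A hypothetical 32 KLOC mission-support tool:
effort, schedule = cocomo_organic(32)
print(round(effort, 1), round(schedule, 1))  # ≈ 91.3 person-months, 13.9 months
```

Tools like the ones surveyed wrap such a model in sensitivity analysis and cost drivers; the power-law core, however, is this small.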
Heuristic algorithms for solving of the tool routing problem for CNC cutting machines
NASA Astrophysics Data System (ADS)
Chentsov, P. A.; Petunin, A. A.; Sesekin, A. N.; Shipacheva, E. N.; Sholohov, A. E.
2015-11-01
The article is devoted to the problem of minimizing the path of the cutting tool for CNC shape cutting machines. This problem can be interpreted as a generalized travelling salesman problem. A version of the dynamic programming method for solving this problem was developed earlier. Unfortunately, that method can handle no more than about thirty contours. In this regard, the task of constructing a quasi-optimal route becomes relevant. In this paper we propose several quasi-optimal greedy algorithms. A comparison of the results of the exact and approximate algorithms is given.
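The greedy idea can be illustrated with a minimal nearest-neighbour route over tool pierce points. The coordinates are invented, and the paper's actual algorithms additionally handle cutting-order and contour constraints that this sketch ignores:

```python
import math

# Nearest-neighbour greedy route over tool pierce points: a minimal sketch
# of a quasi-optimal strategy for the tool routing problem. Coordinates are
# invented; real instances carry precedence and contour constraints.
points = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 5)]

def greedy_route(points, start=0):
    route, rest = [start], set(range(len(points))) - {start}
    while rest:
        last = points[route[-1]]
        nxt = min(rest, key=lambda i: math.dist(last, points[i]))  # closest next
        route.append(nxt)
        rest.remove(nxt)
    return route

def length(route):
    return sum(math.dist(points[a], points[b]) for a, b in zip(route, route[1:]))

r = greedy_route(points)
print(r, round(length(r), 2))  # → [0, 3, 4, 2, 1] 11.66
```

Unlike the exact dynamic programming approach, this heuristic runs in O(n²) and scales far beyond thirty contours, at the price of no optimality guarantee.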
Teaching Problem-Solving and Critical-Thinking Skills Online Using Problem-Based Learning
ERIC Educational Resources Information Center
Romero, Liz; Orzechowski, Agnes; Rahatka, Ola
2014-01-01
The availability of technological tools is promoting a shift toward more student-centered online instruction. This article describes the implementation of a Problem-Based Learning (PBL) model and the technological tools used to meet the expectations of the model as well as the needs of the students. The end product is a hybrid course with eight…
Connecting Students and Policymakers through Science and Service-Learning
NASA Astrophysics Data System (ADS)
Szymanski, D. W.
2017-12-01
Successful collaborations in community science require the participation of non-scientists as advocates for the use of science in addressing complex problems. This is especially true, but particularly difficult, with respect to the wicked problems of sustainability. The complicated, unsolvable, and inherently political nature of challenges like climate change can provoke cynicism and apathy about the use of science. While science education is a critical part of preparing all students to address wicked problems, it is not sufficient. Non-scientists must also learn how to advocate for the role of science in policy solutions. Fortunately, the transdisciplinary nature of sustainability provides a venue for engaging all undergraduates in community science, regardless of major. I describe a model for involving non-science majors in a form of service-learning, where the pursuit of community science becomes a powerful pedagogical tool for civic engagement. Bentley University is one of the few stand-alone business schools in the United States and provides an ideal venue to test this model, given that 95% of Bentley's 4000 undergraduates major in a business discipline. The technology-focused business program is combined with an integrated arts & sciences curriculum and experiential learning opportunities though the nationally recognized Bentley Service-Learning and Civic Engagement Center. In addition to a required general education core that includes the natural sciences, students may opt to complete a second major in liberal studies with thematic concentrations like Earth, Environment, and Global Sustainability. In the course Science in Environmental Policy, students may apply to complete a service-learning project for an additional course credit. The smaller group of students then act as consultants, conducting research for a non-profit organization in the Washington, D.C. area involved in geoscience policy. At the end of the semester, students travel to D.C. 
and present their findings to the non-profit partner and make policy recommendations to legislators in Capitol Hill visits. The projects have been highly impactful as a form of community science, creating passionate science advocacy among non-majors, improving collaborations with community partners, and spurring action by federal policymakers.
Cardoso, Raphael Moura; Ottoni, Eduardo B
2016-11-01
The effects of culture on individual cognition have become a core issue among cultural primatologists. Field studies with wild populations provide evidence on the role of social cues in the ontogeny of tool use in non-human primates, and on the transmission of such behaviours over generations through socially biased learning. Recent experimental studies have shown that cultural knowledge may influence problem solving in wild populations of chimpanzees. Here, we present the results from a field experiment comparing the performance of bearded capuchin monkeys (Sapajus libidinosus) from two wild savannah populations with distinct toolkits in a probing task. Only the population that already exhibited the customary use of probing tools succeeded in solving the new problem, suggesting that their cultural repertoire shaped their approach to the new task. Moreover, only this population, which uses stone tools in a broader range of contexts, tried to use them to solve the problem. Social interactions can affect the formation of learning sets and they affect the performance of the monkeys in problem solving. We suggest that behavioural traditions affect the ways non-human primates solve novel foraging problems using tools. © 2016 The Author(s).
Analysis of complex decisionmaking processes. [with application to jet engine development
NASA Technical Reports Server (NTRS)
Hill, J. D.; Ollila, R. G.
1978-01-01
The analysis of corporate decisionmaking processes related to major system developments is unusually difficult because of the number of decisionmakers involved in the process and the long development cycle. A method for analyzing such decision processes is developed and illustrated through its application to the analysis of the commercial jet engine development process. The method uses interaction matrices as the key tool for structuring the problem, recording data, and analyzing the data to establish the rank order of the major factors affecting development decisions. In the example, the use of interaction matrices permitted analysts to collect and analyze approximately 50 factors that influenced decisions during the four phases of the development cycle, and to determine the key influencers of decisions at each development phase. The results of this study indicate that the cost of new technology installed on an aircraft is the prime concern of the engine manufacturer.
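The interaction-matrix technique can be sketched in a few lines: score how strongly each factor influences each other factor, then rank factors by row sums. The factors and scores below are invented stand-ins, not the study's data:

```python
# Toy interaction matrix: M[i][j] scores how strongly factor i influences
# factor j (0-3). Row sums rank the key influencers, as in the jet-engine
# study; the factors and scores here are invented for illustration.
factors = ["technology cost", "fuel efficiency", "noise rules", "schedule"]
M = [
    [0, 2, 1, 3],   # technology cost drives the other factors
    [1, 0, 0, 1],
    [2, 1, 0, 1],
    [1, 0, 0, 0],
]

influence = {f: sum(row) for f, row in zip(factors, M)}
ranked = sorted(influence, key=influence.get, reverse=True)
print(ranked[0])  # → technology cost
```

With roughly 50 factors per development phase, as in the study, the same row-sum ranking identifies the key influencers at each phase.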
[A computerised clinical decision-support system for the management of depression in Primary Care].
Aragonès, Enric; Comín, Eva; Cavero, Myriam; Pérez, Víctor; Molina, Cristina; Palao, Diego
Despite its clinical relevance and its importance as a public health problem, there are major gaps in the management of depression. Evidence-based clinical guidelines are useful for improving processes and clinical outcomes, and to ease their implementation such guidelines have been transformed into computerised clinical decision-support systems. This article describes the basics and characteristics of a new computerised clinical guideline for the management of major depression, developed in the public health system in Catalonia. The tool helps the clinician to establish a reliable and accurate diagnosis of depression and to choose, a priori, the best treatment according to the disease and the patient's characteristics. It also emphasises the importance of systematic monitoring to assess the clinical course and to adjust therapeutic interventions to the patient's needs at all times. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.
Safran, C
2014-08-15
To provide an overview of the benefits of clinical data collected as a by-product of the care process, the potential problems with large aggregations of these data, the policy frameworks that have been formulated, and the major challenges in the coming years. This report summarizes some of the major observations from AMIA and IMIA conferences held on this admittedly broad topic from 2006 through 2013. This report also includes many unsupported opinions of the author. The benefits of aggregating larger and larger sets of routinely collected clinical data are well documented and of great societal benefit. These large data sets will probably never answer all possible clinical questions for methodological reasons. Non-traditional sources of health data that are patient-sourced will pose new data science challenges. If we ever hope to have tools that can rapidly provide evidence for the daily practice of medicine, we need a science of health data, perhaps modeled after the science of astronomy.
OPTIMIZING USABILITY OF AN ECONOMIC DECISION SUPPORT TOOL: PROTOTYPE OF THE EQUIPT TOOL.
Cheung, Kei Long; Hiligsmann, Mickaël; Präger, Maximilian; Jones, Teresa; Józwiak-Hagymásy, Judit; Muñoz, Celia; Lester-George, Adam; Pokhrel, Subhash; López-Nicolás, Ángel; Trapero-Bertran, Marta; Evers, Silvia M A A; de Vries, Hein
2018-01-01
Economic decision-support tools can provide valuable information for tobacco control stakeholders, but their usability may impact the adoption of such tools. This study aims to illustrate a mixed-method usability evaluation of an economic decision-support tool for tobacco control, using the EQUIPT ROI tool prototype as a case study. A cross-sectional mixed methods design was used, including a heuristic evaluation, a thinking aloud approach, and a questionnaire testing and exploring the usability of the Return on Investment (ROI) tool. A total of sixty-six users evaluated the tool (thinking aloud) and completed the questionnaire. For the heuristic evaluation, four experts evaluated the interface. In total, twenty-one percent of the respondents perceived good usability. A total of 118 usability problems were identified, of which twenty-six were categorized as most severe, indicating a high priority to fix them before implementation. Combining user-based and expert-based evaluation methods is recommended, as these were shown to identify unique usability problems. The evaluation provides input to optimize usability of a decision-support tool, and may serve as a vantage point for other developers to conduct usability evaluations to refine similar tools before wide-scale implementation. Such studies could reduce implementation gaps by optimizing usability, enhancing in turn the research impact of such interventions.
Analysis Tools for CFD Multigrid Solvers
NASA Technical Reports Server (NTRS)
Mineck, Raymond E.; Thomas, James L.; Diskin, Boris
2004-01-01
Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.
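The two-grid idea underlying such analysis tools can be illustrated with a generic two-grid correction cycle. This is a minimal sketch for the 1-D Poisson problem -u'' = f, assuming weighted-Jacobi smoothing, injection restriction, and linear prolongation; it is not the Idealized Coarse Grid or Idealized Relaxation iterations themselves, which are specific to the paper's pilot solver.

```python
import numpy as np

def jacobi(u, f, h, sweeps=3, w=2.0 / 3.0):
    # weighted-Jacobi smoothing for -u'' = f on a uniform 1-D grid
    u = u.copy()
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def two_grid(u, f, h):
    u = jacobi(u, f, h)                               # pre-smoothing
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / (h * h)  # residual
    rc = r[::2]                                       # injection to coarse grid
    nc, hc = rc.size, 2 * h
    # exact solve of the coarse-grid error equation -e'' = r
    A = (2 * np.eye(nc - 2) - np.eye(nc - 2, k=1) - np.eye(nc - 2, k=-1)) / (hc * hc)
    ec = np.zeros(nc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])
    e = np.interp(np.arange(u.size), np.arange(0, u.size, 2), ec)  # prolongation
    return jacobi(u + e, f, h)                        # post-smoothing
```

The Idealized Coarse Grid and Idealized Relaxation tools described in the abstract work by replacing one of these two components (the coarse-grid correction or the relaxation) with an idealized version, to isolate which component degrades the solver's performance.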
A Comparative Study of Interval Management Control Law Capabilities
NASA Technical Reports Server (NTRS)
Barmore, Bryan E.; Smith, Colin L.; Palmer, Susan O.; Abbott, Terence S.
2012-01-01
This paper presents a new tool designed to allow for rapid development and testing of different control algorithms for airborne spacing. This tool, Interval Management Modeling and Spacing Tool (IM MAST), is a fast-time, low-fidelity tool created to model the approach of aircraft to a runway, with a focus on their interactions with each other. Errors can be induced between pairs of aircraft by varying initial positions, winds, speed profiles, and altitude profiles. Results to date show that only a few of the algorithms tested had poor behavior in the arrival and approach environment. The majority of the algorithms showed only minimal variation in performance under the test conditions. Trajectory-based algorithms showed high susceptibility to wind forecast errors, while performing marginally better than the other algorithms under other conditions. Trajectory-based algorithms have a sizable advantage, however, of being able to perform relative spacing operations between aircraft on different arrival routes and flight profiles without employing ghosting methods. This comes at the cost of substantially increased complexity, however. Additionally, it was shown that earlier initiation of relative spacing operations provided more time for corrections to be made without any significant problems in the spacing operation itself. Initiating spacing farther out, however, would require more of the aircraft to begin spacing before they merge onto a common route.
Prioritising coastal zone management issues through fuzzy cognitive mapping approach.
Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi
2012-04-30
Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. Then, we investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round table discussions were organized; one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), where the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone that were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.
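For readers unfamiliar with the tool, a fuzzy cognitive map iterates concept activations through a signed influence-weight matrix until they settle at a steady state. A minimal sketch, using a hypothetical three-concept map (the study's actual maps, spanning governance, infrastructure, environment, and sectoral factors, are far larger):

```python
import numpy as np

def fcm_step(state, W, lam=2.0):
    # one fuzzy-cognitive-map update: each concept's next activation is the
    # sigmoid-squashed weighted sum of the influences pointing at it
    return 1.0 / (1.0 + np.exp(-lam * (W @ state)))

def fcm_run(state, W, steps=100):
    # iterate the map until (for modest weights) it converges to a fixed point
    for _ in range(steps):
        state = fcm_step(state, W)
    return state
```

Analysts read off the converged activations to compare how strongly each stakeholder group's map drives a given concept; the weights themselves are elicited from the participants' drawn maps.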
Somme, Dominique; Hébert, Réjean; Bravo, Gina; Blanchard, François; Saint-Jean, Olivier
2007-01-01
Introduction One aspect of clinical integration involves case managers' tools and particularly the individualized service plan. Methods We examined individualized service plan content and use in the PRISMA experiment. We analyzed 50 charts, and conducted and recorded interviews regarding individualized service plan use with all the case managers concerned (n=13). Results Delays between starting case management and writing the individualized service plan were long and varied (0–596 days, mean: 117 days). During the interviews, the individualized service plan was described as the ‘last step’ once the active planning phase was over. The reasons for formulating plans were mainly administrative. From a clinical viewpoint, individualized service plans were used as memoranda and not to describe services (842 interventions not mentioned in the plans) or needs (694 active problems not mentioned). Case managers felt uncomfortable with the individualized planning task and expected a tool more adapted to their needs. Conclusion Although a majority of the case managers' charts contained an individualized service plan, implementation of this tool seems tenuous. Because of the discrepancy between the potential usefulness expected by case managers and their actual use, a working committee was created to develop proposals for modifying the instrument. PMID:19503736
Eckart, J Dana; Sobral, Bruno W S
2003-01-01
The emergent needs of the bioinformatics community challenge current information systems. The pace of biological data generation far outstrips Moore's Law. Therefore, a gap continues to widen between the capability to produce biological (molecular and cell) data sets and the capability to manage and analyze these data sets. As a result, Federal investments in large data set generation produce diminishing returns in terms of the community's capability to understand biology and to leverage that understanding to make scientific and technological advances that improve society. We are building an open framework to address various data management issues including data and tool interoperability, nomenclature and data communication standardization, and database integration. PathPort, short for Pathogen Portal, employs a generic, web-services based framework to deal with some of the problems identified by the bioinformatics community. The motivating research goal of a scalable system to provide data management and analysis for key pathosystems, especially relating to molecular data, has resulted in a generic framework using two major components. On the server-side, we employ web-services. On the client-side, a Java application called ToolBus acts as a client-side "bus" for contacting data and tools and viewing results through a single, consistent user interface.
NASA Astrophysics Data System (ADS)
Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.; White, Daniel R.; Badeau, Ryan
2017-12-01
We examine students' mathematical performance on quantitative "synthesis problems" with varying mathematical complexity. Synthesis problems are tasks comprising multiple concepts typically taught in different chapters. Mathematical performance refers to the formulation, combination, and simplification of equations. Generally speaking, formulation and combination of equations require conceptual reasoning; simplification of equations requires manipulation of equations as computational tools. Mathematical complexity is operationally defined by the number and the type of equations to be manipulated concurrently due to the number of unknowns in each equation. We use two types of synthesis problems, namely, sequential and simultaneous tasks. Sequential synthesis tasks require a chronological application of pertinent concepts, and simultaneous synthesis tasks require a concurrent application of the pertinent concepts. A total of 179 physics majors from a second-year mechanics course participated in the study. Data were collected from written tasks and individual interviews. Results show that mathematical complexity negatively influences the students' mathematical performance on both types of synthesis problems. However, for the sequential synthesis tasks, it interferes only with the students' simplification of equations. For the simultaneous synthesis tasks, mathematical complexity additionally impedes the students' formulation and combination of equations. Several reasons may explain this difference, including the students' different approaches to the two types of synthesis problems, cognitive load, and the variation of mathematical complexity within each synthesis type.
Kano, Yoshinobu; Nguyen, Ngan; Saetre, Rune; Yoshida, Kazuhiro; Miyao, Yusuke; Tsuruoka, Yoshimasa; Matsubayashi, Yuichiro; Ananiadou, Sophia; Tsujii, Jun'ichi
2008-01-01
Recently, several text mining programs have reached a near-practical level of performance. Some systems are already being used by biologists and database curators. However, it has also been recognized that current Natural Language Processing (NLP) and Text Mining (TM) technology is not easy to deploy, since research groups tend to develop systems that cater specifically to their own requirements. One of the major reasons for the difficulty of deployment of NLP/TM technology is that re-usability and interoperability of software tools are typically not considered during development. While some effort has been invested in making interoperable NLP/TM toolkits, the developers of end-to-end systems still often struggle to reuse NLP/TM tools, and often opt to develop similar programs from scratch instead. This is particularly the case in BioNLP, since the requirements of biologists are so diverse that NLP tools have to be adapted and re-organized in a much more extensive manner than was originally expected. Although generic frameworks like UIMA (Unstructured Information Management Architecture) provide promising ways to solve this problem, the solution that they provide is only partial. In order for truly interoperable toolkits to become a reality, we also need sharable type systems and a developer-friendly environment for software integration that includes functionality for systematic comparisons of available tools, a simple I/O interface, and visualization tools. In this paper, we describe such an environment that was developed based on UIMA, and we show its feasibility through our experience in developing a protein-protein interaction (PPI) extraction system.
Chromatography in the detection and characterization of illegal pharmaceutical preparations.
Deconinck, Eric; Sacré, Pierre-Yves; Courselle, Patricia; De Beer, Jacques O
2013-09-01
Counterfeit and illegal pharmaceutical products are an increasing worldwide problem, and detecting and characterizing them constitutes a major challenge for analytical laboratories. Spectroscopic techniques such as infrared spectroscopy and Raman spectroscopy have always been the first methods of choice to detect counterfeits and illegal preparations, but due to the evolution in the seized products and the necessity of risk assessment, chromatographic methods are becoming more important in this domain. This review intends to give a general overview of the techniques described in literature to characterize counterfeit and illegal pharmaceutical preparations, focusing on the role of chromatographic techniques with different detection tools.
Vaccine antigen production in transgenic plants: strategies, gene constructs and perspectives.
Sala, Francesco; Manuela Rigano, M; Barbante, Alessandra; Basso, Barbara; Walmsley, Amanda M; Castiglione, Stefano
2003-01-30
Stable integration of a gene into the plant nuclear or chloroplast genome can transform higher plants (e.g. tobacco, potato, tomato, banana) into bioreactors for the production of subunit vaccines for oral or parenteral administration. This can also be achieved by using recombinant plant viruses as transient expression vectors in infected plants. The use of plant-derived vaccines may overcome some of the major problems encountered with traditional vaccination against infectious diseases, autoimmune diseases and tumours. They also offer a convenient tool against the threat of bio-terrorism. State of the art, experimental strategies, safety and perspectives are discussed in this article.
Antiprotozoal activity of proton-pump inhibitors.
Pérez-Villanueva, Jaime; Romo-Mancillas, Antonio; Hernández-Campos, Alicia; Yépez-Mulia, Lilián; Hernández-Luis, Francisco; Castillo, Rafael
2011-12-15
Parasitic diseases are still a major health problem in developing countries. In our effort to find new antiparasitic agents, in this Letter we report the in vitro antiprotozoal activity of omeprazole, lansoprazole, rabeprazole and pantoprazole against Trichomonas vaginalis, Giardia intestinalis and Entamoeba histolytica. Molecular modeling studies were an important tool to highlight the potential antiprotozoal activity of these drugs. Experimental evaluations revealed a strong activity for all compounds tested. Rabeprazole and pantoprazole were the most active compounds, having IC50 values in the nanomolar range, which were even better than metronidazole, the drug of choice for these parasites. Copyright © 2011 Elsevier Ltd. All rights reserved.
Non-gravitational perturbations and satellite geodesy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Milani, A.; Nobili, A.M.; Farinella, P.
1987-01-01
This book presents the basic ideas of the physics of non-gravitational perturbations and the mathematics required to compute their orbital effects. It conveys the relevance of the different problems that must be solved to achieve a given level of accuracy in orbit determination and in recovery of geophysically significant parameters. Selected Contents are: Orders of Magnitude of the Perturbing Forces, Tides and Apparent Forces, Tools from Celestial Mechanics, Solar Radiation Pressure-Direct Effects: Satellite-Solar Radiation Interaction, Long-Term Effects on Semi-Major Axis, Radiation Pressure-Indirect Effects: Earth-Reflected Radiation Pressure, Anisotropic Thermal Emission, Drag: Orbital Perturbations by a Drag-Like Force, and Charged Particle Drag.
Malware distributed collection and pre-classification system using honeypot technology
NASA Astrophysics Data System (ADS)
Grégio, André R. A.; Oliveira, Isabela L.; Santos, Rafael D. C.; Cansian, Adriano M.; de Geus, Paulo L.
2009-04-01
Malware has become a major threat in recent years due to the ease with which it spreads through the Internet. Malware detection has become difficult with the use of compression, polymorphic methods and techniques to detect and disable security software. These and other obfuscation techniques pose a problem for detection and classification schemes that analyze malware behavior. In this paper we propose a distributed architecture to improve malware collection using different honeypot technologies to increase the variety of malware collected. We also present a daemon tool developed to grab malware distributed through spam and a pre-classification technique that uses antivirus technology to separate malware in generic classes.
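The pre-classification step described groups collected samples into generic classes using antivirus labels. The following is a rough sketch of that idea only; the label formats, class names, and normalization rule below are invented for illustration, and the authors' actual scheme and AV engine are not specified in the abstract.

```python
import re
from collections import defaultdict

def generic_class(av_label):
    # hypothetical normalization: keep the broad family token of an AV label,
    # e.g. "Worm.Win32.AutoRun.gen" -> "worm"
    token = re.split(r"[./:_-]", av_label.strip().lower())[0]
    known = {"worm", "trojan", "backdoor", "virus", "rootkit"}
    return token if token in known else "unclassified"

def pre_classify(samples):
    # samples: iterable of (sample_hash, av_label) pairs from the collectors
    groups = defaultdict(list)
    for sample_hash, label in samples:
        groups[generic_class(label)].append(sample_hash)
    return dict(groups)
```

Grouping by coarse class before any deeper behavioral analysis lets a distributed collection system triage large volumes of samples cheaply, deferring expensive analysis to one representative per class.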
Knowledge Acquisition: A Review of Tools and Ideas.
1987-08-01
tools. However, none could be applied directly to solving the problem of acquiring knowledge for the ASPA. RECOMMENDATIONS Develop a tool based on...the social sciences. BACKGROUND Because of the newness and complexity of the knowledge acquisition problem, the background of the knowledge...4. Minimal (does not incorporate any unnecessary complexities) 5. Expected (experts are not in disagreement over any important aspect) (Grover 1983
ERIC Educational Resources Information Center
Watanabe, Tad
2015-01-01
The Common Core State Standards for Mathematics (CCSSM) (CCSSI 2010) identifies the strategic use of appropriate tools as one of the mathematical practices and emphasizes the use of pictures and diagrams as reasoning tools. Starting with the early elementary grades, CCSSM discusses students' solving of problems "by drawing." In later…
Grossi, Enzo
2006-05-03
In recent years a number of algorithms for cardiovascular risk assessment have been proposed to the medical community. These algorithms consider a number of variables and express their results as the percentage risk of developing a major fatal or non-fatal cardiovascular event in the following 10 to 20 years. The author has identified three major pitfalls of these algorithms, linked to the limitations of the classical statistical approach in dealing with this kind of nonlinear and complex information. The pitfalls are the inability to capture the disease complexity, the inability to capture process dynamics, and the wide confidence interval of individual risk assessment. Artificial Intelligence tools can provide potential advantage in trying to overcome these limitations. The theoretical background and some application examples related to artificial neural networks and fuzzy logic have been reviewed and discussed. The use of predictive algorithms to assess individual absolute risk of cardiovascular future events is currently hampered by methodological and mathematical flaws. The use of newer approaches, such as fuzzy logic and artificial neural networks, linked to artificial intelligence, seems to better address both the challenge of increasing complexity resulting from a correlation between predisposing factors, data on the occurrence of cardiovascular events, and the prediction of future events on an individual level.
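To give a flavor of the fuzzy-logic approach the review discusses, a toy risk rule can be built from triangular membership functions. Every threshold and the single rule below are illustrative assumptions, not a validated risk algorithm from the paper:

```python
def tri(x, a, b, c):
    # triangular fuzzy membership: 0 outside (a, c), peaking at 1 when x == b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_risk(age, systolic_bp):
    # one illustrative rule: IF age is "older" AND blood pressure is "high"
    # THEN risk is high; fuzzy AND is taken as the minimum of the memberships
    older = tri(age, 40.0, 70.0, 100.0)
    high_bp = tri(systolic_bp, 120.0, 160.0, 200.0)
    return min(older, high_bp)
```

The appeal for risk assessment is that the output degrades gracefully near category boundaries instead of jumping at a hard cutoff, which is one way such tools attempt to capture the nonlinearity the author highlights.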
The use of landsat 7 enhanced thematic mapper plus for mapping leafy spurge
Mladinich, C.S.; Bustos, M.R.; Stitt, S.; Root, R.; Brown, K.; Anderson, G.L.; Hager, S.
2006-01-01
Euphorbia esula L. (leafy spurge) is an invasive weed that is a major problem in much of the Upper Great Plains region, including parts of Montana, South Dakota, North Dakota, Nebraska, and Wyoming. Infestations in North Dakota alone have had a serious economic impact, estimated at $87 million annually in 1991, to the state's wildlife, tourism, and agricultural economy. Leafy spurge degrades prairie and badland ecosystems by displacing native grasses and forbs. It is a major threat to protected ecosystems in many national parks, national wild lands, and state recreational areas in the region. This study explores the use of Landsat 7 Enhanced Thematic Mapper Plus (Landsat) imagery and derived products as a management tool for mapping leafy spurge in Theodore Roosevelt National Park, in southwestern North Dakota. An unsupervised clustering approach was used to map leafy spurge classes and resulted in overall classification accuracies of approximately 63%. The use of Landsat imagery did not provide the accuracy required for detailed mapping of small patches of the weed. However, it demonstrated the potential for mapping broad-scale (regional) leafy spurge occurrence. This paper offers recommendations on the suitability of Landsat imagery as a tool for use by resource managers to map and monitor leafy spurge populations over large areas.
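The unsupervised clustering idea can be sketched with a plain k-means pass over per-pixel spectral vectors (one vector of band reflectances per pixel). This is a generic illustration; the study's actual clustering procedure is not detailed in the abstract, and operational remote-sensing workflows typically use more elaborate variants.

```python
import numpy as np

def kmeans(pixels, k, iters=20, seed=0):
    # plain unsupervised k-means on an array of shape (n_pixels, n_bands)
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), size=k, replace=False)].astype(float)
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(iters):
        # assign each pixel to its nearest spectral cluster center
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned pixels
        for j in range(k):
            members = pixels[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return labels, centers
```

After clustering, an analyst labels each spectral cluster (e.g. "leafy spurge", "grass", "bare ground") using ground-truth sites, which is where the reported ~63% overall accuracy would be assessed.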
The challenges of transitioning from linear to high-order overlay control in advanced lithography
NASA Astrophysics Data System (ADS)
Adel, M.; Izikson, P.; Tien, D.; Huang, C. K.; Robinson, J. C.; Eichelberger, B.
2008-03-01
In the lithography section of the ITRS 2006 update, at the top of the list of difficult challenges appears the text "overlay of multiple exposures including mask image placement". This is a reflection of the fact that today overlay is becoming a major yield risk factor in semiconductor manufacturing. Historically, lithographers have achieved sufficient alignment accuracy and hence layer to layer overlay control by relying on models which define overlay as a linear function of the field and wafer coordinates. These linear terms were easily translated to correctibles in the available exposure tool degrees of freedom on the wafer and reticle stages. However, as the 45 nm half pitch node reaches production, exposure tool vendors have begun to make available, and lithographers have begun to utilize so called high order wafer and field control, in which either look up table or high order polynomial models are modified on a product by product basis. In this paper, the major challenges of this transition will be described. It will include characterization of the sources of variation which need to be controlled by these new models and the overlay and alignment sampling optimization problem which needs to be addressed, while maintaining the ever tightening demands on productivity and cost of ownership.
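The linear model lithographers historically relied on expresses overlay error as a first-order function of the wafer coordinates and is fitted by least squares. The sketch below covers the x-direction of a simplified wafer-level model only (translation, scale, rotation); it is an assumption-laden illustration, as production models separate wafer and field terms and include more degrees of freedom, and the high-order models discussed in the paper add polynomial or look-up-table corrections on top.

```python
import numpy as np

def fit_linear_overlay(x, y, dx):
    # least-squares fit of the simplified linear wafer model for the
    # x-direction overlay error: dx = Tx + Sx * x - R * y
    A = np.column_stack([np.ones_like(x), x, -y])
    (tx, sx, rot), *_ = np.linalg.lstsq(A, dx, rcond=None)
    return tx, sx, rot
```

The fitted coefficients translate directly into exposure-tool correctables (stage offset, magnification, rotation); the residuals left after this fit are exactly the high-order content that the newer per-product models attempt to capture, which is why sampling must be optimized to observe it.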
Frimpong, Joseph Asamoah; Amo-Addae, Maame Pokuah; Adewuyi, Peter Adebayo; Hall, Casey Daniel; Park, Meeyoung Mattie; Nagbe, Thomas Knue
2017-01-01
The laboratory plays a major role in surveillance, including confirming the start and end of an outbreak. Knowing the causative agent for an outbreak informs the development of response strategies and management plans for a public health event. However, issues and challenges may arise that limit the effectiveness or efficiency of laboratories in surveillance. This case study applies a systematic approach to analyse gaps in laboratory surveillance, thereby improving the ability to mitigate these gaps. Although this case study concentrates on factors resulting in poor feedback from the laboratory, practice of this general approach to problem analysis will confer skills required in analysing most public health issues. This case study was developed based on a report submitted by the district surveillance officer in Grand Bassa County, Liberia, as a resident of the Liberian Frontline Field Epidemiology Training Program in 2016. This case study will serve as a training tool to reinforce lectures on surveillance problem analysis using the fishbone approach. It is designed for public health training in a classroom setting and can be completed within 2 hours 30 minutes.
Silva, Kathleen M; Gross, Thomas J; Silva, Francisco J
2015-03-01
In two experiments, we examined the effect of modifications to the features of a stick-and-tube problem on the stick lengths that adult humans used to solve the problem. In Experiment 1, we examined whether people's tool preferences for retrieving an out-of-reach object in a tube might more closely resemble those reported with laboratory crows if people could modify a single stick to an ideal length to solve the problem. Contrary to when adult humans have selected a tool from a set of ten sticks, asking people to modify a single stick to retrieve an object did not generally result in a stick whose length was related to the object's distance. Consistent with the prior research, though, the working length of the stick was related to the object's distance. In Experiment 2, we examined the effect of increasing the scale of the stick-and-tube problem on people's tool preferences. Increasing the scale of the task influenced people to select relatively shorter tools than they had selected in previous studies. Although the causal structures of the tasks used in the two experiments were identical, their results were not. This underscores the necessity of studying physical cognition in relation to a particular causal structure by using a variety of tasks and methods.
A System for Fault Management for NASA's Deep Space Habitat
NASA Technical Reports Server (NTRS)
Colombano, Silvano P.; Spirkovska, Liljana; Aaseng, Gordon B.; Mccann, Robert S.; Baskaran, Vijayakumar; Ossenfort, John P.; Smith, Irene Skupniewicz; Iverson, David L.; Schwabacher, Mark A.
2013-01-01
NASA's exploration program envisions the utilization of a Deep Space Habitat (DSH) for human exploration of the space environment in the vicinity of Mars and/or asteroids. Communication latencies with ground control of as long as 20+ minutes make it imperative that DSH operations be highly autonomous, as any telemetry-based detection of a systems problem on Earth could well occur too late to assist the crew with the problem. A DSH-based development program has been initiated to develop and test the automation technologies necessary to support highly autonomous DSH operations. One such technology is a fault management tool to support performance monitoring of vehicle systems operations and to assist with real-time decision making in connection with operational anomalies and failures. Toward that end, we are developing Advanced Caution and Warning System (ACAWS), a tool that combines dynamic and interactive graphical representations of spacecraft systems, systems modeling, automated diagnostic analysis and root cause identification, system and mission impact assessment, and mitigation procedure identification to help spacecraft operators (both flight controllers and crew) understand and respond to anomalies more effectively. In this paper, we describe four major architecture elements of ACAWS: Anomaly Detection, Fault Isolation, System Effects Analysis, and Graphic User Interface (GUI), and how these elements work in concert with each other and with other tools to provide fault management support to both the controllers and crew. We then describe recent evaluations and tests of ACAWS on the DSH testbed. The results of these tests support the feasibility and strength of our approach to failure management automation and enhanced operational autonomy.
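The interplay of the Anomaly Detection and System Effects Analysis elements can be caricatured as a limit check followed by effect propagation. The telemetry channels, limits, and effects table below are invented for illustration; ACAWS itself uses systems models, automated diagnosis, and root cause identification rather than a static lookup.

```python
# hypothetical telemetry limits and a system-effects map, for illustration only
LIMITS = {"cabin_pressure": (97.0, 103.0),   # kPa, nominal band
          "bus_voltage": (26.0, 30.0)}       # volts, nominal band
EFFECTS = {"bus_voltage": ["comm", "life_support_fans"]}

def detect_anomalies(telemetry):
    # Anomaly Detection: flag channels outside their nominal band
    return [ch for ch, v in telemetry.items()
            if ch in LIMITS and not (LIMITS[ch][0] <= v <= LIMITS[ch][1])]

def system_effects(anomalies):
    # System Effects Analysis: propagate each anomaly to impacted subsystems
    impacted = set()
    for ch in anomalies:
        impacted.update(EFFECTS.get(ch, []))
    return sorted(impacted)
```

In a real fault management tool the fault isolation layer sits between these two steps, reasoning from the pattern of anomalies to a root cause before impacts and mitigation procedures are presented to the crew, which is exactly the autonomy needed when ground control is 20+ minutes away.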
A System for Fault Management and Fault Consequences Analysis for NASA's Deep Space Habitat
NASA Technical Reports Server (NTRS)
Colombano, Silvano; Spirkovska, Liljana; Baskaran, Vijaykumar; Aaseng, Gordon; McCann, Robert S.; Ossenfort, John; Smith, Irene; Iverson, David L.; Schwabacher, Mark
2013-01-01
NASA's exploration program envisions the utilization of a Deep Space Habitat (DSH) for human exploration of the space environment in the vicinity of Mars and/or asteroids. Communication latencies with ground control of as long as 20+ minutes make it imperative that DSH operations be highly autonomous, as any telemetry-based detection of a systems problem on Earth could well occur too late to assist the crew with the problem. A DSH-based development program has been initiated to develop and test the automation technologies necessary to support highly autonomous DSH operations. One such technology is a fault management tool to support performance monitoring of vehicle systems operations and to assist with real-time decision making in connection with operational anomalies and failures. Toward that end, we are developing Advanced Caution and Warning System (ACAWS), a tool that combines dynamic and interactive graphical representations of spacecraft systems, systems modeling, automated diagnostic analysis and root cause identification, system and mission impact assessment, and mitigation procedure identification to help spacecraft operators (both flight controllers and crew) understand and respond to anomalies more effectively. In this paper, we describe four major architecture elements of ACAWS: Anomaly Detection, Fault Isolation, System Effects Analysis, and Graphic User Interface (GUI), and how these elements work in concert with each other and with other tools to provide fault management support to both the controllers and crew. We then describe recent evaluations and tests of ACAWS on the DSH testbed. The results of these tests support the feasibility and strength of our approach to failure management automation and enhanced operational autonomy
Hecht, Alan D; Ferster, Aaron; Summers, Kevin
2017-10-16
When the U.S. Environmental Protection Agency (EPA) was established nearly 50 years ago, the nation faced serious threats to its air, land, and water, which in turn impacted human health. These threats were effectively addressed by the creation of EPA (in 1970) and many subsequent landmark pieces of environmental legislation, which in turn significantly reduced threats to the Nation's environment and public health. A key element of historic legislation is research aimed at dealing with current and future problems. Today we face national and global challenges that go beyond classic media-specific (air, land, water) environmental legislation and require an integrated paradigm of action and engagement based on (1) innovation based on science and technology, (2) stakeholder engagement and collaboration, and (3) public education and support. This three-pronged approach recognizes that current environmental problems, which include social as well as physical and environmental factors, are best addressed through collaborative problem solving, the application of innovation in science and technology, and multiple stakeholder engagement. To achieve that goal, EPA's Office of Research and Development (ORD) is working directly with states and local communities to develop and apply a suite of accessible decision support tools (DST) that aim to improve environmental conditions, protect human health, enhance economic opportunity, and advance a resilient and sustainable society. This paper showcases joint EPA and state actions to develop tools and approaches that not only meet current environmental and public health challenges, but do so in a way that advances sustainable, healthy, and resilient communities well into the future. EPA's future plans should build on current work but aim to effectively respond to growing external pressures. Growing pressures from megatrends are a major challenge for the new Administration and for cities and states across the country.
The recent hurricanes hitting Texas and the Gulf Coast, part of the increase in extreme weather events, make it clear that building resilient infrastructure is a crucial step to sustainability.
NASA Astrophysics Data System (ADS)
Henderson, Michael
1997-08-01
The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the types of tools are restricted, problems such as incompatible input and output data structures, and dissimilar interfaces among tools that solve similar problems, can be addressed. The approach we have taken is to define interfaces to the objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools we have written for discretizing nonlinear differential equations, and it includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.
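The abstract's central idea — solver "tools" written against shared object interfaces rather than concrete implementations — can be illustrated outside C++. The sketch below is not NAO code; the class and function names are invented for illustration, using Python abstract base classes in place of NAO's C++ abstract classes.

```python
from abc import ABC, abstractmethod

class Function(ABC):
    """Minimal shared interface for functions, in the spirit of NAO's approach."""
    @abstractmethod
    def evaluate(self, x: float) -> float:
        ...

class Polynomial(Function):
    """One concrete implementation; coeffs[i] multiplies x**i."""
    def __init__(self, coeffs):
        self.coeffs = coeffs
    def evaluate(self, x):
        return sum(c * x ** i for i, c in enumerate(self.coeffs))

def bisect_root(f: Function, lo: float, hi: float, tol: float = 1e-10) -> float:
    """A solver 'tool' written against the interface, not any concrete class."""
    assert f.evaluate(lo) * f.evaluate(hi) < 0, "root must be bracketed"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f.evaluate(lo) * f.evaluate(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

root = bisect_root(Polynomial([-2.0, 0.0, 1.0]), 0.0, 2.0)  # f(x) = x**2 - 2
```

Any other `Function` implementation — a spline, a table lookup, a wrapped Netlib routine — could be passed to `bisect_root` unchanged, which is the incompatibility problem the interfaces are meant to solve.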
ERIC Educational Resources Information Center
Van Aalsvoort, Joke
2004-01-01
In a previous article, the problem of chemistry's lack of relevance in secondary chemical education was analysed using logical positivism as a tool. This article starts with the hypothesis that the problem can be addressed by means of activity theory, one of the important theories within the sociocultural school. The reason for this expectation is…
Development of Gis Tool for the Solution of Minimum Spanning Tree Problem using Prim's Algorithm
NASA Astrophysics Data System (ADS)
Dutta, S.; Patra, D.; Shankar, H.; Alok Verma, P.
2014-11-01
A minimum spanning tree (MST) of a connected, undirected and weighted network is a tree of that network containing all of its nodes, such that the sum of the weights of its edges is minimum among all possible spanning trees of the same network. In this study, we have developed a new GIS tool that uses the well-known Prim's algorithm to construct the minimum spanning tree of a connected, undirected and weighted road network. The algorithm operates on the weight (adjacency) matrix of a weighted network and helps to solve complex network MST problems easily, efficiently and effectively. Selecting the appropriate algorithm is essential; otherwise it will be very hard to obtain an optimal result. For a road transportation network, it is essential to find optimal results by considering all the necessary points based on a cost factor (time or distance). This paper solves the minimum spanning tree (MST) problem of a road network by finding its minimum span while considering all the important network junction points. GIS technology is commonly used to solve network-related problems such as the optimal path problem, the travelling salesman problem, vehicle routing problems, and location-allocation problems. In this study we therefore developed a customized GIS tool, written as a Python script in ArcGIS, to solve the MST problem for the road transportation network of Dehradun city, using distance and time as the impedance (cost) factors. The tool has several advantages: users do not need deep knowledge of the subject, as it is user-friendly and gives them access to varied information adapted to their needs. This GIS tool for MST can be applied to a nationwide plan in India, the Prime Minister Gram Sadak Yojana, to provide optimal all-weather road connectivity to unconnected villages (points).
This tool is also useful for constructing highways or railways spanning several cities optimally or connecting all cities with minimum total road length.
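The matrix-based variant of Prim's algorithm that the abstract describes can be sketched in a few lines of Python. The paper's tool is an ArcGIS Python script; the code below is a generic illustration on a made-up toy network, not the authors' implementation.

```python
INF = float("inf")

def prim_mst(w):
    """Prim's algorithm on a symmetric weight matrix w (w[i][j] = INF means no edge).
    Returns the MST edge list and its total weight."""
    n = len(w)
    in_tree = [False] * n
    best = [INF] * n        # cheapest known edge weight linking node i to the tree
    parent = [-1] * n
    best[0] = 0.0           # grow the tree starting from node 0
    edges, total = [], 0.0
    for _ in range(n):
        # pick the cheapest node not yet in the tree
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        total += best[u]
        if parent[u] != -1:
            edges.append((parent[u], u))
        for v in range(n):  # relax edges out of u
            if not in_tree[v] and w[u][v] < best[v]:
                best[v] = w[u][v]
                parent[v] = u
    return edges, total

# A toy 4-junction road network; weights could be distances or travel times.
w = [[INF, 1, 4, INF],
     [1, INF, 2, 6],
     [4, 2, INF, 3],
     [INF, 6, 3, INF]]
edges, total = prim_mst(w)  # spanning tree (0-1, 1-2, 2-3), total weight 6
```

With a dense weight matrix this runs in O(n²) time, which is the natural fit for the adjacency-matrix formulation the paper uses.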
Effectiveness of a Technology-Based Intervention to Teach Evidence-Based Practice: The EBR Tool.
Long, JoAnn D; Gannaway, Paula; Ford, Cindy; Doumit, Rita; Zeeni, Nadine; Sukkarieh-Haraty, Ola; Milane, Aline; Byers, Beverly; Harrison, LaNell; Hatch, Daniel; Brown, Justin; Proper, Sharlan; White, Patricia; Song, Huaxin
2016-02-01
As the world becomes increasingly digital, advances in technology have changed how students access evidence-based information. Research suggests that students overestimate their ability to locate quality online research and lack the skills needed to evaluate the scientific literature. Clinical nurses report relying on personal experience to answer clinical questions rather than searching evidence-based sources. To address the problem, a web-based, evidence-based research (EBR) tool that is usable from a computer, smartphone, or iPad was developed and tested. The purpose of the EBR tool is to guide students through the basic steps needed to locate and critically appraise the online scientific literature while linking users to quality electronic resources to support evidence-based practice (EBP). Testing of the tool took place in a mixed-method, quasi-experimental, and two-population randomized controlled trial (RCT) design in a U.S. and a Middle Eastern university. A statistically significant improvement in overall research skills was supported in the quasi-experimental nursing student group and the RCT nutrition student group using the EBR tool. A statistically significant proportional difference was supported in the RCT nutrition and PharmD intervention groups in participants' ability to distinguish the credibility of online source materials compared with controls. The majority of participants could correctly apply PICOTS to a case study when using the tool. The data from this preliminary study suggest that the EBR tool enhanced students' overall research skills and selected EBP skills while generating data for assessment of learning outcomes. The EBR tool places evidence-based resources at the fingertips of users by addressing some of the most commonly cited barriers to research utilization while exposing users to information and online literacy standards of practice, meeting a growing need within nursing curricula. © 2016 Sigma Theta Tau International.
Tool use disorders after left brain damage.
Baumard, Josselin; Osiurak, François; Lesourd, Mathieu; Le Gall, Didier
2014-01-01
In this paper we review studies that investigated tool use disorders in left-brain damaged (LBD) patients over the last 30 years. Four tasks are classically used in the field of apraxia: Pantomime of tool use, single tool use, real tool use and mechanical problem solving. Our aim was to address two issues, namely, (1) the role of mechanical knowledge in real tool use and (2) the cognitive mechanisms underlying pantomime of tool use, a task widely employed by clinicians and researchers. To do so, we extracted data from 36 papers and computed the difference between healthy subjects and LBD patients. On the whole, pantomime of tool use is the most difficult task and real tool use is the easiest one. Moreover, associations seem to appear between pantomime of tool use, real tool use and mechanical problem solving. These results suggest that the loss of mechanical knowledge is critical in LBD patients, even if all of those tasks (and particularly pantomime of tool use) might put differential demands on semantic memory and working memory.
Tu, S W; Eriksson, H; Gennari, J H; Shahar, Y; Musen, M A
1995-06-01
PROTEGE-II is a suite of tools and a methodology for building knowledge-based systems and domain-specific knowledge-acquisition tools. In this paper, we show how PROTEGE-II can be applied to the task of providing protocol-based decision support in the domain of treating HIV-infected patients. To apply PROTEGE-II, (1) we construct a decomposable problem-solving method called episodic skeletal-plan refinement, (2) we build an application ontology that consists of the terms and relations in the domain, and of method-specific distinctions not already captured in the domain terms, and (3) we specify mapping relations that link terms from the application ontology to the domain-independent terms used in the problem-solving method. From the application ontology, we automatically generate a domain-specific knowledge-acquisition tool that is custom-tailored for the application. The knowledge-acquisition tool is used for the creation and maintenance of domain knowledge used by the problem-solving method. The general goal of the PROTEGE-II approach is to produce systems and components that are reusable and easily maintained. This is the rationale for constructing ontologies and problem-solving methods that can be composed from a set of smaller-grained methods and mechanisms. This is also why we tightly couple the knowledge-acquisition tools to the application ontology that specifies the domain terms used in the problem-solving systems. Although our evaluation is still preliminary, for the application task of providing protocol-based decision support, we show that these goals of reusability and easy maintenance can be achieved. We discuss design decisions and the tradeoffs that have to be made in the development of the system.
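The mapping-relation idea — linking application-ontology terms to the method-independent vocabulary of a problem-solving method — can be sketched abstractly. All terms below are invented for illustration; PROTEGE-II itself is not a Python system, and the real mappings are richer than a name-to-name table.

```python
# Hypothetical application-ontology entries for a protocol-based therapy domain.
application_ontology = {
    "hiv_protocol": {"kind": "plan", "steps": ["induction", "maintenance"]},
    "drug_regimen": {"kind": "plan_step"},
}

# Mapping relations: domain terms -> domain-independent terms used by the
# episodic skeletal-plan-refinement problem-solving method.
mappings = {
    "hiv_protocol": "skeletal_plan",
    "drug_regimen": "plan_component",
}

def to_method_term(domain_term: str) -> str:
    """Resolve a domain term to the problem-solving method's vocabulary;
    unmapped terms pass through unchanged."""
    return mappings.get(domain_term, domain_term)
```

Keeping the mapping as a separate artifact is what lets the domain ontology and the problem-solving method evolve (and be reused) independently, which is the maintainability goal the abstract emphasizes.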
Drug use in a rural secondary school in Kenya.
Ndetei, David M; Khasakhala, Lincoln I; Mutiso, Victoria; Ongecha-Owuor, Francisca A; Kokonya, Donald A
2010-07-01
Alcohol abuse and alcohol-related problems among adolescents are highly prevalent and a major concern worldwide. This study estimated the prevalence of drug abuse, knowledge about drug abuse, and its effect on psychosocial well-being and induced behavioral problems among students of a public rural secondary school that admitted both girls and boys and offered both boarding and day-school facilities. The students filled out a self-reporting substance use tool which measures the prevalence, frequency, and general patterns of substance use. Alcohol, tobacco, khat (Catha edulis) and bhang (cannabis) were the most commonly reported substances of use, with user prevalence rates of 5.2%, 3.8%, 3.2%, and 1.7%, respectively. Tobacco use was initiated at 10 years, while cannabis, hard drugs, khat, and alcohol were initiated at 11, 12, 13, and 15 years of age, respectively. Among the students, 71% were aware that their schoolmates were on drugs, and 49.8%, 41.7%, 37.6%, 44.3%, and 32.4% of these students knew that using alcohol, tobacco, khat, cannabis, and hard drugs, respectively, was a behavioral problem in the school. Three quarters of the students were aware that use of drugs was harmful to their health, with the majority (78.6%) indicating that drug users need help to stop the drug use behavior. However, most (73.6%) of the students suggested that drug users in school should be punished. The drug-related behavioral problems included school dropout, poor scholastic attainment, drunken driving, delinquency, and adolescent pregnancy, which threaten the stability of the education system, the family as an institution (family difficulties) and society at large. Teachers therefore carry the added burden of playing an active role in guiding and counselling survivors of drug abuse, a problem facing teaching institutions, in addition to instilling knowledge.
Solution mechanism guide: implementing innovation within a research & development organization.
Keeton, Kathryn E; Richard, Elizabeth E; Davis, Jeffrey R
2014-10-01
In order to create a culture more open to novel problem-solving mechanisms, NASA's Human Health and Performance Directorate (HH&P) created a strategic knowledge management tool that educates employees about innovative problem-solving techniques, the Solution Mechanism Guide (SMG). The SMG is a web-based, interactive guide that leverages existing and innovative problem-solving methods and presents this information as a unique user experience so that the employee is empowered to make the best decision about which problem-solving tool best meets their needs. By integrating new and innovative methods with existing problem solving tools, the SMG seamlessly introduces open innovation and collaboration concepts within HH&P to more effectively address human health and performance risks. This commentary reviews the path of creating a more open and innovative culture within HH&P and the process and development steps that were taken to develop the SMG.
Computational Electronics and Electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeFord, J.F.
The Computational Electronics and Electromagnetics thrust area is a focal point for computer modeling activities in electronics and electromagnetics in the Electronics Engineering Department of Lawrence Livermore National Laboratory (LLNL). Traditionally, they have focused their efforts in technical areas of importance to existing and developing LLNL programs, and this continues to form the basis for much of their research. A relatively new and increasingly important emphasis for the thrust area is the formation of partnerships with industry and the application of their simulation technology and expertise to the solution of problems faced by industry. The activities of the thrust area fall into three broad categories: (1) the development of theoretical and computational models of electronic and electromagnetic phenomena, (2) the development of useful and robust software tools based on these models, and (3) the application of these tools to programmatic and industrial problems. In FY-92, they worked on projects in all of the areas outlined above. The object of their work on numerical electromagnetic algorithms continues to be the improvement of time-domain algorithms for electromagnetic simulation on unstructured conforming grids. The thrust area is also investigating various technologies for conforming-grid mesh generation to simplify the application of their advanced field solvers to design problems involving complicated geometries. They are developing a major code suite based on the three-dimensional (3-D), conforming-grid, time-domain code DSI3D. They continue to maintain and distribute the 3-D, finite-difference time-domain (FDTD) code TSAR, which is installed at several dozen university, government, and industry sites.
Evaluation of Tools for Protection of Interest against Hacking and Cracking
NASA Astrophysics Data System (ADS)
Jahankhani, Hossein; Antonijevic, Branko; Walcott, Terry
The internet, widely considered a tool that effectively ensures global communication, has been continuously hindered by hackers and crackers. In response, a multitude of network-facilitated tools, such as firewalls, virtual private networks (VPNs) and a variety of antivirus software packages, has been developed for dealing with such predicaments. However, more often than not these tools are marketed as perfect solutions to ever-growing problems such as loss of data and privacy in networked and worldwide intercommunications. In this paper we provide a forum for addressing these perceived problems.
Using hybrid expert system approaches for engineering applications
NASA Technical Reports Server (NTRS)
Allen, R. H.; Boarnet, M. G.; Culbert, C. J.; Savely, R. T.
1987-01-01
In this paper, the use of hybrid expert system shells and hybrid (i.e., algorithmic and heuristic) approaches for solving engineering problems is reported. Aspects of various engineering problem domains are reviewed for a number of examples with specific applications made to recently developed prototype expert systems. Based on this prototyping experience, critical evaluations of and comparisons between commercially available tools, and some research tools, in the United States and Australia, and their underlying problem-solving paradigms are made. Characteristics of the implementation tool and the engineering domain are compared and practical software engineering issues are discussed with respect to hybrid tools and approaches. Finally, guidelines are offered with the hope that expert system development will be less time consuming, more effective, and more cost-effective than it has been in the past.
An investigation of chatter and tool wear when machining titanium
NASA Technical Reports Server (NTRS)
Sutherland, I. A.
1974-01-01
The low thermal conductivity of titanium, together with the low contact area between chip and tool and the unusually high chip velocities, gives rise to high tool tip temperatures and accelerated tool wear. Machining speeds have to be considerably reduced to avoid these high temperatures with a consequential loss of productivity. Restoring this lost productivity involves increasing other machining variables, such as feed and depth-of-cut, and can lead to another machining problem commonly known as chatter. This work is to acquaint users with these problems, to examine the variables that may be encountered when machining a material like titanium, and to advise the machine tool user on how to maximize the output from the machines and tooling available to him. Recommendations are made on ways of improving tolerances, reducing machine tool instability or chatter, and improving productivity. New tool materials, tool coatings, and coolants are reviewed and their relevance examined when machining titanium.
Mulcahy, Nicholas J; Call, Josep; Dunbar, Robin I M
2005-02-01
Two important elements in problem solving are the abilities to encode relevant task features and to combine multiple actions to achieve the goal. The authors investigated these 2 elements in a task in which gorillas (Gorilla gorilla) and orangutans (Pongo pygmaeus) had to use a tool to retrieve an out-of-reach reward. Subjects were able to select tools of an appropriate length to reach the reward even when the position of the reward and tools were not simultaneously visible. When presented with tools that were too short to retrieve the reward, subjects were more likely to refuse to use them than when tools were the appropriate length. Subjects were proficient at using tools in sequence to retrieve the reward.
Generation of Multilayered 3D Structures of HepG2 Cells Using a Bio-printing Technique.
Jeon, Hyeryeon; Kang, Kyojin; Park, Su A; Kim, Wan Doo; Paik, Seung Sam; Lee, Sang-Hun; Jeong, Jaemin; Choi, Dongho
2017-01-15
Chronic liver disease is a major widespread cause of death, and whole liver transplantation is the only definitive treatment for patients with end-stage liver diseases. However, many problems, including donor shortage, surgical complications and cost, hinder its usage. Recently, tissue-engineering technology has provided a potential breakthrough for solving these problems. Three-dimensional (3D) printing technology has been used to mimic tissues and organs suitable for transplantation, but applications to the liver have been rare. A 3D bioprinting system was used to construct 3D printed hepatic structures using alginate. HepG2 cells were cultured on these 3D structures for 3 weeks and examined by fluorescence microscopy, histology and immunohistochemistry. The expression of liver-specific markers was quantified on days 1, 7, 14, and 21. The cells grew well on the alginate scaffold, and liver-specific gene expression increased. The cells grew more extensively in 3D culture than in two-dimensional culture and exhibited better structural aspects of the liver, indicating that the 3D bioprinting method recapitulates the liver architecture. The 3D bioprinting of hepatic structures appears feasible. This technology may become a major tool and provide a bridge between basic science and the clinical challenges of regenerative medicine of the liver.
The Rio de Janeiro Municipality's Services Portfolio and Health Actions in Primary Care in Brazil.
Salazar, Bianca Alves; Campos, Mônica Rodrigues; Luiza, Vera Lucia
2017-03-01
This study aimed to identify the provision of actions and procedures by family health teams (FHSt), based on Rio de Janeiro Municipality's (MRJ) Health Services Portfolio (HSP), and the main factors associated with this provision in the different population strata. Data from the National Program for Improving Access and Quality of Primary Healthcare were used, implemented at the national level with 17,202 FHSts from June to September 2012. Outcome variables were "FHSt belonging to MRJ" and "FHSt providing all nine CS-MRJ procedures". Uni-, bi- and multivariate analyses were performed. A better performance of the MRJ in relation to other major urban centers (EP6#) (p<5%) was noted in 10 of the 14 health actions analyzed. The electronic medical record showed a level of deployment in MRJ's FHSts of 96%, contrasting with 34% in the EP6# and 14% in Brazil. Both the MRJ and EP6# evidenced low supply of mental health services (about 56%). While the supply of low-complexity procedures was a major problem in large cities, the supply of health actions in the different health care lines was a larger problem in small municipalities. Overall, the MRJ showed better performance when compared to the average of large municipalities. The health service portfolio appeared to be an important management tool.
Acquiring an understanding of design: evidence from children's insight problem solving.
Defeyter, Margaret Anne; German, Tim P
2003-09-01
The human ability to make tools and use them to solve problems may not be zoologically unique, but it is certainly extraordinary. Yet little is known about the conceptual machinery that makes humans so competent at making and using tools. Do adults and children have concepts specialized for understanding human-made artifacts? If so, are these concepts deployed in attempts to solve novel problems? Here we present new data, derived from problem-solving experiments, which support the following. (i) The structure of the child's concept of artifact function changes profoundly between ages 5 and 7. At age 5, the child's conceptual machinery defines the function of an artifact as any goal a user might have; by age 7, its function is defined by the artifact's typical or intended use. (ii) This conceptual shift has a striking effect on problem-solving performance, i.e. the child's concept of artifact function appears to be deployed in problem solving. (iii) This effect on problem solving is not caused by differences in the amount of knowledge that children have about the typical use of a particular tool; it is mediated by the structure of the child's artifact concept (which organizes and deploys the child's knowledge). In two studies, children between 5 and 7 years of age were matched for their knowledge of what a particular artifact "is for", and then given a problem that can only be solved if that tool is used for an atypical purpose. All children performed well in a baseline condition. But when they were primed by a demonstration of the artifact's typical function, 5-year-old children solved the problem much faster than 6-7-year-old children. Because all children knew what the tools were for, differences in knowledge alone cannot explain the results. 
We argue that the older children were slower to solve the problem when the typical function was primed because (i) their artifact concept plays a role in problem solving, and (ii) intended purpose is central to their concept of artifact function, but not to that of the younger children.
Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.
Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set mimicking both background quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.
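As a rough illustration of what a synthetic absorption spectrum is — not trident's actual API, which the abstract does not show — a single Gaussian absorption line on a normalized continuum can be generated as follows. The line center (near Lyman-alpha), depth, and width are arbitrary example values.

```python
import numpy as np

def gaussian_absorption(wavelength, center, depth, sigma):
    """Normalized flux with one Gaussian absorption line on a flat continuum.
    depth is the fractional absorption at line center (0..1)."""
    return 1.0 - depth * np.exp(-0.5 * ((wavelength - center) / sigma) ** 2)

# Wavelength grid in angstroms around Lyman-alpha (1215.67 A), illustrative only.
wl = np.linspace(1210.0, 1222.0, 600)
flux = gaussian_absorption(wl, center=1215.67, depth=0.8, sigma=0.5)
```

Real tools add instrumental line-spread functions, noise, and many overlapping lines per ion species, but each line starts from a profile like this.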
Computer programing for geosciences: Teach your students how to make tools
NASA Astrophysics Data System (ADS)
Grapenthin, Ronni
2011-12-01
When I announced my intention to pursue a Ph.D. in geophysics, some people gave me confused looks, because I was working on a master's degree in computer science at the time. My friends, like many incoming geoscience graduate students, have trouble linking these two fields. From my perspective, it is pretty straightforward: Much of geoscience evolves around novel analyses of large data sets that require custom tools—computer programs—to minimize the drudgery of manual data handling; other disciplines share this characteristic. While most faculty adapted to the need for tool development quite naturally, as they grew up around computer terminal interfaces, incoming graduate students lack intuitive understanding of programing concepts such as generalization and automation. I believe the major cause is the intuitive graphical user interfaces of modern operating systems and applications, which isolate the user from all technical details. Generally, current curricula do not recognize this gap between user and machine. For students to operate effectively, they require specialized courses teaching them the skills they need to make tools that operate on particular data sets and solve their specific problems. Courses in computer science departments are aimed at a different audience and are of limited help.
Visualization of Atmospheric Water Vapor Data for SAGE
NASA Technical Reports Server (NTRS)
Kung, Mou-Liang; Chu, W. P. (Technical Monitor)
2000-01-01
The goal of this project was to develop visualization tools to study water vapor dynamics using the Stratospheric Aerosol and Gas Experiment II (SAGE II) water vapor data. During the past years, we completed the development of a visualization tool called EZSAGE, and various Gridded Water Vapor plots, tools deployed on the web to provide users with new insight into water vapor dynamics. Results and experiences from this project, including papers, tutorials and reviews, were published on the main Web page. An additional publishing effort has been initiated to package the EZSAGE software for CD production and distribution. There have been some major personnel changes since Fall 1998. Dr. Mou-Liang Kung, a Professor of Computer Science, assumed the PI position vacated by Dr. Waldo Rodriguez, who was on leave. However, former PI Dr. Rodriguez continued to serve as a research adviser to this project to assure a smooth transition and project completion. Typically in each semester, five student research assistants were hired and trained. Weekly group meetings were held to discuss problems, progress, new research directions, and activity planning. Other small-group meetings were also held regularly for different objectives of this project. All student research assistants were required to submit reports for conference submission.
Gleibs, Ilka H
2017-08-01
New technologies like large-scale social media sites (e.g., Facebook and Twitter) and crowdsourcing services (e.g., Amazon Mechanical Turk, Crowdflower, Clickworker) are impacting social science research and providing many new and interesting avenues for research. The use of these new technologies for research has not been without challenges, and a recently published psychological study on Facebook has led to a widespread discussion of the ethics of conducting large-scale experiments online. Surprisingly little has been said about the ethics of conducting research using commercial crowdsourcing marketplaces. In this article, I focus on the question of which ethical questions are raised by data collection with crowdsourcing tools. I briefly draw on the implications of Internet research more generally, and then focus on the specific challenges that research with crowdsourcing tools faces. I identify fair pay and the related issue of respect for autonomy, as well as problems with the power dynamic between researcher and participant, which has implications for withdrawal without prejudice, as the major ethical challenges of crowdsourced data. Furthermore, I wish to draw attention to how we can develop a "best practice" for researchers using crowdsourcing tools.
NASA Astrophysics Data System (ADS)
Fedonin, O. N.; Handozhko, A. V.; Fedukov, A. G.
2018-03-01
The problem of the mechanical processing, in particular grinding, of leucosapphire products is considered. The main difficulty with this treatment is the need to true the diamond tool. One method of tool truing using a loose-abrasive technique is considered. The results of a study on restoring the tool's cutting ability, shape and profile after truing are given.
Ngo, N S; Zhong, N; Bao, X
2018-04-15
Transboundary air pollution is a global environmental and public health problem including in the U.S., where pollution emissions from China, the largest emitter of anthropogenic air pollution in the world, can travel across the Pacific Ocean and reach places like California and Oregon. We examine the effects of transboundary air pollution following major events in China, specifically sandstorms, a natural-occurring source of air pollution, and Chinese New Year, a major 7-day holiday, on background air quality in the U.S. We focus on high elevation sites on the west coast between 2000 and 2013. We use regression analysis and a natural experiment to exploit the variation in the timing of these events in China, which are plausibly uncorrelated to other factors that affect air quality in China and the U.S. We find that sandstorms are associated with statistically significant increases in background coarse and fine particulate matter (PM) in the U.S., representing between 16 and 39% of average weekly PM levels. We also find Chinese New Year is associated with modest reductions in background air quality in the U.S., representing between 0.4 and 2.5% of PM levels. Findings are robust to different models and falsification tests. These results suggest that regression analysis could be a powerful tool to complement other, more widely used techniques in the environmental sciences that study this problem. This also has important implications for policymakers, who could track major sandstorms in China and prepare for possible increased foreign pollution emissions in the U.S. Copyright © 2018 Elsevier Ltd. All rights reserved.
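The study's core identification strategy — regressing background PM on indicators for plausibly exogenous event periods — can be sketched on synthetic data. Everything below (the data-generating process, the coefficient values, and the seasonal control) is invented for illustration and is not the authors' model or data.

```python
import numpy as np

rng = np.random.default_rng(0)
weeks = 200
event = rng.random(weeks) < 0.1                      # hypothetical sandstorm-event weeks
season = np.sin(2 * np.pi * np.arange(weeks) / 52)   # simple seasonal control

# Synthetic weekly PM: baseline 10, a +3.0 bump in event weeks, seasonality, noise.
pm = 10.0 + 3.0 * event + 2.0 * season + rng.normal(0.0, 1.0, weeks)

# OLS: pm ~ const + event + season; the event coefficient is the estimated
# PM increase attributable to event weeks, conditional on the controls.
X = np.column_stack([np.ones(weeks), event.astype(float), season])
beta, *_ = np.linalg.lstsq(X, pm, rcond=None)
event_effect = beta[1]
```

In the paper's setting the event timing comes from China (sandstorms, Chinese New Year) while the outcome is measured at U.S. high-elevation sites, which is what makes the events plausibly uncorrelated with local confounders.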
Dynamic optimization case studies in DYNOPT tool
NASA Astrophysics Data System (ADS)
Ozana, Stepan; Pies, Martin; Docekal, Tomas
2016-06-01
Dynamic programming is typically applied to optimization problems. As analytical solutions are generally very difficult, software tools are widely used. These software packages are often third-party products bound to standard simulation software tools on the market. As typical examples of such tools, TOMLAB and DYNOPT can be effectively applied to dynamic programming problems. DYNOPT is presented in this paper due to its licensing policy (a free product under the GPL) and simplicity of use. DYNOPT is a set of MATLAB functions for determining an optimal control trajectory given a description of the process, the cost to be minimized, and equality and inequality constraints, using orthogonal collocation on finite elements. The actual optimal control problem is solved by complete parameterization of both the control and the state profile vectors. It is assumed that the optimized dynamic model may be described by a set of ordinary differential equations (ODEs) or differential-algebraic equations (DAEs). This collection of functions extends the capability of the MATLAB Optimization Toolbox. The paper introduces the use of DYNOPT in the field of dynamic optimization by means of case studies on selected laboratory physical educational models.
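The transcription idea behind collocation-based tools like DYNOPT can be shown on a toy problem: both the state and the control are discretized at nodes, the ODE becomes a set of algebraic defect constraints, and a nonlinear programming solver handles the rest. The sketch below uses Python/SciPy rather than MATLAB, trapezoidal rather than orthogonal collocation, and a problem with a known analytic answer; it illustrates the pattern, not DYNOPT itself.

```python
# Direct transcription of a toy optimal-control problem (trapezoidal
# collocation, not DYNOPT's orthogonal collocation):
#   minimize  integral of u(t)^2 dt   subject to  x'(t) = u(t), x(0)=0, x(1)=1.
# The analytic optimum is u(t) = 1 everywhere, with cost 1.
import numpy as np
from scipy.optimize import minimize

N = 20                      # number of intervals
h = 1.0 / N                 # uniform step
# Decision vector z = [x_0..x_N, u_0..u_N]: state and control at the nodes.

def cost(z):
    u = z[N + 1:]
    return h * np.sum((u[:-1] ** 2 + u[1:] ** 2) / 2)   # trapezoid rule

def defects(z):
    x, u = z[:N + 1], z[N + 1:]
    # Collocation defects: x_{k+1} - x_k = h*(u_k + u_{k+1})/2
    dyn = x[1:] - x[:-1] - h * (u[:-1] + u[1:]) / 2
    return np.concatenate([dyn, [x[0] - 0.0, x[-1] - 1.0]])  # + boundary conds

z0 = np.zeros(2 * (N + 1))  # trivial initial guess
res = minimize(cost, z0, method="SLSQP",
               constraints={"type": "eq", "fun": defects})
u_opt = res.x[N + 1:]
print(round(float(res.fun), 3))        # optimal cost, close to 1.0
print(round(float(u_opt[N // 2]), 3))  # control at t=0.5, close to 1.0
```

DYNOPT assembles the same kind of NLP automatically from the user's ODE/DAE model, cost, and constraints.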
The role of optimization in the next generation of computer-based design tools
NASA Technical Reports Server (NTRS)
Rogan, J. Edward
1989-01-01
There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.
Informatics for maize research: What is possible, and what is practical?
USDA-ARS's Scientific Manuscript database
The informatics tools and technologies developed to address problems in fields outside of biology often drive what becomes available to biologists. Within the biological sciences, research groups have made headway implementing tools to solve problems of interest to maize researchers, but we do not ...
ERIC Educational Resources Information Center
Lockwood, Elise
2014-01-01
Formulas, problem types, keywords, and tricky techniques can certainly be valuable tools for successful counters. However, they can easily become substitutes for critical thinking about counting problems and for deep consideration of the set of outcomes. Formulas and techniques should serve as tools for students as they think critically about…
Tool use in neurodegenerative diseases: Planning or technical reasoning?
Baumard, Josselin; Lesourd, Mathieu; Remigereau, Chrystelle; Jarry, Christophe; Etcharry-Bouyx, Frédérique; Chauviré, Valérie; Osiurak, François; Le Gall, Didier
2017-04-29
Recent works showed that tool use can be impaired in stroke patients because of either planning or technical reasoning deficits, but these two hypotheses have not yet been compared in the field of neurodegenerative diseases. The aim of this study was to address the relationships between real tool use, mechanical problem-solving, and planning skills in patients with Alzheimer's disease (AD, n = 32), semantic dementia (SD, n = 16), and corticobasal syndrome (CBS, n = 9). Patients were asked to select and use ten common tools, to solve three mechanical problems, and to complete the Tower of London test. Motor function and episodic memory were controlled using the Purdue Pegboard Test and the BEC96 questionnaire, respectively. A data-transformation method was applied to avoid ceiling effects, and single-case analysis was performed based on raw scores and completion time. All groups demonstrated either impaired or slowed tool use. Planning deficits were found only in the AD group. Mechanical problem-solving deficits were observed only in the AD and CBS groups. Performance in the Tower of London test was the best predictor of tool use skills in the AD group, suggesting these patients had general rather than mechanical problem-solving deficits. Episodic memory seemed to play little role in performance. Motor dysfunction tended to be associated with tool use skills in CBS patients, while tool use disorders are interpreted as a consequence of the semantic loss in SD in line with previous works. These findings may encourage caregivers to set up disease-centred interventions. © 2017 The British Psychological Society.
Use of multicriteria decision analysis to address conservation conflicts.
Davies, A L; Bryce, R; Redpath, S M
2013-10-01
Conservation conflicts are increasing on a global scale and instruments for reconciling competing interests are urgently needed. Multicriteria decision analysis (MCDA) is a structured, decision-support process that can facilitate dialogue between groups with differing interests and incorporate human and environmental dimensions of conflict. MCDA is a structured and transparent method of breaking down complex problems and incorporating multiple objectives. The value of this process for addressing major challenges in conservation conflict management is that MCDA helps in setting realistic goals; entails a transparent decision-making process; and addresses mistrust, differing world views, cross-scale issues, patchy or contested information, and inflexible legislative tools. Overall we believe MCDA provides a valuable decision-support tool, particularly for increasing awareness of the effects of particular values and choices for working toward negotiated compromise, although an awareness of the effect of methodological choices and the limitations of the method is vital before applying it in conflict situations. © 2013 Society for Conservation Biology.
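At its simplest, the MCDA step of breaking a complex problem into weighted criteria reduces to a transparent aggregation. The sketch below shows weighted-sum scoring only, one of the simplest MCDA aggregation schemes; the criteria, weights, and options are made up, and a real MCDA process involves structured stakeholder elicitation, not just this arithmetic.

```python
# Minimal weighted-sum scoring, one of the simplest MCDA aggregation schemes.
# Criteria, weights, and option scores are illustrative only.
criteria = ["biodiversity", "livelihoods", "cost"]
weights = {"biodiversity": 0.5, "livelihoods": 0.3, "cost": 0.2}
# Normalized scores (0-1, higher is better) for each management option:
options = {
    "cull":      {"biodiversity": 0.8, "livelihoods": 0.9, "cost": 0.4},
    "relocate":  {"biodiversity": 0.7, "livelihoods": 0.6, "cost": 0.3},
    "no_action": {"biodiversity": 0.2, "livelihoods": 0.3, "cost": 1.0},
}

def score(opt):
    # Weighted sum over all criteria for one option.
    return sum(weights[c] * options[opt][c] for c in criteria)

best = max(options, key=score)
print(best, round(score(best), 2))   # cull 0.75
```

The transparency the authors emphasize comes precisely from making the weights explicit: stakeholders can see how changing a weight changes the ranking.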
NASA Astrophysics Data System (ADS)
Skouloudis, Antonis; Evangelinos, Konstantinos; Kourmousis, Fotis
2009-08-01
The purpose of this article is twofold. First, evaluation scoring systems for triple bottom line (TBL) reports to date are examined and potential methodological weaknesses and problems are highlighted. In this context, a new assessment methodology is presented based explicitly on the most widely acknowledged standard on non-financial reporting worldwide, the Global Reporting Initiative (GRI) guidelines. The set of GRI topics and performance indicators was converted into scoring criteria while the generic scoring device was set from 0 to 4 points. Second, the proposed benchmark tool was applied to the TBL reports published by Greek companies. Results reveal major gaps in reporting practices, stressing the need for the further development of internal systems and processes in order to collect essential non-financial performance data. A critical overview of the structure and rationale of the evaluation tool in conjunction with the Greek case study is discussed while recommendations for future research on the field of this relatively new form of reporting are suggested.
DART - LTQ ORBITRAP as an expedient tool for the identification of synthetic cannabinoids.
Habala, Ladislav; Valentová, Jindra; Pechová, Iveta; Fuknová, Mária; Devínsky, Ferdinand
2016-05-01
Synthetic cannabinoids as designer drugs constitute a major problem due to their rapid increase in number and the difficulties connected with their identification in complex mixtures. DART (Direct Analysis in Real Time) has emerged as an advantageous tool for the direct and rapid analysis of complex samples by mass spectrometry. Here we report on the identification of six synthetic cannabinoids originating from seized material in various matrices, employing the combination of ambient pressure ion source DART and hybrid ion trap - LTQ ORBITRAP mass spectrometer. This report also describes the sampling techniques for the provided herbal material containing the cannabinoids, either directly as plant parts or as an extract in methanol and their influence on the outcome of the analysis. The high resolution mass spectra supplied by the LTQ ORBITRAP instrument allowed for an unambiguous assignment of target compounds. The utilized instrumental coupling proved to be a convenient way for the identification of synthetic cannabinoids in real-world samples. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Determining Chemotherapy Tolerance in Older Patients With Cancer
Kim, Jerome; Hurria, Arti
2014-01-01
Older adults with cancer constitute a heterogeneous group of patients who pose unique challenges for oncology care. One major concern is how to identify patients who are at a higher risk for chemotherapy intolerance, because a standard oncology workup may not always be able to distinguish an older individual’s level of risk for treatment-related complications. Geriatric oncologists incorporate tools used in the field of geriatrics, and have developed the Comprehensive Geriatric Assessment to enhance the standard oncology workup. This assessment pinpoints problems with daily activities, comorbidities, medications, nutritional status, cognitive function, psychological state, and social support systems, all of which are risk factors for treatment vulnerability in older adults with cancer. Additional tools that also serve to predict chemotherapy toxicity in older patients with cancer are now available to identify patients at higher risk for morbidity and mortality. Together, these instruments complement the standard oncology workup by providing a global assessment, thereby guiding therapeutic interventions that may improve a patient’s quality of life and clinical outcomes. PMID:24335684
Browns Ferry turbine team - it's all in the planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-05-01
This article illustrates a good example of how a project was creatively completed ahead of schedule and under budget. When the Browns Ferry turbine maintenance team took on the task of servicing unit 2's turbine, the work was like building a ship in a bottle. "We had no room in which to work and only standard tools for the maintenance," said Jim Roche, turbine manager. The big problem on the turbine floor is that there is only a single overhead crane for lifting the turbine components. All the major turbine components and support equipment are on the same floor. Each one requires a crane, and there is only one crane. There is limited laydown space. To do the maintenance properly, the team had to have a maintenance schedule it felt comfortable with, industry experience, tools yet to be invented, and money. The design method for this schedule is presented.
Group Mirrors to Support Interaction Regulation in Collaborative Problem Solving
ERIC Educational Resources Information Center
Jermann, Patrick; Dillenbourg, Pierre
2008-01-01
Two experimental studies test the effect of group mirrors upon quantitative and qualitative aspects of participation in collaborative problem solving. Mirroring tools consist of a graphical representation of the group's actions which is dynamically updated and displayed to the collaborators. In addition, metacognitive tools display a standard for…
Replication of Psycholinguistic Experiments and the Resolution of Inconsistencies
ERIC Educational Resources Information Center
Rákosi, Csilla
2017-01-01
Non-exact replications are regarded as effective tools of problem solving in psycholinguistic research because they lead to more plausible experimental results; however, they are also ineffective tools of problem solving because they trigger cumulative contradictions among different replications of an experiment. This paper intends to resolve this…
Private Education as a Policy Tool in Turkey
ERIC Educational Resources Information Center
Cinoglu, Mustafa
2006-01-01
This paper discusses privatization as policy tool to solve educational problems in Turkey. Turkey, as a developing country, is faced with many problems in education. Large class size, low enrollment rate, girl's education, high illiteracy rate, religious education, textbooks, curriculum and multicultural education are some of the important…
Preliminary Development of an Object-Oriented Optimization Tool
NASA Technical Reports Server (NTRS)
Pak, Chan-gi
2011-01-01
The National Aeronautics and Space Administration Dryden Flight Research Center has developed a FORTRAN-based object-oriented optimization (O3) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. The object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the central executive module and the discipline modules, or both. Six sample optimization problems are presented. The first four sample problems are based on simple mathematical equations; the fifth and sixth problems consider a three-bar truss, which is a classical example in structural synthesis. Instructions for preparing input data for the O3 tool are presented.
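The discipline-level solves the O3 framework coordinates are, at bottom, constrained minimizations. The sketch below shows that pattern on a truss-flavored sizing problem in Python/SciPy (the O3 tool itself is FORTRAN); the member forces, allowable stress, and bar lengths are assumed constants for illustration, not the tool's actual three-bar truss formulation.

```python
# Minimal sizing sketch in the spirit of the three-bar truss samples.
# Member forces F are ASSUMED constants (a statically determinate stand-in),
# not the O3 tool's own analysis; the point is the constrained-solve pattern.
import numpy as np
from scipy.optimize import minimize

F = np.array([20.0, 15.0, 20.0])   # assumed member forces (bars 1..3)
s_max = 25.0                        # allowable stress (assumed)
lengths = np.array([np.sqrt(2), 1.0, np.sqrt(2)])  # relative bar lengths

def weight(a):                      # total material volume ~ weight
    return float(lengths @ a)

def stress_margin(a):               # require s_max - F_i/a_i >= 0 per bar
    return s_max - F / a

res = minimize(weight, x0=np.ones(3), method="SLSQP",
               bounds=[(1e-6, None)] * 3,
               constraints={"type": "ineq", "fun": stress_margin})
# Every stress constraint is active at the optimum, so a_i = F_i / s_max.
print(np.round(res.x, 3))   # close to [0.8, 0.6, 0.8]
```

In the O3 setting the analysis that produces the member forces would itself live in a discipline module, with the optimizer iterating between modules.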
NASA Astrophysics Data System (ADS)
Murage, Francis Ndwiga
The stated research problem of this study was to examine the relationship between motivational factors and the degree to which higher education faculty integrate CMC tools into their courses. The study population and sample involved higher education faculty teaching in science departments at one public university and three public colleges in the state of West Virginia (N = 153). A Likert-type rating scale survey was used to collect data based on the research questions. Two parts of the survey were adopted from previous studies while the other two were self-constructed. Research questions and hypotheses were analyzed using both descriptive and inferential analyses. The study results established a positive relationship between motivational factors and the degree to which higher education faculty integrate CMC tools in their courses. The results additionally established that faculty are highly motivated to integrate CMC tools by intrinsic factors, moderately motivated by environmental factors, and least motivated by extrinsic factors. The results also established that the most integrated CMC tools were those that support asynchronous methods of communication, while the least integrated were those that support synchronous methods of communication. A major conclusion was that members of higher education faculty are more likely to be motivated to integrate CMC tools into their courses by intrinsic factors rather than extrinsic or environmental factors. It was further concluded that intrinsic factors that supported and enhanced student learning, as well as those that were altruistic in nature, significantly influenced the degree of CMC integration. The study finally concluded that, to a large extent, there is a relationship between motivational factors and the degree to which higher education faculty integrate CMC tools in their courses.
A major implication of this study was that institutions that wish to promote integration of CMC technologies should provide as much evidence as possible that the new mode of teaching will improve learning and meet the teaching needs of individual faculty. Further, institutional leadership should recognize and consider individual differences among faculty, especially acknowledging that the locus of motivation is not the same for everyone and that it changes over time depending on internal and external factors.
Preeti, Bajaj; Ashish, Ahuja; Shriram, Gosavi
2013-12-01
As the science of medicine advances day by day, the need for better pedagogies and learning techniques is imperative. Problem Based Learning (PBL) is an effective way of delivering medical education in a coherent, integrated and focused manner. It has several advantages over conventional, age-old teaching methods. It is based on principles of adult learning theory, including motivating students, encouraging them to set goals, and fostering critical thinking about decision making in day-to-day operations. Above all, it stimulates acceptance of challenges and learning curiosity among students and creates a pragmatic educational program. The aim was to measure the effectiveness of problem-based learning as compared to conventional didactic lecture-based learning. The study was conducted on 72 medical students from Dayanand Medical College & Hospital, Ludhiana. Two modules of problem-based sessions were designed and delivered. Statistical analysis of pre- and post-test scores was performed, and student feedback was collected via a questionnaire in the five-point Likert scale format. Significant improvement in overall performance was observed. Feedback revealed majority agreement that problem-based learning helped create interest (88.8%), improved understanding (86%), and promoted self-directed subject learning (91.6%). Substantial improvement in the post-test scores clearly reveals acceptance of PBL over conventional learning. PBL ensures better practical learning, the ability to create interest, and subject understanding. It is a modern educational strategy and an effective tool to objectively improve knowledge acquisition in medical teaching.
Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool
NASA Astrophysics Data System (ADS)
Torlapati, Jagadish; Prabhakar Clement, T.
2013-01-01
We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
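RT1D itself is VBA inside EXCEL, but the simplest class of benchmark it targets, batch reaction kinetics, can be illustrated in a language-neutral way. The sketch below solves a two-species sequential first-order decay chain numerically and checks it against the analytic Bateman solution; the rate constants are made up, and this is not the RT1D code.

```python
# Batch benchmark in the spirit of RT1D's reaction tests (illustrative Python,
# not the VBA code): sequential first-order decay A -> B -> (sink), checked
# against the analytic Bateman solution. Rate constants are made up.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 0.5, 0.2      # assumed first-order rate constants (1/day)
A0 = 1.0               # initial concentration of species A

def rhs(t, y):
    A, B = y
    return [-k1 * A, k1 * A - k2 * B]   # dA/dt, dB/dt

t_end = 10.0
sol = solve_ivp(rhs, (0.0, t_end), [A0, 0.0], rtol=1e-8, atol=1e-10)
A_num, B_num = sol.y[0, -1], sol.y[1, -1]

# Bateman solution for the two-member chain:
A_exact = A0 * np.exp(-k1 * t_end)
B_exact = A0 * k1 / (k2 - k1) * (np.exp(-k1 * t_end) - np.exp(-k2 * t_end))
print(abs(A_num - A_exact) < 1e-5, abs(B_num - B_exact) < 1e-5)  # True True
```

The column benchmarks in the paper add advection and dispersion terms to the same reaction machinery; the verification-against-analytic-solution workflow is identical.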
The integration of FMEA with other problem solving tools: A review of enhancement opportunities
NASA Astrophysics Data System (ADS)
Ng, W. C.; Teh, S. Y.; Low, H. C.; Teoh, P. C.
2017-09-01
Failure Mode Effect Analysis (FMEA) is one of the most effective and accepted problem solving (PS) tools for most companies in the world. Since FMEA was first introduced in 1949, practitioners have implemented it in various industries for their quality improvement initiatives. However, studies have shown that there are drawbacks that hinder the effectiveness of FMEA for continuous quality improvement from product design to manufacturing. Therefore, FMEA is integrated with other PS tools such as the inventive problem solving methodology (TRIZ), Quality Function Deployment (QFD), Root Cause Analysis (RCA) and the seven basic tools of quality to address the drawbacks. This study begins by identifying the drawbacks in FMEA. A comprehensive literature review on the integration of FMEA with other tools is carried out to categorise the integrations based on the drawbacks identified. The three categories are inefficiency of failure analysis, psychological inertia and neglect of the customers’ perspective. This study concludes by discussing the gaps and opportunities in the integration for future research.
Applications of colored petri net and genetic algorithms to cluster tool scheduling
NASA Astrophysics Data System (ADS)
Liu, Tung-Kuan; Kuo, Chih-Jen; Hsiao, Yung-Chin; Tsai, Jinn-Tsong; Chou, Jyh-Horng
2005-12-01
In this paper, we propose a method that uses Coloured Petri Nets (CPN) and a genetic algorithm (GA) to obtain an optimal deadlock-free schedule and to solve the re-entrant problem for the flexible process of the cluster tool. The process of the cluster tool for producing a wafer can usually be classified into three types: 1) sequential process, 2) parallel process, and 3) sequential-parallel process. But these processes are not economical enough for producing a variety of wafers in small volumes. Therefore, this paper proposes the flexible process, in which the operations of fabricating wafers are randomly arranged to achieve the best utilization of the cluster tool. However, the flexible process may have deadlock and re-entrant problems, which can be detected by CPN. On the other hand, GAs have been applied to find optimal schedules for many types of manufacturing processes. Therefore, we integrate CPN and GAs to obtain an optimal schedule subject to the deadlock and re-entrant constraints for the flexible process of the cluster tool.
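The GA half of this approach, evolving job permutations toward a low-cost schedule, can be sketched on a toy instance. The sketch below omits the CPN deadlock-detection half entirely, uses a single-machine weighted-completion-time objective rather than cluster-tool semantics, and makes up all processing times and weights; it shows only the permutation-GA pattern.

```python
# Toy GA over job permutations, sketching the GA half of the CPN+GA approach
# (CPN deadlock detection is not modelled). Objective: single-machine total
# weighted completion time; processing times and weights are made up.
import itertools
import random

random.seed(0)
proc = [4, 2, 7, 3, 5, 1]          # processing times (assumed)
wgt  = [1, 3, 2, 4, 1, 5]          # job weights (assumed)
JOBS = range(len(proc))

def cost(perm):
    t = total = 0
    for j in perm:
        t += proc[j]
        total += wgt[j] * t         # weighted completion time
    return total

def ox(p1, p2):                     # order crossover (OX)
    a, b = sorted(random.sample(range(len(p1)), 2))
    mid = list(p1[a:b])
    rest = [j for j in p2 if j not in mid]
    return rest[:a] + mid + rest[a:]

pop = [random.sample(JOBS, len(proc)) for _ in range(40)]
for _ in range(80):
    pop.sort(key=cost)
    elite = pop[:10]                # elitism: keep the 10 best schedules
    children = []
    while len(children) < 30:
        c = ox(random.choice(elite), random.choice(elite))
        if random.random() < 0.3:   # swap mutation
            i, j = random.sample(range(len(c)), 2)
            c[i], c[j] = c[j], c[i]
        children.append(c)
    pop = elite + children
best = min(pop, key=cost)

# The instance is small enough to verify against brute force:
opt = min(itertools.permutations(JOBS), key=cost)
print(cost(best) == cost(opt))
```

In the paper's setting, candidate schedules that the CPN marks as deadlocked or violating re-entrancy would be penalized or repaired before the fitness evaluation.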
HPC Profiling with the Sun Studio™ Performance Tools
NASA Astrophysics Data System (ADS)
Itzkowitz, Marty; Maruyama, Yukon
In this paper, we describe how to use the Sun Studio Performance Tools to understand the nature and causes of application performance problems. We first explore CPU and memory performance problems for single-threaded applications, giving some simple examples. Then, we discuss multi-threaded performance issues, such as locking and false-sharing of cache lines, in each case showing how the tools can help. We go on to describe OpenMP applications and the support for them in the performance tools. Then we discuss MPI applications, and the techniques used to profile them. Finally, we present our conclusions.
Peute, Linda W P; de Keizer, Nicolette F; Jaspers, Monique W M
2015-06-01
To compare the performance of the Concurrent (CTA) and Retrospective (RTA) Think Aloud method and to assess their value in a formative usability evaluation of an Intensive Care Registry-physician data query tool designed to support ICU quality improvement processes. Sixteen representative intensive care physicians participated in the usability evaluation study. Subjects were allocated to either the CTA or RTA method by a matched randomized design. Each subject performed six usability-testing tasks of varying complexity in the query tool in a real-working context. Methods were compared with regard to number and type of problems detected. Verbal protocols of CTA and RTA were analyzed in depth to assess differences in verbal output. Standardized measures were applied to assess thoroughness in usability problem detection weighted per problem severity level and method overall effectiveness in detecting usability problems with regard to the time subjects spent per method. The usability evaluation of the data query tool revealed a total of 43 unique usability problems that the intensive care physicians encountered. CTA detected unique usability problems with regard to graphics/symbols, navigation issues, error messages, and the organization of information on the query tool's screens. RTA detected unique issues concerning system match with subjects' language and applied terminology. The in-depth verbal protocol analysis of CTA provided information on intensive care physicians' query design strategies. Overall, CTA performed significantly better than RTA in detecting usability problems. CTA usability problem detection effectiveness was 0.80 vs. 0.62 (p<0.05) respectively, with an average difference of 42% less time spent per subject compared to RTA. In addition, CTA was more thorough in detecting usability problems of a moderate (0.85 vs. 0.7) and severe nature (0.71 vs. 0.57). 
In this study, CTA was more effective in usability-problem detection and provided clarification of intensive care physicians' query design strategies to inform redesign of the query tool. However, CTA did not outperform RTA in every respect: RTA additionally elucidated unique usability problems and new user requirements. Based on the results of this study, we recommend the use of CTA in formative usability evaluation studies of health information technology. However, we recommend further research on the application of RTA in usability studies with regard to user expertise and experience when focusing on user-profile-customized (re)design. Copyright © 2015 Elsevier Inc. All rights reserved.
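The thoroughness metric behind comparisons like "0.85 vs. 0.7" is simply the share of all known problems a method detects. The sketch below uses made-up problem sets; the study's standardized measures additionally weight by severity level and normalize effectiveness by evaluation time, which this minimal version omits.

```python
# Thoroughness of a usability-evaluation method: the share of all known
# problems it detects. Problem IDs here are made up; the study additionally
# weights by severity and normalizes by time spent per method.
cta_found = {"P01", "P03", "P04", "P07", "P08"}
rta_found = {"P02", "P03", "P07"}
all_known = cta_found | rta_found           # union over all methods/subjects

def thoroughness(found, known):
    return len(found & known) / len(known)

print(round(thoroughness(cta_found, all_known), 2))  # 0.83
print(round(thoroughness(rta_found, all_known), 2))  # 0.5
```

Note that each method's thoroughness depends on the pooled set of known problems, so adding a method that finds unique problems (as RTA did here) lowers every other method's score.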
Mavaddat, Nahal; Ross, Sheila; Dobbin, Alastair; Williams, Kate; Graffy, Jonathan; Mant, Jonathan
2017-01-01
Post-stroke psychological problems predict poor recovery, while positive affect enables patients to focus on rehabilitation and may improve functional outcomes. Positive Mental Training (PosMT), a guided self-help audio programme, shows promise as a tool for promoting positivity, optimism and resilience. The aim was to assess the acceptability of training in positivity with PosMT for prevention and management of post-stroke psychological problems and as a help in coping with rehabilitation. The modified PosMT tool consisted of 12 audio tracks, each lasting 18 minutes, with one track listened to every day for a week. Survivors and carers were asked to listen for 4 weeks, but could volunteer to listen for more. Interviews about experiences with the tool took place after 4 and 12 weeks. Participants were 10 stroke survivors and 5 carers from stroke support groups in the UK. Three stroke survivors did not engage with the tool. The remainder reported positive physical and psychological benefits, including improved relaxation, better sleep and reduced anxiety after four weeks. Survivors who completed the programme gained a positive outlook on the future and increased motivation, confidence and ability to cope with rehabilitation. No adverse effects were reported. The PosMT shows potential as a tool for coping with rehabilitation and overcoming post-stroke psychological problems including anxiety and depression.
Varieties of second modernity: the cosmopolitan turn in social and political theory and research.
Beck, Ulrich; Grande, Edgar
2010-09-01
The theme of this special issue is the necessity of a cosmopolitan turn in social and political theory. The question at the heart of this introductory chapter takes the challenge of 'methodological cosmopolitanism', already addressed in a Special Issue on Cosmopolitan Sociology in this journal (Beck and Sznaider 2006), an important step further: How can social and political theory be opened up, theoretically as well as methodologically and normatively, to a historically new, entangled Modernity which threatens its own foundations? How can it account for the fundamental fragility, the mutability of societal dynamics (of unintended side effects, domination and power), shaped by the globalization of capital and risks at the beginning of the twenty-first century? What theoretical and methodological problems arise and how can they be addressed in empirical research? In the following, we will develop this 'cosmopolitan turn' in four steps: firstly, we present the major conceptual tools for a theory of cosmopolitan modernities; secondly, we de-construct Western modernity by using examples taken from research on individualization and risk; thirdly, we address the key problem of methodological cosmopolitanism, namely the problem of defining the appropriate unit of analysis; and finally, we discuss normative questions, perspectives, and dilemmas of a theory of cosmopolitan modernities, in particular problems of political agency and prospects of political realization.
Berberat, Pascal O; de Wit, Niek J; Bockhorn, Maximilian; Lundell, Lars; Drenth, Joost P H
2010-12-01
The aim was to define a new educational strategy for the United European Gastroenterology Federation (UEGF) to be followed and implemented in the near future. UEGF organized a consensus-based strategy meeting on Training Innovations in Gastroenterology and Educational Resources with stakeholders and key decision makers in European gastroenterology. In May 2010, in an 'open-face conference' at Starnberg, Germany, 59 specialists in gastroenterology, hepatology, and related fields from 15 countries and 16 societies participated. Breakout sessions identified the key problem areas and possible solutions, and formulated statements subsequently voted upon in plenum. A majority of the formulated statements (59%) reached strong agreement. The topics on which UEGF should focus its future educational activities include developing ways to advocate multidisciplinarity and integration between levels of care and specialties, ways to improve quality of care, and the development of training tools. The successful outcome of the Training Innovations in Gastroenterology and Educational Resources conference was the production of a strategy layout for new UEGF educational activities. There was agreement that improvement in topics related to multidisciplinarity and professionalism is crucial for further development. An open-face conference, such as that embodied by the Training Innovations in Gastroenterology and Educational Resources meeting, was shown to be an effective tool for identifying the key problem areas in education and formulating new strategies.
NASA Astrophysics Data System (ADS)
Jyothi, P. N.; Susmitha, M.; Sharan, P.
2017-04-01
Cutting fluids are used in machining industries to improve tool life, reduce work piece and thermal deformation, improve surface finish and flush chips away from the cutting zone. Although the application of cutting fluids increases tool life and machining efficiency, it has major problems related to environmental impact and health hazards, along with recycling and disposal. These problems prompted the introduction of mineral, vegetable and animal oils. These oils play an important role in improving various machining properties, including corrosion protection, lubricity, antibacterial protection, emulsibility and chemical stability. Compared to mineral oils, vegetable oils in general possess a high viscosity index, high flash point, high lubricity and low evaporative losses. Vegetable oils can be edible or non-edible, and various researchers have proved that edible vegetable oils such as palm oil, coconut oil, canola oil and soya bean oil can be used effectively as eco-friendly cutting fluids in machining operations. But at present, the increased demands of a growing population worldwide and limited availability restrict the harnessing of edible oils for lubricant formulation. In the present work, non-edible vegetable oils, Neem and Honge, are used as cutting fluids for drilling of mild steel, and their effect on cutting temperature, hardness and surface roughness is investigated. The results obtained are compared with SAE 20W40 (a petroleum-based cutting fluid) and the dry cutting condition.
Brusamolino, Ercole; Maffi, Guido
2004-01-01
This paper critically reviews an experience of health cooperation in a hospital in a rural area of Ivory Coast. This particular situation is analysed within the more general frame of health problems in low-income countries and may suggest priorities for international health cooperation. The analysis of the main causes of avoidable death in poor countries indicates targets and tools of intervention. In this case, the target was the reduction of infant mortality from anaemia of different origins and from HIV-1 mother-to-infant transmission. The major tool for intervention was the partnership between an Italian teaching and research hospital and the African hospital, with a non-governmental organisation as catalyst. This paper analyses the different levels at which cooperation developed in this project, from sheer economic support to the implementation of disease-oriented twinning programs that can improve health care and strengthen research capacity on both sides. In addition, the medical, ethical, and social implications of the ongoing cooperation program are discussed, with particular reference to the problems of preventing mortality from severe anaemia (diet fortification in children and pregnancy, and transfusional guidelines in severe malaria) and of preventing mother-to-child neonatal transmission of HIV-1 infection (counselling and testing pregnant women for HIV-1, administering nevirapine to the mother and the baby, and breast-feeding).
Tucker, Jalie A.; Simpson, Cathy A.
2011-01-01
Recent innovations in alcohol-focused interventions are aimed at closing the gap between population need and the currently uncommon use of alcohol treatment services. Guided by population data showing the heterogeneity of alcohol problems and the occurrence of natural remissions from problem drinking without treatment, alcohol services have begun to expand beyond clinical treatment to offer the untreated majority of individuals with alcohol-related problems accessible, less-intensive services that use the tools of public health practice. These services often are opportunistic, meaning they can be provided in primary-care or other unspecialized health care or community settings. They also can be delivered by nonspecialists, or can be used by people themselves to address problems with alcohol without entering the health care system. This developing spectrum of services includes screening and brief interventions, guided self-change programs, and telehealth options that often are targeted and tailored for high-risk groups (e.g., college drinkers). Other efforts aimed at reducing barriers to care and increasing motivation to seek help have utilized individual, organizational, and public health strategies. Together, these efforts have potential for helping the treatment field reach people who have realized that they have a drinking problem but have not yet experienced the severe negative consequences that may eventually drive them to seek treatment. Although the evidence supporting several innovations in alcohol services is preliminary, some approaches are well established, and collectively they form an emerging continuum of care for alcohol problems aimed at increasing service availability and improving overall impact on population health. PMID:23580021
A Comparative Study of Involvement and Motivation among Casino Gamblers.
Lee, Choong-Ki; Lee, Bongkoo; Bernhard, Bo Jason; Lee, Tae Kyung
2009-09-01
The purpose of this paper is to investigate three different types of gamblers (which we label "non-problem", "some problem", and "probable pathological" gamblers) to determine differences in involvement and motivation, as well as differences in demographic and behavioral variables. The analysis takes advantage of a unique opportunity to sample on-site at a major casino in South Korea, and the resulting purposive sample yielded 180 completed questionnaires in each of the three groups, for a total of 540. Factor analysis, analysis of variance (ANOVA) with Duncan tests, and chi-square tests are employed to analyze the data collected from the survey. Findings from the ANOVA tests indicate that the involvement factors of importance/self-expression, pleasure/interest, and centrality derived from the factor analysis differed significantly among the three types of gamblers. The "probable pathological" and "some problem" gamblers were found to have similar degrees of involvement, and higher degrees of involvement than the non-problem gamblers. The tests also reveal that the motivational factors of escape, socialization, winning, and exploring scenery differed significantly among the three types of gamblers. With regard to motivations to visit the casino, "probable pathological" gamblers were more likely to seek winning, the "some problem" group appeared more likely to seek escape, and the "non-problem" gamblers indicated that their motivations to visit centered on exploring the scenery and culture of the surrounding casino area. These tools for exploring gambling motivation and involvement provide valuable and discerning information about the entire spectrum of gamblers.
Characteristics and contents of dreams.
Schredl, Michael
2010-01-01
Dreams have been studied from different perspectives: psychoanalysis, academic psychology, and neurosciences. After presenting the definition of dreaming and the methodological tools of dream research, the major findings regarding the phenomenology of dreaming and the factors influencing dream content are briefly reviewed. The so-called continuity hypothesis stating that dreams reflect waking-life experiences is supported by studies investigating the dreams of psychiatric patients and patients with sleep disorders, i.e., their daytime symptoms and problems are reflected in their dreams. Dreams also have an effect on subsequent waking life, e.g., on daytime mood and creativity. The question about the functions of dreaming is still unanswered and open to future research. Copyright © 2010 Elsevier Inc. All rights reserved.
The significance of levels of organization for scientific research: A heuristic approach.
Brooks, Daniel S; Eronen, Markus I
2018-04-10
The concept of 'levels of organization' has come under fire recently as being useless for scientific and philosophical purposes. In this paper, we show that 'levels' is actually a remarkably resilient and constructive conceptual tool that can be, and in fact is, used for a variety of purposes. To this effect, we articulate an account of the importance of the levels concept seen in light of its status as a major organizing concept of biology. We argue that the usefulness of 'levels' is best seen in the heuristic contributions the concept makes to treating and structuring scientific problems. We illustrate this with two examples from biological research. Copyright © 2018. Published by Elsevier Ltd.
The Intersection of Financial Exploitation and Financial Capacity
Lichtenberg, P.A.
2016-01-01
Research in the past decade has documented that financial exploitation of older adults has become a major problem, and psychology has only recently increased its presence in efforts to reduce exploitation. During the same period, psychology has been a leader in setting best practices for the assessment of diminished capacity in older adults, culminating in the 2008 ABA/APA joint handbook for psychologists. Assessment of financial decision-making capacity is often the cornerstone assessment needed in cases of financial exploitation. This paper examines the intersection of financial exploitation and decision-making capacity and introduces a new conceptual model and new tools for both the investigation and prevention of financial exploitation. PMID:27159438
Pereira, Tiago Veiga; Rudnicki, Martina; Pereira, Alexandre Costa; Pombo-de-Oliveira, Maria S; Franco, Rendrik França
2006-01-01
Meta-analysis has become an important statistical tool in genetic association studies, since it may provide more powerful and precise estimates. However, meta-analytic studies are prone to several potential biases, not only because of the preferential publication of "positive" studies but also because of difficulties in obtaining all relevant information during the study selection process. In this letter, we point out major problems in meta-analysis that may lead to biased conclusions, illustrated by an empirical example of two recent meta-analyses on the relation between MTHFR polymorphisms and the risk of acute lymphoblastic leukemia that, despite similar statistical methods and study-selection periods, provided partially conflicting results.
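The pooling step such a meta-analysis performs can be illustrated with a minimal fixed-effect (inverse-variance) sketch. The per-study log odds ratios and standard errors below are hypothetical, and the sketch deliberately does not address the publication and selection biases the letter warns about:

```python
import math

# Fixed-effect (inverse-variance) pooling of per-study log odds ratios.
# All numbers are made up for demonstration; they are not MTHFR data.
def pool_fixed_effect(log_ors, ses):
    """Return the pooled log OR and its standard error."""
    weights = [1.0 / se ** 2 for se in ses]          # inverse-variance weights
    pooled = sum(w * lor for w, lor in zip(weights, log_ors)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

log_ors = [0.18, 0.05, 0.30]   # hypothetical study estimates
ses = [0.10, 0.08, 0.20]       # hypothetical standard errors
pooled, se = pool_fixed_effect(log_ors, ses)
print(round(pooled, 3), round(se, 3))
```

Precise studies (small standard errors) dominate the weighted average, which is exactly why a biased selection of "positive" studies propagates directly into the pooled estimate.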
Metabolomics window into diabetic complications.
Wu, Tao; Qiao, Shuxuan; Shi, Chenze; Wang, Shuya; Ji, Guang
2018-03-01
Diabetes has become a major global health problem. The elucidation of characteristic metabolic alterations during diabetic progression is critical for better understanding its pathogenesis and for identifying potential biomarkers and drug targets. Metabolomics is a promising tool to reveal the metabolic changes and the underlying mechanisms involved in the pathogenesis of diabetic complications. The present review provides an update on the application of metabolomics to diabetic complications, including diabetic coronary artery disease, diabetic nephropathy, diabetic retinopathy, and diabetic neuropathy, and offers notes on the prevention and prediction of diabetic complications. © 2017 The Authors. Journal of Diabetes Investigation published by Asian Association for the Study of Diabetes (AASD) and John Wiley & Sons Australia, Ltd.
Electronic Publishing or Electronic Information Handling?
NASA Astrophysics Data System (ADS)
Heck, A.
The current dramatic evolution in information technology is bringing major modifications to the way scientists communicate. The concept of 'electronic publishing' is too restrictive and has often received different, sometimes conflicting, interpretations. It is thus giving way to the broader notion of 'electronic information handling', encompassing the diverse types of information, the different media, and the various communication methodologies and technologies. New problems and challenges also result from this new information culture, especially on legal, ethical, and educational grounds. The procedures for validating 'published material' and for evaluating scientific activities will have to be adjusted too. 'Fluid' information is becoming a common concept. Electronic publishing cannot be conceived without links to knowledge bases and intelligent information-retrieval tools.
A guided interview process to improve student pharmacists' identification of drug therapy problems.
Rovers, John; Miller, Michael J; Koenigsfeld, Carrie; Haack, Sally; Hegge, Karly; McCleeary, Erin
2011-02-10
To measure agreement between advanced pharmacy practice experience students using a guided interview process and experienced clinical pharmacists using standard practices to identify drug therapy problems. Student pharmacists enrolled in an advanced pharmacy practice experience (APPE) and clinical pharmacists conducted medication therapy management interviews to identify drug therapy problems in elderly patients recruited from the community. Student pharmacists used a guided interview tool, while clinical pharmacists' interviews were conducted using their usual and customary practices. Student pharmacists also were surveyed to determine their perceptions of the interview tool. Fair to moderate agreement was observed on student and clinical pharmacists' identification of 4 of 7 drug therapy problems. Of those, agreement was significantly higher than chance for 3 drug therapy problems (adverse drug reaction, dosage too high, and needs additional drug therapy) and not significant for 1 (unnecessary drug therapy). Students strongly agreed that the interview tool was useful but agreed less strongly on recommending its use in practice. The guided interview process served as a useful teaching aid to assist student pharmacists to identify drug therapy problems.
Improve Problem Solving Skills through Adapting Programming Tools
NASA Technical Reports Server (NTRS)
Shaykhian, Linda H.; Shaykhian, Gholam Ali
2007-01-01
There are numerous ways for engineers and students to become better problem-solvers. The use of command-line and visual programming tools can help to model a problem and formulate a solution through visualization. The analysis of problem attributes and constraints provides insight into the scope and complexity of the problem. The visualization aspect of the problem-solving approach tends to make students and engineers more systematic in their thought process and helps them catch errors before proceeding too far in the wrong direction. The problem-solver identifies and defines the important terms, variables, rules, and procedures required for solving a problem. Every step required to construct the problem solution can be expressed as program commands that produce intermediate output. This paper advocates improving problem-solving skills through the use of a programming tool. MATLAB, created by MathWorks, is an interactive numerical computing environment and programming language. It is a matrix-based system that easily lends itself to matrix manipulation and to plotting of functions and data. MATLAB can be used as an interactive command line or as a sequence of commands saved in a file as a script or as named functions. Prior programming experience is not required to use MATLAB commands. GNU Octave, a free program for numerical computation and part of the GNU Project, is comparable to MATLAB. MATLAB visual and command-line programming are presented here.
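The stepwise, matrix-based workflow the paper advocates for MATLAB/Octave can be mirrored in Python with NumPy; a minimal sketch, where the 2x2 linear system is purely illustrative and not taken from the paper:

```python
import numpy as np

# Decompose the problem: define the variables, solve step by step, and
# inspect intermediate output at each stage, as the paper recommends.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])            # coefficient matrix
b = np.array([5.0, 10.0])             # right-hand side

x = np.linalg.solve(A, b)             # intermediate result: solution vector
residual = A @ x - b                  # check the solution before moving on

print(x)         # solution of the 2x2 system
print(residual)  # should be ~[0, 0]
```

Printing each intermediate quantity (here `x` and `residual`) is the practice the abstract describes: every step of the solution produces output that can be checked before proceeding.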
Proposal for Land Consolidation Project Solutions for Selected Problem Areas
NASA Astrophysics Data System (ADS)
Wojcik-Len, Justyna; Strek, Zanna
2017-12-01
Among the economic tools for supporting agricultural policy are the activities implemented under the Rural Development Program (RDP). By encouraging agricultural activities and creating equal opportunities for the development of farms, among others in areas with unfavourable environmental conditions characterized by low-productivity soils exposed to degradation, decision makers can contribute to improving the spatial structure of rural areas. In Poland, one of the major concerns is agricultural problem areas (regions). In view of this situation, the aim of this article was to characterize the problem areas in question and to propose land consolidation project solutions for selected fragments of those areas. This paper presents the results of a literature review and an analysis of geodetic and cartographic data regarding the problem areas. The process of land consolidation, one of the technical and legal instruments supporting the development of rural areas, is characterized. The study allowed the authors to establish criteria for selecting agricultural problem areas for land consolidation. To develop a proposal for rational management of the problem areas, key general criteria (location, topography, soil quality and usefulness) and specific criteria were defined and assigned weights. A conception of alternative development of the agricultural problem areas was created as part of a land consolidation project. The results were used to create a methodology for the development of agricultural problem areas to be employed during land consolidation in rural areas. Every agricultural space includes areas with unfavourable environmental and soil conditions determined by natural or anthropogenic factors. Development of agricultural problem areas through land consolidation should take into account the specific functions assigned to these areas in land use plans, as well as comply with legal regulations.
Nepolean, Thirunavukkarsau; Kaul, Jyoti; Mukri, Ganapati; Mittal, Shikha
2018-01-01
Breeding science has contributed immensely to global food security. Several varieties and hybrids of different food crops, including maize, have been released through conventional breeding. The ever-growing population, decreasing agricultural land, lowering water table, changing climate, and other variables pose a tremendous challenge to researchers seeking to improve the production and productivity of food crops. Drought is one of the major obstacles to sustaining and improving the productivity of food crops, including maize, in tropical and subtropical production systems. With the advent of novel genomics and breeding tools, the way breeding is done has changed tremendously in the last two decades. Drought tolerance is a combination of several component traits with a quantitative mode of inheritance. Rapid DNA and RNA sequencing tools, high-throughput SNP genotyping techniques, trait mapping, functional characterization, genomic selection, rapid generation advancement, and other tools are now available to understand the genetics of drought tolerance and to accelerate the breeding cycle. Informatics plays a complementary role by managing the big data generated from large-scale genomics and breeding experiments. Genome editing is the latest technique for altering specific genes to improve trait expression. Integration of novel genomics, next-generation breeding, and informatics tools will accelerate the stress-breeding process and increase the genetic gain under different production systems. PMID:29696027
Filtering NetCDF Files by Using the EverVIEW Slice and Dice Tool
Conzelmann, Craig; Romañach, Stephanie S.
2010-01-01
Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. It was created to provide a common interface between applications and real-time meteorological and other scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need surfaced for additional tools to view and manipulate NetCDF datasets, specifically to filter the files by creating subsets of large NetCDF files. The U.S. Geological Survey (USGS) and the Joint Ecosystem Modeling (JEM) group are working to address these needs with applications like the EverVIEW Slice and Dice Tool, which allows users to filter grid-based NetCDF files, thus targeting those data most important to them. The major functions of this tool are as follows: (1) to create subsets of NetCDF files temporally, spatially, and by data value; (2) to view the NetCDF data in table form; and (3) to export the filtered data to a comma-separated value (CSV) file format. The USGS and JEM will continue to work with scientists and natural resource managers across The Everglades to solve complex restoration problems through technological advances.
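The three major functions of the Slice and Dice Tool (subsetting, table view, CSV export) can be sketched on a plain in-memory array. This is an illustrative NumPy approximation, not the tool's actual implementation; the (time, lat, lon) axis order, grid size, and value threshold are assumptions for the example:

```python
import csv
import numpy as np

# Hypothetical 3D grid (time, lat, lon), standing in for a NetCDF variable.
data = np.arange(24.0).reshape(4, 3, 2)          # 4 time steps on a 3x2 grid

# 1) Subset temporally, spatially, and by data value.
subset = data[1:3, 0:2, :]                       # time steps 1-2, first two rows
masked = np.where(subset > 8.0, subset, np.nan)  # keep only values > 8

# 2) View the filtered data in table form: one row per (time, lat, lon, value).
rows = [(t, y, x, masked[t, y, x])
        for t in range(masked.shape[0])
        for y in range(masked.shape[1])
        for x in range(masked.shape[2])
        if not np.isnan(masked[t, y, x])]

# 3) Export the filtered data to a comma-separated value (CSV) file.
with open("subset.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time", "lat", "lon", "value"])
    writer.writerows(rows)
```

In practice a NetCDF-aware library (e.g. netcdf4-python or xarray) would read the file and carry the coordinate metadata through the subsetting; the array slicing above only mirrors the filtering logic.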
The JPL functional requirements tool
NASA Technical Reports Server (NTRS)
Giffin, Geoff; Skinner, Judith; Stoller, Richard
1987-01-01
Planetary spacecraft are complex vehicles built according to many thousands of requirements. Problems encountered in documenting and maintaining these requirements led to the current attempt to reduce or eliminate them with a computer-automated database, the Functional Requirements Tool. The Tool, developed at JPL and in use on several JPL projects, is described. The organization and functionality of the Tool are presented, together with an explanation of the database inputs, their relationships, and their use. Methods of interfacing with external documents, the representation of tables and figures, and methods of approval and change processing are discussed. The options available for disseminating information from the Tool are identified. The implementation of the Requirements Tool is outlined, and its operation is summarized. The conclusion drawn from this work is that the Requirements Tool is a useful addition to the system engineer's tool kit, is not currently available elsewhere, and has a clear development path for expanding its capabilities to serve larger and more complex projects.
Stämpfli, Dominik; Boeni, Fabienne; Gerber, Andy; Bättig, Victor A D; Hersberger, Kurt E; Lampert, Markus L
2018-06-19
Inappropriate prescribing is linked to increased risks of adverse drug reactions and hospitalisation. Combining explicit and implicit criteria of inappropriate prescribing with the information obtained in patient interviews appears beneficial for the identification of drug-related problems (DRPs) in hospitalised patients. We aimed to investigate the inclusion of pharmacist interviews as part of medication reviews (including the use of explicit and implicit criteria of inappropriate prescribing) to identify DRPs in older inpatients. Clinical medication reviews were performed on geriatric and associated physical and neurological rehabilitation wards in a regional secondary care hospital. Data from electronic medical records, laboratory data, and current treatment regimens were complemented with a novel structured patient interview performed by a clinical pharmacist. The structured interview questioned patients on administration issues, prescribed medication, self-medication, and allergies. The reviews drew on current treatment guidelines, the Medication Appropriateness Index, the Screening Tool of Older People's Prescriptions (STOPP, v2), and the Screening Tool to Alert to Right Treatment (START, v2). The potential relevance of the DRPs was estimated using the German version of the CLEO tool. In 110 patients, 595 DRPs were identified, averaging 5.4 per patient (range 0-17). The structured interviews identified 249 DRPs (41.8%), of which 227 were not identified by any other source of information. The majority of the DRPs identified by patient interview (213/249, i.e. 85.5%) were estimated to be of minor clinical relevance (i.e. limited adherence, knowledge, quality of life, or satisfaction). We demonstrated that structured patient interviews identified additional DRPs that other sources did not. Embedded within a comprehensive approach, the structured patient interviews were needed as a data resource for over one-third of all DRPs.
Clevenbergh, P; Van der Borght, S F M; van Cranenburgh, K; Janssens, V; Kitenge Lubangi, C; Gahimbaza, L; Lange, J M A; Rinke de Wit, T F; Rijckborst, H
2006-01-01
The lack of human resources for health is presently recognized as a major factor limiting the scale-up of antiretroviral treatment (ART) programs in resource-limited settings. The mobilization of public and private partners, the decentralization of care, and the training of non-HIV-specialist nurses and general practitioners could help increase the number of HIV-infected patients receiving ART. In addition to other forms of training, scheduled teleconferences (TCs) have been organized to support a comprehensive HIV treatment program delivered by a private company's health team. To describe the role of the TC as an additional tool in mentoring a company's health care workers (HCWs). For this study, all TC reports were retrospectively reviewed and the questions classified by topic. Participating Heineken physicians evaluated the technical quality and scientific relevance of the TCs through an anonymous survey. From October 2001 to December 2003, 10 HCWs working in 14 operating companies in 5 African countries raised 268 problems during 45 TCs. A total of 79 questions (29%) were asked about antiretroviral (ARV) therapy, 53 (20%) about the diagnosis and treatment of opportunistic infections, 43 (16%) about ARV toxicity, 40 (15%) about care organization and policy, 32 (12%) about laboratory or drug supply, and 21 (8%) about biological parameters. The mean TC attendance rate was 70%. The level of satisfaction among local company physicians was 65% for logistics, 89% for scientific relevance, 84% for applicability of advice, and 85% overall. The most common complaints concerned the poor quality of the telephone connection and language problems for francophone participants. Database-supported teleconferencing could be an additional tool for mentoring company HCWs in their routine care of HIV-infected workers and family members. The role and cost-effectiveness of telemedicine in improving health outcomes should be studied further.
Membrane Packing Problems: A short Review on computational Membrane Modeling Methods and Tools
Sommer, Björn
2013-01-01
The use of model membranes is currently part of the daily workflow of many biochemical and biophysical disciplines. These membranes are used to analyze the behavior of small substances, to simulate transport processes, to study the structure of macromolecules, or for illustrative purposes. But how can these membrane structures be generated? This mini-review discusses a number of ways to obtain them. First, the problem is formulated as the Membrane Packing Problem. It is shown that the theoretical problems of placing proteins and of placing lipids onto a membrane area differ significantly; thus, two sub-problems are defined and discussed. Then, different, partly historical, membrane modeling methods are introduced. Finally, membrane modeling tools are evaluated that can semi-automatically generate these model membranes and thus drastically accelerate and simplify the membrane generation process. The mini-review concludes with advice about which tool is appropriate for which application case. PMID:24688707
Proposal of a micromagnetic standard problem for ferromagnetic resonance simulations
NASA Astrophysics Data System (ADS)
Baker, Alexander; Beg, Marijan; Ashton, Gregory; Albert, Maximilian; Chernyshenko, Dmitri; Wang, Weiwei; Zhang, Shilei; Bisotti, Marc-Antonio; Franchin, Matteo; Hu, Chun Lian; Stamps, Robert; Hesjedal, Thorsten; Fangohr, Hans
2017-01-01
Nowadays, micromagnetic simulations are a common tool for studying a wide range of magnetic phenomena, including ferromagnetic resonance. A technique for evaluating the reliability and validity of different micromagnetic simulation tools is the simulation of proposed standard problems. We propose a new standard problem by providing a detailed specification and analysis of a sufficiently simple problem. By analyzing the magnetization dynamics in a thin permalloy square sample, triggered by a well-defined excitation, we obtain the ferromagnetic resonance spectrum and identify the resonance modes via Fourier transform. Simulations are performed using both finite difference and finite element numerical methods, with the OOMMF and Nmag simulators, respectively. We report the effects of initial conditions and simulation parameters on the character of the observed resonance modes for this standard problem. We provide detailed instructions and code to assist in using the results for the evaluation of new simulator tools, and to help with the numerical calculation of ferromagnetic resonance spectra and modes in general.
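The post-processing step described here, obtaining a resonance spectrum from magnetization dynamics via Fourier transform, can be sketched on synthetic data. The sampling interval, damping time, and resonance frequency below are illustrative choices, not the standard-problem specification:

```python
import numpy as np

# Recover a resonance frequency from a simulated magnetization time
# series, as done when post-processing simulator output (e.g. from
# OOMMF or Nmag). The signal here is a synthetic damped precession.
dt = 5e-12                       # sampling interval (5 ps), assumed
t = np.arange(4096) * dt
f_res = 8.25e9                   # "true" resonance frequency, assumed
my = np.exp(-t / 2e-9) * np.cos(2 * np.pi * f_res * t)

# Power spectrum of the (spatially averaged) magnetization component;
# it peaks at the frequency of the excited resonance mode.
spectrum = np.abs(np.fft.rfft(my)) ** 2
freqs = np.fft.rfftfreq(len(my), d=dt)

peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(f"resonance peak near {peak / 1e9:.2f} GHz")
```

The frequency resolution is 1/(N·dt), so the run length and sampling interval set how finely closely spaced modes can be separated, which is why the standard problem fixes these parameters in its specification.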
NASA Astrophysics Data System (ADS)
Valentine, Andrew; Belski, Iouri; Hamilton, Margaret
2017-11-01
Problem-solving is a key engineering skill, yet is an area in which engineering graduates underperform. This paper investigates the potential of using web-based tools to teach students problem-solving techniques without the need to make use of class time. An idea generation experiment involving 90 students was designed. Students were surveyed about their study habits and reported they use electronic-based materials more than paper-based materials while studying, suggesting students may engage with web-based tools. Students then generated solutions to a problem task using either a paper-based template or an equivalent web interface. Students who used the web-based approach performed as well as students who used the paper-based approach, suggesting the technique can be successfully adopted and taught online. Web-based tools may therefore be adopted as supplementary material in a range of engineering courses as a way to increase students' options for enhancing problem-solving skills.
Modeling of tool path for the CNC sheet cutting machines
NASA Astrophysics Data System (ADS)
Petunin, Aleksandr A.
2015-11-01
In this paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of cutting techniques is offered, and we also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that these optimization tasks can be interpreted as discrete optimization problems (a generalized traveling salesman problem with additional constraints, GTSP). The formalization of some constraints for these tasks is described. To solve the GTSP, we propose using Prof. Chentsov's mathematical model based on the concept of a megalopolis and dynamic programming.
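The routing core of the tool-path problem can be illustrated with a toy greedy heuristic for ordering pierce points. This is a hedged simplification only: the paper formalizes the task as a GTSP with additional constraints and proposes Chentsov's dynamic-programming model, not the nearest-neighbour sketch below, and the coordinates are invented:

```python
import math

# Order pierce points for a cutting head by always moving to the
# nearest unvisited point, minimizing (greedily) the idle travel.
def airtime_tour(start, pierce_points):
    """Visit every pierce point once, nearest-first."""
    tour, pos = [], start
    remaining = list(pierce_points)
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(pos, p))
        tour.append(nxt)
        remaining.remove(nxt)
        pos = nxt
    return tour

points = [(4.0, 0.0), (1.0, 1.0), (0.0, 5.0)]
print(airtime_tour((0.0, 0.0), points))   # nearest-first ordering
```

A real CNC formulation must additionally respect precedence constraints (e.g. inner contours before outer ones) and allows entering each contour at any of several points, which is exactly what makes it a GTSP over groups of candidate points rather than a plain TSP.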
Tool and data interoperability in the SSE system
NASA Technical Reports Server (NTRS)
Shotton, Chuck
1988-01-01
Information is given in viewgraph form on tool and data interoperability in the Software Support Environment (SSE). Information is given on industry problems, SSE system interoperability issues, SSE solutions to tool and data interoperability, and attainment of heterogeneous tool/data interoperability.
Promoting Staff Support in Schools: Solution Circles
ERIC Educational Resources Information Center
Brown, Emma; Henderson, Linda
2012-01-01
The Solution Circle (SC) approach is a flexible tool which encourages participants to maintain a positive, creative approach to problem-solving. This project focussed on the introduction of this approach to staff in a primary and a secondary school. The rationale was to implement a problem-solving/discussion tool that would allow staff to utilise…
Concordancers and Dictionaries as Problem-Solving Tools for ESL Academic Writing
ERIC Educational Resources Information Center
Yoon, Choongil
2016-01-01
The present study investigated how 6 Korean ESL graduate students in Canada used a suite of freely available reference resources, consisting of Web-based corpus tools, Google search engines, and dictionaries, for solving linguistic problems while completing an authentic academic writing assignment in English. Using a mixed methods design, the…
Advanced Tools for Smartphone-Based Experiments: Phyphox
ERIC Educational Resources Information Center
Staacks, S.; Hütz, S.; Stampfer, C.; Heinke, H.
2018-01-01
The sensors in modern smartphones are a promising and cost-effective tool for experimentation in physics education, but many experiments face practical problems. Often the phone is inaccessible during the experiment and the data usually needs to be analyzed subsequently on a computer. We address both problems by introducing a new app, called…
Bolton, Paul; Michalopoulos, Lynn; Ahmed, Ahmed Mohammed Amin; Murray, Laura K; Bass, Judith
2013-01-01
From 1986 to 1989, the Kurdish population of Iraqi Kurdistan was subjected to an intense campaign of military action and genocide by the central Iraqi government. This campaign, referred to as the Anfal, included systematic attacks consisting of aerial bombings, mass deportation, imprisonment, torture, and chemical warfare. It has been estimated that around 200,000 Kurdish people disappeared. This study aimed to gain a better understanding of current priority mental health and psychosocial problems among Kurdish survivors of the Anfal, and to inform the subsequent design of culturally appropriate and relevant assessment instruments and services to address these problems. The study examined 1) the nature and cause of current problems of survivors of torture and/or civilian attacks and their families, 2) what survivors do to address these problems, and 3) what they felt should be done. We used a grounded theory approach. Free list interviews with a convenience sample (n=42) explored the current problems of Kurdish persons affected by torture. Subsequent key informant interviews (n=21) gathered more detailed information on the priority mental health problem areas identified in the free list interviews. Major mental health problem areas emerging from the free list interviews (and explored in the key informant interviews) included 1) problems directly related to the torture, 2) problems related to the current situation, and 3) problems related to the perception and treatment by others in the community. Problems were similar, but not identical, to Western concepts of depression, anxiety, PTSD and related trauma, and traumatic grief. Iraqi Kurdish torture survivors in Iraq have many of the mental health and psychosocial problems found among torture survivors elsewhere. The findings suggest that the problems are a result of the trauma experienced as well as current stressors.
Development of mental health assessment tools and interventions should therefore address both previous trauma and current stressors.
Cochrane Lecture 1997. What evidence do we need for evidence based medicine?
Hart, J T
1997-01-01
As presently understood, evidence based medicine aims to advance practice from its traditional unverifiable mix of art and science to rational use of measurable inputs and outputs. In practice, however, its advocates accept uncritically a desocialised definition of science, assume that major clinical decisions are taken at the level of secondary specialist rather than primary generalist care, and ignore the multiple nature of most clinical problems, as well as the complexity of social problems within which clinical problems arise and have to be solved. These reductionist assumptions derive from the use of evidence based medicine as a tool for managed care in a transactional model for consultations. If these assumptions persist, they will strengthen reification of disease and promote the episodic output of process regardless of health outcome. We need to work within a different paradigm based on development of patients as co-producers rather than consumers, promoting continuing output of health gain through shared decisions using all relevant evidence, within a broader, socialised definition of science. Adoption of this model would require a major social and cultural shift for health professionals. This shift has already begun, promoted by changes in public attitudes to professional authority, changes in the relation of professionals to managers, and pressures for improved effectiveness and efficiency which, contrary to received wisdom, seem more likely to endorse cooperative than transactional clinical production. Progress on these lines is resisted by rapidly growing and extremely powerful economic and political interests. Health professionals and strategists have yet to recognise and admit the existence of this choice. PMID:9519124
Going with the Flow: Quality of Life Outcomes of Cancer Survivors with Urinary Diversion
Gemmill, Robin; Sun, Virginia; Ferrell, Betty; Krouse, Robert S.; Grant, Marcia
2012-01-01
Purpose: The purpose of this descriptive study is to describe health-related quality of life (HRQOL) concerns among cancer patients with continent and incontinent urinary diversions (UD). Subjects and Settings: Study participants were accrued from members of the California United Ostomy Association and two cancer centers in Southern California. Instruments: The City of Hope HRQOL-Ostomy Questionnaire (COHHRQOL-O) is a modified HRQOL measurement tool based on the original work done over a number of years by Grant and colleagues. Methods: The COHHRQOL-O was mailed to 2,890 individuals. Of the 1,600 returns, there were 307 responses from patients with UD, indicating that they had a UD and a diagnosis that clearly indicated cancer. Results: The majority of respondents were diagnosed with bladder cancer, and the average time since surgery was 9.5 years. While most patients reported being sexually active prior to UD, less than 27% resumed sexual activity after surgery. Over 75% of patients also reported difficulty in adjusting to their UD, with the majority reporting difficulty with urine leakage. Those who were incontinent reported a range of bothersome issues, such as skin problems around the UD, difficulties in managing UD care, fear of recurrence, financial worries, family distress, and uncertainty about the future. Conclusions: The results of this study add to our understanding of how patients adjust to a UD and what problems and issues can occur, even years after the initial surgery. Mastering UD care is best done under the guidance of a WOC nurse, and access to a WOC nurse is essential when problems occur. PMID:20075694
Mathematical Problem Solving: A Review of the Literature.
ERIC Educational Resources Information Center
Funkhouser, Charles
The major perspectives on problem solving of the twentieth century are reviewed--associationism, Gestalt psychology, and cognitive science. The results of the review on teaching problem solving and the uses of computers to teach problem solving are included. Four major issues related to the teaching of problem solving are discussed: (1)…
Protein and gene model inference based on statistical modeling in k-partite graphs.
Gerster, Sarah; Qeli, Ermir; Ahrens, Christian H; Bühlmann, Peter
2010-07-06
One of the major goals of proteomics is the comprehensive and accurate description of a proteome. Shotgun proteomics, the method of choice for the analysis of complex protein mixtures, requires that experimentally observed peptides are mapped back to the proteins they were derived from. This process is also known as protein inference. We present Markovian Inference of Proteins and Gene Models (MIPGEM), a statistical model based on clearly stated assumptions to address the problem of protein and gene model inference for shotgun proteomics data. In particular, we are dealing with dependencies among peptides and proteins using a Markovian assumption on k-partite graphs. We are also addressing the problems of shared peptides and ambiguous proteins by scoring the encoding gene models. Empirical results on two control datasets with synthetic mixtures of proteins and on complex protein samples of Saccharomyces cerevisiae, Drosophila melanogaster, and Arabidopsis thaliana suggest that the results with MIPGEM are competitive with existing tools for protein inference.
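The shared-peptide ambiguity that MIPGEM addresses can be illustrated with a deliberately naive scheme (this is not MIPGEM's statistical model; the function, peptide names, and probabilities below are invented for illustration): each peptide's detection probability is split evenly among the candidate proteins it maps to on the bipartite peptide-protein graph.

```python
def naive_protein_scores(peptide_probs, peptide_to_proteins):
    """Toy protein inference on a bipartite peptide-protein graph.

    Each peptide's detection probability is split evenly among the
    proteins it maps to; a protein's score is the sum of its shares.
    This shows why shared peptides make inference ambiguous -- a model
    like MIPGEM replaces the even split with a proper statistical one.
    """
    scores = {}
    for pep, prob in peptide_probs.items():
        parents = peptide_to_proteins[pep]
        share = prob / len(parents)  # even split over ambiguous parents
        for prot in parents:
            scores[prot] = scores.get(prot, 0.0) + share
    return scores

# Two proteins sharing one peptide; P1 also has a unique peptide.
probs = {"pepA": 0.9, "pepB": 0.8}
mapping = {"pepA": ["P1"], "pepB": ["P1", "P2"]}
print(naive_protein_scores(probs, mapping))
# P1 accumulates 0.9 + 0.4 = 1.3; P2 gets only the shared 0.4
```

Even this toy version makes the core difficulty visible: without a model of dependencies among peptides and proteins, shared evidence is double-counted or arbitrarily divided.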
NASA Astrophysics Data System (ADS)
Quest, D.; Gayer, C.; Hering, P.
2012-01-01
Laser osteotomy is one possible method of preparing beds for dental implants in the human jaw. A major problem in using this contactless treatment modality is the lack of haptic feedback to control the depth while drilling the implant bed. A contactless measurement system called laser triangulation is presented as a new procedure to overcome this problem. Together with a tomographic picture, the actual position of the laser ablation in the bone can be calculated. Furthermore, the laser response is sufficiently fast as to pose little risk to surrounding sensitive areas such as nerves and blood vessels. The jaw contains two different bone structures, namely cancellous bone and compact bone. Samples of both bone structures were examined with test drillings performed either by laser osteotomy or by a conventional rotating drilling tool. The depth of these holes was measured using laser triangulation. The results and the setup are reported in this study.
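The triangulation principle can be sketched roughly as follows (the geometry and all numbers are assumptions for the example, not the authors' actual setup): with a camera displaced by a known baseline from the laser axis, similar triangles map the spot's pixel offset to depth, and the drilled depth is the difference between two such measurements.

```python
def triangulation_depth(baseline_m, focal_px, offset_px):
    """Depth of a laser spot by simple triangulation.

    Idealised geometry: the camera is displaced by baseline_m from the
    laser axis and images the spot at offset_px pixels from the
    principal point, with focal length focal_px in pixels. Similar
    triangles give depth = focal * baseline / offset.
    """
    return focal_px * baseline_m / offset_px

# Drilled depth as the difference of two spot measurements
# (baseline 3 cm, focal length 1000 px -- invented values):
z_surface = triangulation_depth(0.03, 1000.0, 300.0)  # spot on bone surface
z_bottom = triangulation_depth(0.03, 1000.0, 290.0)   # spot at ablation front
drilled_mm = (z_bottom - z_surface) * 1000.0
print(round(drilled_mm, 2))  # about 3.45 mm of drilled depth
```

As the hole deepens, the imaged spot moves toward the principal point and the computed depth grows, which is how a depth-control loop without haptic feedback could be closed.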
Spatial and Temporal Flood Risk Assessment for Decision Making Approach
NASA Astrophysics Data System (ADS)
Azizat, Nazirah; Omar, Wan-Mohd-Sabki Wan
2018-03-01
The extent to which heavy rainfall inundates an area depends on the magnitude of the flood. Significantly, the location of settlements, infrastructure and facilities in floodplains results in many regions facing flooding risks. A problem faced by decision makers in the assessment of flood vulnerability and the evaluation of adaptation measures is recurrent flooding in the same areas. Identification of recurrent flooding areas and of the frequency of floods should be a priority for flood risk management. However, spatial and temporal variability are major factors of uncertainty in flood risk management. Therefore, the dynamic and spatial characteristics of these changes in flood impact assessment are important in making decisions about the future of infrastructure development and community life. System dynamics (SD) simulation and hydrodynamic modelling are presented as tools for modelling the dynamic characteristics of flood risk and its spatial variability. This paper discusses the integration of the spatial and temporal information that is required by the decision maker for the identification of multi-criteria decision problems involving multiple stakeholders.
A study on the operation analysis of the power conditioning system with real HTS SMES coil
NASA Astrophysics Data System (ADS)
Kim, A. R.; Jung, H. Y.; Kim, J. H.; Ali, Mohd. Hasan; Park, M.; Yu, I. K.; Kim, H. J.; Kim, S. H.; Seong, K. C.
2008-09-01
Voltage sag caused by suddenly increasing loads is one of the major problems in the utility network, and power compensation devices have been widely developed to address it. Compensating a voltage sag requires an energy source to supply the energy deficit it creates. Because the SMES device is characterized by very fast charge and discharge response, it has been researched and developed for more than 20 years. However, before an SMES is installed in a utility network, a system analysis has to be carried out with a suitable simulation tool. This paper presents a real-time simulation algorithm for the SMES system using a miniaturized SMES model coil whose properties are the same as those of a real-size SMES coil. With this method, researchers can easily analyse the performance of an SMES connected to the utility network by abstracting the properties from the real modeled SMES coil and using the virtual simulated power network in RSCAD/RTDS.
The feasibility of well-logging measurements of arsenic levels using neutron-activation analysis
Oden, C.P.; Schweitzer, J.S.; McDowell, G.M.
2006-01-01
Arsenic is an extremely toxic metalloid, which poses a significant problem in many mining environments. Arsenic contamination is also a major problem in ground and surface waters. A feasibility study was conducted to determine if neutron-activation analysis is a practical method of measuring in situ arsenic levels. The response of hypothetical well-logging tools to arsenic was simulated using a readily available Monte Carlo simulation code (MCNP). Simulations were made for probes with both hyperpure germanium (HPGe) and bismuth germanate (BGO) detectors using accelerator and isotopic neutron sources. Both sources produce similar results; however, the BGO detector is much more susceptible to spectral interference than the HPGe detector. Spectral interference from copper can preclude low-level arsenic measurements when using the BGO detector. Results show that a borehole probe could be built that would measure arsenic concentrations of 100 ppm by weight to an uncertainty of 50 ppm in about 15 min. © 2006 Elsevier Ltd. All rights reserved.
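The quoted precision-versus-time figure follows the usual Poisson counting statistics, where statistical uncertainty scales as 1/sqrt(t). A back-of-the-envelope sketch (the function is an illustration, not from the paper) using the paper's figure of 50 ppm uncertainty in about 15 min as the reference point:

```python
def time_for_uncertainty(t_ref_min, sigma_ref, sigma_target):
    """Counting time needed for a desired statistical uncertainty.

    For Poisson counting statistics the uncertainty scales as
    1/sqrt(t), so t = t_ref * (sigma_ref / sigma_target)**2.
    """
    return t_ref_min * (sigma_ref / sigma_target) ** 2

# Halving the uncertainty from 50 ppm to 25 ppm quadruples the time:
print(time_for_uncertainty(15.0, 50.0, 25.0))  # 60.0 minutes
```

This scaling is why the 15-minute measurement is a practical compromise: pushing the uncertainty much below 50 ppm quickly becomes prohibitive for routine logging.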
Sustainable biorefining in wastewater by engineered extreme alkaliphile Bacillus marmarensis
Wernick, David G.; Pontrelli, Sammy P.; Pollock, Alexander W.; Liao, James C.
2016-01-01
Contamination susceptibility, water usage, and inability to utilize 5-carbon sugars and disaccharides are among the major obstacles in industrialization of sustainable biorefining. Extremophilic thermophiles and acidophiles are being researched to combat these problems, but organisms which answer all the above problems have yet to emerge. Here, we present engineering of the unexplored, extreme alkaliphile Bacillus marmarensis as a platform for new bioprocesses which meet all these challenges. With a newly developed transformation protocol and genetic tools, along with optimized RBSs and antisense RNA, we engineered B. marmarensis to produce ethanol at titers of 38 g/l and 65% yields from glucose in unsterilized media. Furthermore, ethanol titers and yields of 12 g/l and 50%, respectively, were produced from cellobiose and xylose in unsterilized seawater and algal-contaminated wastewater. As such, B. marmarensis presents a promising approach for the contamination-resistant biorefining of a wide range of carbohydrates in unsterilized, non-potable seawater. PMID:26831574
Echinococcosis: Control and Prevention.
Craig, P S; Hegglin, D; Lightowlers, M W; Torgerson, P R; Wang, Q
2017-01-01
Human cystic echinococcosis (CE) has been eliminated or significantly reduced as a public health problem in several previously highly endemic regions. This has been achieved by the long-term application of prevention and control measures primarily targeted to deworming dogs, health education, meat inspection, and effective surveillance in livestock and human populations. Human CE, however, remains a serious neglected zoonotic disease in many resource-poor pastoral regions. The incidence of human alveolar echinococcosis (AE) has increased in continental Europe and is a major public health problem in parts of Eurasia. Better understanding of wildlife ecology for fox and small mammal hosts has enabled targeted anthelmintic baiting of fox populations and development of spatially explicit models to predict population dynamics for key intermediate host species and human AE risk in endemic landscapes. Challenges that remain for echinococcosis control include effective intervention in resource-poor communities, better availability of surveillance tools, optimal application of livestock vaccination, and management and ecology of dog and wildlife host populations. Copyright © 2017 Elsevier Ltd. All rights reserved.
Using game technologies to improve the safety of construction plant operations.
Guo, Hongling; Li, Heng; Chan, Greg; Skitmore, Martin
2012-09-01
Many accidents occur world-wide in the use of construction plant and equipment, and safety training is considered by many to be one of the best approaches to their prevention. However, current safety training methods/tools are unable to provide trainees with the hands-on practice needed. Game technology-based safety training platforms have the potential to overcome this problem in a virtual environment. One such platform is described in this paper - its characteristics are analysed and its possible contribution to safety training identified. This is developed and tested by means of a case study involving three major pieces of construction plant, which successfully demonstrates that the platform can improve the process and performance of the safety training involved in their operation. This research not only presents a new and useful solution to the safety training of construction operations, but illustrates the potential use of advanced technologies in solving construction industry problems in general. Copyright © 2011 Elsevier Ltd. All rights reserved.
Chinese approaches to understanding and building resilience in at-risk children and adolescents.
Lee, Tak-Yan; Shek, Daniel T L; Kwong, Wai-Man
2007-04-01
This article discusses the prevailing Chinese belief systems that have a bearing on the perceptions and practices of promoting resilience among children and youth in a major city in China. It briefly describes a large-scale social intervention program entitled "Understanding the Adolescent Project", run from 2001 to 2004 in the Hong Kong Special Administrative Region to combat problems among grade 7 students identified as adolescents at risk. A critical review of the problems encountered by social workers in the delivery of the program is presented to support the move to provide the preventive program for grade 4 students with clinical symptoms on a screening tool for identification of at-risk status. Starting in 2005, a large-scale positive youth development program was developed for all secondary one to three (grades 7 to 9) students. Encouraging results of the evaluation studies demonstrated the effectiveness of this new preventive program, entitled Positive Adolescent Training through Holistic Social Programs.
Preferred conservation policies of shark researchers.
Shiffman, David S; Hammerschlag, Neil
2016-08-01
There is increasing concern about the conservation status of sharks. However, the presence of numerous different (and potentially mutually exclusive) policies complicates management implementation and public understanding of the process. We distributed an online survey to members of the largest professional shark and ray research societies to assess member knowledge of and attitudes toward different conservation policies. Questions covered society member opinions on conservation and management policies, personal histories of involvement in advocacy and management, and perceptions of the approach of conservation nongovernmental organizations (NGOs) to shark conservation. One hundred and two surveys were completed (overall response rate 21%). Respondents considered themselves knowledgeable about and actively involved in conservation and management policy; a majority believed scientists have a responsibility to advocate for conservation (75%), and majorities have sent formal public comments to policymakers (54%) and included policy suggestions in their papers (53%). They believe sustainable shark fisheries are possible, are currently happening today (in a few places), and should be the goal instead of banning fisheries. Respondents were generally less supportive of newer limit-based (i.e., policies that ban exploitation entirely without a species-specific focus) conservation policy tools, such as shark sanctuaries and bans on the sale of shark fins, than of target-based fisheries management tools (i.e., policies that allow for sustainable harvest of species whose populations can withstand it), such as fishing quotas. Respondents were generally supportive of environmental NGO efforts to conserve sharks but raised concerns about some NGOs that they perceived as using incorrect information and focusing on the wrong problems. 
Our results show there is an ongoing debate in shark conservation and management circles relative to environmental policy on target-based natural resources management tools versus limit-based conservation tools. They also suggest that closer communication between the scientific and environmental NGO communities may be needed to recognize and reconcile differing values and objectives between these groups. © 2016 Society for Conservation Biology.
Using Structured e-Forum to Support the Legislation Formation Process
NASA Astrophysics Data System (ADS)
Xenakis, Alexandros; Loukis, Euripides
Many public policy problems are 'wicked', being characterised by high complexity, many heterogeneous views and conflicts among various stakeholders, and a lack of mathematically 'optimal' solutions and predefined algorithms for calculating them. The best approach for addressing such problems is through consultation and argumentation among stakeholders. E-participation research has investigated and suggested several ICT tools for this purpose, such as e-forum, e-petition and e-community tools. This paper investigates the use of an advanced ICT tool, the structured e-forum, for addressing such wicked problems associated with legislation formation. For this purpose we designed, implemented and evaluated two pilot e-consultations on legislation under formation in the Parliaments of Austria and Greece, using a structured e-forum tool based on the Issue Based Information Systems (IBIS) framework. The conclusions drawn reveal the advantages offered by the structured e-forum, but also its difficulties.
Water flow algorithm decision support tool for travelling salesman problem
NASA Astrophysics Data System (ADS)
Kamarudin, Anis Aklima; Othman, Zulaiha Ali; Sarim, Hafiz Mohd
2016-08-01
This paper discusses the role of a Decision Support Tool (DST) for the Travelling Salesman Problem (TSP) in helping researchers working in the same area obtain better results from a proposed algorithm. A study has been conducted using the Rapid Application Development (RAD) model as the methodology, which includes requirement planning, user design, construction and cutover. A Water Flow Algorithm (WFA) with an improved initialization technique is used as the proposed algorithm in this study and evaluated for effectiveness on TSP cases. The DST evaluation comprised usability testing covering system use, quality of information, quality of interface and overall satisfaction. This evaluation is needed to determine whether the tool can assist users in deciding to solve TSP problems with the proposed algorithm. Statistical results show the tool's ability to help researchers conduct experiments on the WFA with improved TSP initialization.
Family factors in adolescent problematic Internet gaming: A systematic review.
Schneider, Luke A; King, Daniel L; Delfabbro, Paul H
2017-09-01
Background and aims Familial influences are known to affect the likelihood of an adolescent becoming a problem gamer. This systematic review examined some of the key findings in empirical research on family factors related to adolescent problem gaming. Methods A total of 14 studies in the past decade were evaluated. Family-related variables included: (a) parent status (e.g., socioeconomic status and mental health), (b) parent-child relationship (e.g., warmth, conflict, and abuse), (c) parental influence on gaming (e.g., supervision of gaming, modeling, and attitudes toward gaming), and (d) family environment (e.g., household composition). Results The majority of studies have focused on parent-child relationships, reporting that poorer quality relationships are associated with increased severity of problem gaming. The paternal relationship may be protective against problem gaming; therefore, prevention programs should leverage the support of cooperative fathers. Discussion The intergenerational effects of problem gaming require further attention, in light of adult gamers raising their children in a gaming-centric environment. Research has been limited by a reliance on adolescent self-report to understand family dynamics, without gathering corroborating information from parents and other family members. The very high rates of problem gaming (>10%) reported in general population samples raise concerns about the validity of current screening tools. Conclusions Interventions for adolescents may be more effective in some cases if they can address familial influences on problem gaming with the active co-participation of parents, rather than enrolling vulnerable adolescents in individual-based training or temporarily isolating adolescents from the family system.
Grossi, Enzo
2006-01-01
Background: In recent years a number of algorithms for cardiovascular risk assessment have been proposed to the medical community. These algorithms consider a number of variables and express their results as the percentage risk of developing a major fatal or non-fatal cardiovascular event in the following 10 to 20 years. Discussion: The author has identified three major pitfalls of these algorithms, linked to the limitations of the classical statistical approach in dealing with this kind of nonlinear and complex information. The pitfalls are the inability to capture the disease complexity, the inability to capture process dynamics, and the wide confidence interval of individual risk assessment. Artificial intelligence tools can provide a potential advantage in trying to overcome these limitations. The theoretical background and some application examples related to artificial neural networks and fuzzy logic are reviewed and discussed. Summary: The use of predictive algorithms to assess individual absolute risk of future cardiovascular events is currently hampered by methodological and mathematical flaws. Newer approaches linked to artificial intelligence, such as fuzzy logic and artificial neural networks, seem better suited to address the increasing complexity arising from the correlations between predisposing factors, data on the occurrence of cardiovascular events, and the prediction of future events at an individual level. PMID:16672045
Jeffries, Thomas C.; Rayu, Smriti; Nielsen, Uffe N.; Lai, Kaitao; Ijaz, Ali; Nazaries, Loic; Singh, Brajesh K.
2018-01-01
Chemical contamination of natural and agricultural habitats is an increasing global problem and a major threat to sustainability and human health. Organophosphorus (OP) compounds are one major class of contaminant and can undergo microbial degradation, however, no studies have applied system-wide ecogenomic tools to investigate OP degradation or use metagenomics to understand the underlying mechanisms of biodegradation in situ and predict degradation potential. Thus, there is a lack of knowledge regarding the functional genes and genomic potential underpinning degradation and community responses to contamination. Here we address this knowledge gap by performing shotgun sequencing of community DNA from agricultural soils with a history of pesticide usage and profiling shifts in functional genes and microbial taxa abundance. Our results showed two distinct groups of soils defined by differing functional and taxonomic profiles. Degradation assays suggested that these groups corresponded to the organophosphorus degradation potential of soils, with the fastest degrading community being defined by increases in transport and nutrient cycling pathways and enzymes potentially involved in phosphorus metabolism. This was against a backdrop of taxonomic community shifts potentially related to contamination adaptation and reflecting the legacy of exposure. Overall our results highlight the value of using holistic system-wide metagenomic approaches as a tool to predict microbial degradation in the context of the ecology of contaminated habitats. PMID:29515526
Approaches for Subgrid Parameterization: Does Scaling Help?
NASA Astrophysics Data System (ADS)
Yano, Jun-Ichi
2016-04-01
Arguably the scaling behavior is a well-established fact in many geophysical systems. There are already many theoretical studies elucidating this issue. However, the scaling law is slow to be introduced in "operational" geophysical modelling, notably for weather forecast as well as climate projection models. The main purpose of this presentation is to ask why, and try to answer this question. As a reference point, the presentation reviews the three major approaches for traditional subgrid parameterization: moment, PDF (probability density function), and mode decomposition. The moment expansion is a standard method for describing the subgrid-scale turbulent flows both in the atmosphere and the oceans. The PDF approach is intuitively appealing as it directly deals with a distribution of variables in subgrid scale in a more direct manner. The third category, originally proposed by Aubry et al (1988) in context of the wall boundary-layer turbulence, is specifically designed to represent coherencies in compact manner by a low--dimensional dynamical system. Their original proposal adopts the proper orthogonal decomposition (POD, or empirical orthogonal functions, EOF) as their mode-decomposition basis. However, the methodology can easily be generalized into any decomposition basis. The mass-flux formulation that is currently adopted in majority of atmospheric models for parameterizing convection can also be considered a special case of the mode decomposition, adopting the segmentally-constant modes for the expansion basis. The mode decomposition can, furthermore, be re-interpreted as a type of Galarkin approach for numerically modelling the subgrid-scale processes. Simple extrapolation of this re-interpretation further suggests us that the subgrid parameterization problem may be re-interpreted as a type of mesh-refinement problem in numerical modelling. We furthermore see a link between the subgrid parameterization and downscaling problems along this line. 
The mode decomposition approach would also be the best framework for linking the traditional parameterizations with the scaling perspectives. However, by seeing the link more clearly, we also see the strengths and weaknesses of introducing the scaling perspectives into parameterizations. Any diagnosis under a mode decomposition would immediately reveal the power-law nature of the spectrum; exploiting this knowledge in an operational parameterization, however, is a different story. It is telling that POD studies have focused on representing the largest-scale coherency within a grid box under a high truncation, and this problem is already hard enough. Looked at differently, the scaling law is a very concise means of characterizing the subgrid-scale variability of many systems. We may even argue that the scaling law can provide almost complete subgrid-scale information for constructing a parameterization, but with a major missing link: its amplitude must be specified by an additional condition. This condition is called "closure" in the parameterization problem, and it is known to be a tough one. We should also realize that studies of scaling behavior tend to be statistical, in the sense that they hardly provide complete information for constructing a parameterization: can the coefficients of all the decomposition modes be specified perfectly by a scaling law once the first few leading modes are given? Arguably, the renormalization group (RNG) is a very powerful tool for reducing a system with scaling behavior to a low dimension, say, under an appropriate mode decomposition procedure. However, RNG is an analytical tool, and it is extremely hard to apply to real, complex geophysical systems. It appears we still have a long way to go before we can begin to exploit the scaling law to construct operational subgrid parameterizations in an effective manner.
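As a concrete illustration of the POD/EOF mode decomposition discussed above, the leading modes of a snapshot matrix can be obtained from its SVD and truncated to a low-dimensional basis. The synthetic field, sizes, and variance threshold below are illustrative assumptions, not data from the presentation.

```python
import numpy as np

# Proper orthogonal decomposition (POD/EOF) of a snapshot matrix via
# the SVD. Rows are spatial points, columns are time snapshots; the
# leading left-singular vectors are the POD modes.
rng = np.random.default_rng(0)

n_space, n_time = 200, 50
x = np.linspace(0, 2 * np.pi, n_space)

# Synthetic field: two coherent modes plus small-scale noise.
snapshots = (np.outer(np.sin(x), rng.normal(size=n_time))
             + 0.3 * np.outer(np.sin(3 * x), rng.normal(size=n_time))
             + 0.01 * rng.normal(size=(n_space, n_time)))

# Remove the temporal mean before decomposing.
mean = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)

# Energy captured by the leading k modes (a low-dimensional truncation).
energy = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(energy, 0.99)) + 1
print(f"modes needed for 99% of variance: {k}")
```

For a field dominated by a few coherent structures, a handful of modes capture nearly all of the variance, which is exactly the premise of the low-dimensional dynamical systems described above.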
Assing Hvidt, Elisabeth; Hansen, Dorte Gilså; Ammentorp, Jette; Bjerrum, Lars; Cold, Søren; Gulbrandsen, Pål; Olesen, Frede; Pedersen, Susanne S; Søndergaard, Jens; Timmermann, Connie; Timm, Helle; Hvidt, Niels Christian
2017-12-01
General practice recognizes the existential dimension as an integral part of multidimensional patient care, alongside the physical, psychological and social dimensions. However, general practitioners (GPs) report substantial barriers to communicating with patients about existential concerns. Our aim was to describe the development of the EMAP tool, which facilitates communication about existential problems and resources between GPs and patients with cancer. A mixed-methods design was chosen, comprising a literature search, focus group interviews with GPs and patients (n = 55) and a two-round Delphi procedure initiated by an expert meeting with 14 experts from Denmark and Norway. The development procedure resulted in a semi-structured tool containing suggestions for 10 main questions and 13 sub-questions grouped into four themes covering the existential dimension. The tool utilizes the acronym and mnemonic EMAP (existential communication in general practice), indicating the intention of the tool: to provide a map of possible existential problems and resources that the GP and the patient can discuss to find points of reorientation in the patient's situation. This study resulted in a question tool that can serve as inspiration and help GPs when communicating with cancer patients about existential problems and resources. This tool may qualify GPs' assessment of existential distress, increase the patient's existential well-being and help deepen the GP-patient relationship.
A Successful Senior Seminar: Unsolved Problems in Number Theory
ERIC Educational Resources Information Center
Styer, Robert
2014-01-01
The "Unsolved Problems in Number Theory" book by Richard Guy provides nice problems suitable for a typical math major. We give examples of problems that have worked well in our senior seminar course and some nice results that senior math majors can obtain.
Undavalli, Chaitanya; Das, Piyush; Dutt, Taru; Bhoi, Sanjeev; Kashyap, Rahul
2014-10-01
Traumatic events after a road traffic accident (RTA) can be physical and/or psychological. Posttraumatic stress disorder (PTSD) is one of the major psychological conditions affecting accident victims. Psychological issues may not be addressed immediately in the emergency department (ED). There have been reports of a mismatch in timely referral from the ED to occupational or primary care services for these issues. If left untreated, there may be adverse effects on quality of life (QOL) and work productivity. Hospital expenses, loss of income, and loss of work can create a never-ending cycle of financial difficulty and burden for trauma victims. The aim of our review is to address the magnitude of PTSD in post-RTA hospitalized patients in the Indian subcontinent population. We also attempt to highlight a few management guidelines. A comprehensive search was conducted on major databases with the Medical Subject Headings (MeSH) terms 'PTSD or post-traumatic stress' and 'emergency department' and 'vehicle or road or highway or automobile or car or truck or trauma' and 'India'. Of 120 studies, a total of six met our inclusion criteria and were included in the review. Our interpretation of the problem is that hospital expenditure due to trauma, time away from work during hospitalization, and reduced work performance are three major hits that can push RTA victims into financial crisis. Proposed management guidelines are to establish coordinated triage, implement a screening tool in the ED, and provide psychological counseling.
Accomplishments of the Abrupt Wing Stall (AWS) Program and Future Research Requirements
NASA Technical Reports Server (NTRS)
Hall, Robert M.; Woodson, Shawn H.; Chambers, Joseph R.
2003-01-01
The Abrupt Wing Stall (AWS) Program has addressed the problem of uncommanded lateral motions, such as wing drop and wing rock, at transonic speeds. The genesis of this Program was the experience of the F/A-18E/F Program in the late 1990s, when wing drop was discovered in the heart of the maneuver envelope for the pre-production aircraft. While the F/A-18E/F problem was subsequently corrected by a leading-edge flap scheduling change and the addition of a porous door to the wing fold fairing, the AWS Program was initiated as a national response to the lack of technology readiness available at the time of the F/A-18E/F Development Program. The AWS Program objectives were to define causal factors for the F/A-18E/F experience, to gain insights into the flow physics associated with wing drop, and to develop methods and analytical tools so that future programs could identify this type of problem before going to flight test. The paper reviews, for the major goals of the AWS Program, the status of the technology before the
Electronic aroma detection technology for forensic and law enforcement applications
NASA Astrophysics Data System (ADS)
Barshick, Stacy-Ann; Griest, Wayne H.; Vass, Arpad A.
1997-02-01
A major problem hindering criminal investigations is the lack of appropriate tools for proper crime scene investigations. Often locating important pieces of evidence means relying on the ability of trained detection canines. Development of analytical technology to uncover and analyze evidence, potentially at the scene, could serve to expedite criminal investigations, searches, and court proceedings. To address this problem, a new technology based on gas sensor arrays was investigated for its applicability to forensic and law enforcement problems. The technology employs an array of sensors that respond to volatile chemical components, yielding a characteristic 'fingerprint' pattern representative of the vapor-phase composition of a sample. Sample aromas can be analyzed and identified using artificial neural networks that are trained on known aroma patterns. Several candidate applications based on known technological needs of the forensic and law enforcement communities have been investigated. These applications have included the detection of aromas emanating from cadavers to aid in determining time since death, drug detection for deterring the manufacture, sale, and use of drugs of abuse, and the analysis of fire debris for accelerant identification. The results to date for these applications have been extremely promising and demonstrate the potential applicability of this technology for forensic use.
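The fingerprint-matching idea can be sketched in a few lines. The article trains artificial neural networks on known aroma patterns; the sketch below substitutes the simplest possible pattern matcher (nearest neighbor), and all sensor values and class names are invented for illustration.

```python
import numpy as np

# Toy sketch of aroma-fingerprint classification: each sample is a
# vector of sensor-array responses, matched to the closest known
# aroma pattern. (Nearest neighbor stands in for the article's
# trained neural networks; all numbers here are hypothetical.)
known_patterns = {
    "accelerant": np.array([0.9, 0.1, 0.7, 0.2]),
    "drug": np.array([0.2, 0.8, 0.1, 0.6]),
    "decomposition": np.array([0.5, 0.5, 0.9, 0.9]),
}

def classify(sample):
    """Return the known aroma whose fingerprint is nearest to the sample."""
    return min(known_patterns,
               key=lambda name: np.linalg.norm(known_patterns[name] - sample))

reading = np.array([0.85, 0.15, 0.65, 0.25])  # hypothetical sensor output
print(classify(reading))
```

A real system would normalize for sensor drift and train on many replicates per aroma, but the vapor-phase "fingerprint" reduces to exactly this kind of vector comparison.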
Multidisciplinary full-mouth rehabilitation with soft tissue regeneration in the esthetic zone.
Liebermann, Anja; Frei, Stefan; Pinheiro Dias Engler, Madalena Lucia; Zuhr, Otto; Prandtner, Otto; Edelhoff, Daniel; Saeidi Pour, Reza
2018-01-01
Oral rehabilitation often requires a multidisciplinary approach including restorative dentistry, prosthodontics, and periodontology to fulfill high esthetic and functional demands, frequently combined with changes in the vertical dimension. The presence of gingival recessions can be associated with numerous factors, such as brushing or preparation trauma and persistent inflammation of the gingiva due to inadequate marginal fit of restorations. Because gingival recessions can cause major esthetic and functional problems, obtaining stability of the gingival tissue around prosthetic restorations is of essential concern. Modifications of the occlusal vertical dimension require sufficient experience of the whole dental team. Especially in patients with functional problems and craniomandibular dysfunction, a newly defined occlusal position should be adequately tested and possibly adjusted. This case report presents a complete prosthetic rehabilitation combined with a periodontal surgical approach for a patient with gingival recessions and functional and esthetic problems. The vertical dimension was carefully defined through long-term polymethyl methacrylate provisionals, used as a communication tool among all parties involved. All-ceramic crowns were inserted after periodontal healing as the definitive rehabilitation. Complex rehabilitation in patients with high esthetic demands, including soft tissue corrections, requires a multidisciplinary team approach that consists of the periodontal surgeon, dentist, and dental technician. © 2017 Wiley Periodicals, Inc.
Development of a new semi-analytical model for cross-borehole flow experiments in fractured media
Roubinet, Delphine; Irving, James; Day-Lewis, Frederick D.
2015-01-01
Analysis of borehole flow logs is a valuable technique for identifying the presence of fractures in the subsurface and estimating properties such as fracture connectivity, transmissivity and storativity. However, such estimation requires the development of analytical and/or numerical modeling tools that are well adapted to the complexity of the problem. In this paper, we present a new semi-analytical formulation for cross-borehole flow in fractured media that links transient vertical-flow velocities measured in one or a series of observation wells during hydraulic forcing to the transmissivity and storativity of the fractures intersected by these wells. In comparison with existing models, our approach presents major improvements in terms of computational expense and potential adaptation to a variety of fracture and experimental configurations. After derivation of the formulation, we demonstrate its application in the context of sensitivity analysis for a relatively simple two-fracture synthetic problem, as well as for field-data analysis to investigate fracture connectivity and estimate fracture hydraulic properties. These applications provide important insights regarding (i) the strong sensitivity of fracture property estimates to the overall connectivity of the system; and (ii) the non-uniqueness of the corresponding inverse problem for realistic fracture configurations.
Barcodes for genomes and applications
Zhou, Fengfeng; Olman, Victor; Xu, Ying
2008-01-01
Background: Each genome has a stable distribution of the combined frequency of each k-mer and its reverse complement, measured in sequence fragments as short as 1000 bps across the whole genome, for 1
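The combined k-mer/reverse-complement frequency underlying such a genomic "barcode" can be sketched as follows. The fragment, the value of k, and the canonicalization by lexicographic minimum are illustrative choices, not the paper's exact procedure.

```python
from collections import Counter

# Sketch of the "barcode" idea: for a sequence fragment, count each
# k-mer together with its reverse complement, since both strands
# contribute to the combined frequency.
COMP = str.maketrans("ACGT", "TGCA")

def revcomp(kmer):
    """Reverse complement of a DNA k-mer."""
    return kmer.translate(COMP)[::-1]

def combined_kmer_freq(fragment, k=4):
    """Combined frequency of each k-mer and its reverse complement,
    keyed by the lexicographically smaller of the pair."""
    counts = Counter(fragment[i:i + k] for i in range(len(fragment) - k + 1))
    combined = {}
    for kmer, n in counts.items():
        canonical = min(kmer, revcomp(kmer))  # fold the two strands together
        combined[canonical] = combined.get(canonical, 0) + n
    total = sum(combined.values())
    return {kmer: n / total for kmer, n in combined.items()}

freqs = combined_kmer_freq("ACGTACGTACGTACGT", k=4)
print(freqs)
```

Computing this vector in successive fragments and stacking the results per fragment gives the stable, genome-specific distribution that the barcode exploits.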
PC Software for Artificial Intelligence Applications.
Epp, H; Kalin, M; Miller, D
1988-05-06
Our review has emphasized that AI tools are programming languages inspired by some problem-solving paradigm. We want to underscore their status as programming languages; even if an AI tool seems to fit a problem perfectly, its proficient use still requires the training and practice associated with any programming language. The programming manuals for PC-Plus, Smalltalk/V, and Nexpert Object are all tutorial in nature, and the corresponding software packages come with sample applications. We find the manuals to be uniformly good introductions that try to anticipate the problems of a user who is new to the technology. All three vendors offer free technical support by telephone to licensed users. AI tools are sometimes oversold as a way to make programming easy or to avoid it altogether. The truth is that AI tools demand programming, but programming that allows you to concentrate on the essentials of the problem. If we had to implement a diagnostic system, we would look first to a product such as PC-Plus rather than BASIC or C, because PC-Plus is designed specifically for such a problem, whereas these conventional languages are not. If we had to implement a system that required graphical interfaces and could benefit from inheritance, we would look first to an object-oriented system such as Smalltalk/V that provides built-in mechanisms for both. If we had to implement an expert system that called for some mix of AI and conventional techniques, we would look first to a product such as Nexpert Object that integrates various problem-solving technologies. Finally, we might use FORTRAN if we were concerned primarily with programming a well-defined numerical algorithm. AI tools are a valuable complement to traditional languages.
NASA Astrophysics Data System (ADS)
Vermeire, B. C.; Witherden, F. D.; Vincent, P. E.
2017-04-01
First- and second-order accurate numerical methods, implemented for CPUs, underpin the majority of industrial CFD solvers. Whilst this technology has proven very successful at solving steady-state problems via a Reynolds Averaged Navier-Stokes approach, its utility for undertaking scale-resolving simulations of unsteady flows is less clear. High-order methods for unstructured grids and GPU accelerators have been proposed as an enabling technology for unsteady scale-resolving simulations of flow over complex geometries. In this study we systematically compare accuracy and cost of the high-order Flux Reconstruction solver PyFR running on GPUs and the industry-standard solver STAR-CCM+ running on CPUs when applied to a range of unsteady flow problems. Specifically, we perform comparisons of accuracy and cost for isentropic vortex advection (EV), decay of the Taylor-Green vortex (TGV), turbulent flow over a circular cylinder, and turbulent flow over an SD7003 aerofoil. We consider two configurations of STAR-CCM+: a second-order configuration, and a third-order configuration, where the latter was recommended by CD-adapco for more effective computation of unsteady flow problems. Results from both PyFR and STAR-CCM+ demonstrate that third-order schemes can be more accurate than second-order schemes for a given cost; for example, going from second- to third-order, the PyFR simulations of the EV and TGV achieve 75× and 3× error reductions, respectively, for the same or reduced cost, and STAR-CCM+ simulations of the cylinder recovered wake statistics significantly more accurately for only twice the cost. Moreover, advancing to higher-order schemes on GPUs with PyFR was found to offer even further accuracy vs. cost benefits relative to industry-standard tools.
Rawson, Timothy M; Castro-Sánchez, Enrique; Charani, Esmita; Husson, Fran; Moore, Luke S P; Holmes, Alison H; Ahmad, Raheelah
2018-02-01
Public sources fund the majority of UK infection research, but citizens currently have no formal role in resource allocation. To explore the feasibility and willingness of citizens to engage in strategic decision making, we developed and tested a practical tool to capture public priorities for research. A scenario including six infection themes for funding was developed to assess citizen priorities for research funding. This was tested over two days at a university public festival. Votes were cast anonymously along with rationale for selection. The scenario was then implemented during a three-hour focus group exploring views on engagement in strategic decisions and in-depth evaluation of the tool. 188/491 (38%) prioritized funding research into drug-resistant infections, followed by emerging infections (18%). Results were similar between both days. Focus groups contained a total of 20 citizens with an equal gender split, a range of ethnicities and ages ranging from 18 to >70 years. The tool was perceived as clear, with participants able to make informed comparisons. Rationales for funding choices provided by voters and focus group participants are grouped into three major themes: (i) Information processing; (ii) Knowledge of the problem; (iii) Responsibility; and a unique theme within the focus groups, (iv) The potential role of citizens in decision making. Divergent perceptions of the relevance and confidence of "non-experts" as decision makers were expressed. Voting scenarios can be used to collect, en masse, citizens' choices and rationale for research priorities. Ensuring adequate levels of citizen information and confidence is important to allow deployment in other formats. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.
RetroPath2.0: A retrosynthesis workflow for metabolic engineers.
Delépine, Baudoin; Duigou, Thomas; Carbonell, Pablo; Faulon, Jean-Loup
2018-01-01
Synthetic biology applied to industrial biotechnology is transforming the way we produce chemicals. However, despite advances in the scale and scope of metabolic engineering, the research and development process still remains costly. In order to expand the chemical repertoire for the production of next generation compounds, a major engineering biology effort is required in the development of novel design tools that target chemical diversity through rapid and predictable protocols. Addressing that goal involves retrosynthesis approaches that explore the chemical biosynthetic space. However, the complexity associated with the large combinatorial retrosynthesis design space has often been recognized as the main challenge hindering the approach. Here, we provide RetroPath2.0, an automated open source workflow for retrosynthesis based on generalized reaction rules that performs the retrosynthesis search from chassis to target through an efficient and well-controlled protocol. Its ease of use and the versatility of its applications make this tool a valuable addition to the biological engineer's bench. We show through several examples the application of the workflow to biotechnologically relevant problems, including the identification of alternative biosynthetic routes through enzyme promiscuity and the development of biosensors. In this way we demonstrate the ability of the workflow to streamline retrosynthesis pathway design and its major role in reshaping the design, build, test and learn pipeline by driving the process toward the objective of optimizing bioproduction. The RetroPath2.0 workflow is built using tools developed by the bioinformatics and cheminformatics communities; because it is open source, we anticipate that community contributions will further expand its features. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
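The backward rule-application search at the core of such a workflow can be caricatured as a recursive expansion from target to chassis. Every compound and rule name below is hypothetical, and real RetroPath2.0 rules operate on chemical structures rather than labels; this is only a sketch of the search logic.

```python
# Minimal sketch of rule-based retrosynthesis in the spirit of
# RetroPath2.0: recursively apply reaction rules backwards from a
# target compound until every precursor is a chassis metabolite.
# All compound and rule names are invented for illustration.
RULES = {  # product -> list of (rule name, required precursors)
    "target_X": [("ruleA", {"intermediate_1", "cofactor"}),
                 ("ruleB", {"intermediate_2"})],
    "intermediate_1": [("ruleC", {"chassis_m1"})],
    "intermediate_2": [("ruleD", {"chassis_m2", "chassis_m1"})],
}
CHASSIS = {"chassis_m1", "chassis_m2", "cofactor"}  # available metabolites

def expand(compound, seen=frozenset()):
    """Return a list of rules producing `compound` from the chassis,
    ordered from chassis towards the target, or None if no route exists."""
    if compound in CHASSIS:
        return []                       # already available: nothing to do
    if compound in seen or compound not in RULES:
        return None                     # cycle or dead end
    for rule, precursors in RULES[compound]:
        subpaths = [expand(p, seen | {compound}) for p in precursors]
        if all(sp is not None for sp in subpaths):
            return [r for sp in subpaths for r in sp] + [rule]
    return None

print(expand("target_X"))
```

The combinatorial challenge mentioned in the abstract arises because, unlike this toy table, generalized reaction rules can fire on any substructure match, so the backward search tree grows very quickly and must be pruned and ranked.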
Price, C L; Brace-McDonnell, S J; Stallard, N; Bleetman, A; Maconochie, I; Perkins, G D
2016-05-01
Context: Triage tools are an essential component of the emergency response to a major incident. Although fortunately rare, mass casualty incidents involving children are possible, which mandates reliable triage tools to determine the priority of treatment. Our objective was to determine the performance characteristics of five major incident triage tools amongst paediatric casualties who have sustained traumatic injuries, in a retrospective observational cohort study of 31,292 patients aged less than 16 years who sustained a traumatic injury. Data were obtained from the UK Trauma Audit and Research Network (TARN) database. Interventions: Statistical evaluation of five triage tools (JumpSTART, START, CareFlight, Paediatric Triage Tape/Sieve and Triage Sort) to predict death or severe traumatic injury (injury severity score >15). Main outcome measures: Performance characteristics of the triage tools (sensitivity, specificity and level of agreement between triage tools) in identifying patients at high risk of death or severe injury. Of the 31,292 cases, 1029 died (3.3%), 6842 (21.9%) had major trauma (defined by an injury severity score >15) and 14,711 (47%) were aged 8 years or younger. There was variation in the accuracy of the tools in predicting major trauma or death (sensitivities ranging between 36.4 and 96.2%; specificities 66.0-89.8%). Performance characteristics varied with the age of the child. CareFlight had the best overall performance at predicting death, with the following sensitivity and specificity (95% CI), respectively: 95.3% (93.8-96.8) and 80.4% (80.0-80.9). JumpSTART was superior for the triaging of children under 8 years, with sensitivity and specificity (95% CI), respectively, of 86.3% (83.1-89.5) and 84.8% (84.2-85.5). The triage tools were generally better at identifying patients who would die than those with non-fatal severe injury.
This statistical evaluation has demonstrated variability in the accuracy of triage tools at predicting outcomes for children who sustain traumatic injuries. No single tool performed consistently well across all evaluated scenarios. Copyright © 2015 Elsevier Ltd. All rights reserved.
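The performance characteristics reported above are standard confusion-matrix quantities. A minimal sketch of computing sensitivity and specificity with Wald 95% confidence intervals follows, using invented counts rather than the TARN data.

```python
import math

# Sensitivity and specificity of a triage tool against the observed
# outcome (e.g. death or injury severity score > 15), with Wald 95%
# confidence intervals. Counts are illustrative, not the study's data.
tp, fn = 953, 47    # severe cases triaged high priority / missed
tn, fp = 804, 196   # non-severe cases triaged low priority / over-triaged

def proportion_ci(successes, total):
    """Point estimate and Wald 95% CI for a proportion."""
    p = successes / total
    half = 1.96 * math.sqrt(p * (1 - p) / total)
    return p, max(0.0, p - half), min(1.0, p + half)

sens, sens_lo, sens_hi = proportion_ci(tp, tp + fn)
spec, spec_lo, spec_hi = proportion_ci(tn, tn + fp)
print(f"sensitivity {sens:.1%} ({sens_lo:.1%}-{sens_hi:.1%})")
print(f"specificity {spec:.1%} ({spec_lo:.1%}-{spec_hi:.1%})")
```

The trade-off the abstract describes is visible here: raising a tool's triage threshold moves cases from fp to tn (higher specificity) but also from tp to fn (lower sensitivity), which is why no single tool dominated across all scenarios.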
Loquai, C; Scheurich, V; Syring, N; Schmidtmann, I; Müller-Brenne, T; Werner, A; Grabbe, S; Beutel, M E
2014-12-01
Although psychosocial distress has been evaluated well in cancer entities like breast or prostate cancer, its impact on melanoma patients still needs to be characterized. The objectives of this study were to (i) evaluate psychosocial distress in melanoma patients using an expert rating instrument [basic documentation for psycho-oncology short version (PO-Bado SF)]; (ii) determine associated demographic and clinical variables; and (iii) assess the acceptance of using PO-Bado SF as a routine procedure in a skin cancer unit. A cross-sectional group of 696 melanoma patients was recruited. During the routine contact, doctors assessed the patients' subjective distress using PO-Bado SF. Sociodemographic data, tumour data, treatment and the course of the disease were extracted from the patients' charts. PO-Bado SF was completed in 688 of 696 (99%) participating patients, revealing a high acceptance. In 51 (7%) patients, the PO-Bado SF cut-off score indicated a potential need for psychosocial support. Patients with previous or ongoing radiotherapy, a history of major surgery due to organ metastases, younger age and shorter time since diagnosis were rated as significantly more distressed than patients without these criteria. Patients were most distressed by suffering from anxiety/worries and/or tensions. In younger patients, emotional variables and other problems such as social or family problems were deemed more relevant, while functional limitations in daily living were reasons for higher distress in older patients. PO-Bado SF is a useful, well-accepted, practical and economic screening tool to identify distressed melanoma patients. Although most melanoma patients seem to cope well with their disease, special attention should be given to young patients in the first years after initial diagnosis and to patients with advanced disease, radiotherapy and major surgery due to their disease.
Combination of expert rating tools with self-report screening instruments could further characterize the specific sources of distress to optimize psychosocial support. © 2014 European Academy of Dermatology and Venereology.
NASA Technical Reports Server (NTRS)
Anderson, D. M. (Principal Investigator); Haugen, R. K.; Gatto, L. W.; Slaughter, C. W.; Marlar, T. L.; Mckim, H. L.
1972-01-01
There are no author-identified significant results in this report. An overriding problem in arctic and subarctic environmental research has been the absence of long-term observational data and the sparseness of geographical coverage of existing data. A first look report is presented on the use of ERTS-1 imagery as a major tool in two large area environmental studies: (1) investigation of sedimentation and other nearshore marine processes in Cook Inlet, Alaska; and (2) a regional study of permafrost regimes in the discontinuous permafrost zone of Alaska. These studies incorporate ground truth acquisition techniques that are probably similar to most ERTS investigations. Studies of oceanographic processes in Cook Inlet will be focused on seasonal changes in nearshore bathymetry, tidal and major current circulation patterns, and coastal sedimentation processes, applicable to navigation, construction, and maintenance of harbors. Analyses will be made of the regional permafrost distribution and regimes in the Upper Koyukuk-Kobuk River area located in NW Alaska.
Toyabe, Shin-ichi
2012-01-01
Severe injuries such as intracranial hemorrhage (ICH) are the most serious problem after falls in hospital, but they have not been considered in risk assessment scores for falls. We tried to determine the risk factors for ICH after falls in 20,320 inpatients (696,364 patient-days) aged from 40 to 90 years who were admitted to a tertiary-care university hospital. Possible risk factors, including the STRATIFY risk score for falls and the FRAX™ risk score for fractures, were analyzed by univariate and multivariate analyses. Fallers accounted for 3.2% of the patients, and 5.0% of the fallers suffered major injuries, including peripheral bone fracture (59.6%) and ICH (23.4%). In addition to STRATIFY, FRAX™ was significantly associated not only with bone fractures but also with ICH. Concomitant use of a risk score for falls and a risk score for fractures might be useful for the prediction of major injuries such as ICH after falls. PMID:22980233
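A univariate association of the kind analyzed in this study is commonly summarized as an odds ratio. The sketch below computes one with a 95% confidence interval from a 2x2 table of invented counts (not the study's data), as a minimal illustration of the statistics involved.

```python
import math

# Univariate association between a high risk score and ICH after a
# fall, summarized as an odds ratio with a Woolf 95% CI from a
# 2x2 table. All counts below are hypothetical.
a, b = 8, 40      # high risk score: ICH / no ICH
c, d = 3, 600     # low risk score:  ICH / no ICH

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
lo = odds_ratio * math.exp(-1.96 * se_log_or)
hi = odds_ratio * math.exp(1.96 * se_log_or)
print(f"OR {odds_ratio:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

A CI excluding 1 corresponds to a statistically significant univariate association; multivariate analysis then adjusts such estimates for the other risk factors simultaneously.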
Problem Solving: Tools and Techniques for the Park and Recreation Administrator. Fourth Edition
ERIC Educational Resources Information Center
Arnold, Margaret L.; Heyne, Linda A.; Busser, James A.
2005-01-01
This book is a useful tool for recreation educators in carrying out their responsibilities for preparing the next generation for effective service in recreation and parks. The need for this book is apparent, because few recreation curricula include courses in problem solving. It is true that many texts dealing with recreation describe policies and…
Investigating the Effect of Complexity Factors in Gas Law Problems
ERIC Educational Resources Information Center
Schuttlefield, Jennifer D.; Kirk, John; Pienta, Norbert J.; Tang, Hui
2012-01-01
Undergraduate students were asked to complete gas law questions using a Web-based tool as a first step in our understanding of the role of cognitive load in chemistry word questions and in helping us assess student problem-solving. Each question contained five different complexity factors, which were randomly assigned by the tool so that a…
ERIC Educational Resources Information Center
Bartholomew, Scott R.; Nadelson, Louis S.; Goodridge, Wade H.; Reeve, Edward M.
2018-01-01
We investigated the use of adaptive comparative judgment to evaluate middle school students' learning, engagement, and experience with the design process in an open-ended problem assigned in a technology and engineering education course. Our results indicate that the adaptive comparative judgment tool effectively facilitated the grading of the…
ERIC Educational Resources Information Center
Flor, Richard F.; Troskey, Matthew D.
This paper explores the dynamics of managing collective problem solving and decision making, and the application of tools and strategies to deal with the emergent complexity of systems in which educators work. Schools and educational programs are complex adaptive systems that respond to changes in internal and external environments. Functioning…
ERIC Educational Resources Information Center
Daunic, Ann P.; Smith, Stephen W.; Garvan, Cynthia W.; Barber, Brian R.; Becker, Mallory K.; Peters, Christine D.; Taylor, Gregory G.; Van Loan, Christopher L.; Li, Wei; Naranjo, Arlene H.
2012-01-01
Researchers have demonstrated that cognitive-behavioral intervention strategies--such as social problem solving--provided in school settings can help ameliorate the developmental risk for emotional and behavioral difficulties. In this study, we report the results of a randomized controlled trial of Tools for Getting Along (TFGA), a social…
Evaluation of Mathematical Self-Explanations with LSA in a Counterintuitive Problem of Probabilities
ERIC Educational Resources Information Center
Guiu, Jordi Maja
2012-01-01
In this paper, different types of mathematical explanations are presented in relation to the counterintuitive Monty Hall probability problem (card version), and the computational tool Latent Semantic Analysis (LSA) is used. Results in the literature on using this computational tool to study texts show that this technique is…
Case Studies of Software Development Tools for Parallel Architectures
1993-06-01
…autonomous entities, each with its own state and set of behaviors, as in simulation, tracking, or Battle Management. Because C2 applications are often… simulation, that is used to help the developer solve the problems. The new tool/problem solution matrix is structured in terms of the software development…
ERIC Educational Resources Information Center
Jahreie, Cecilie Flo
2010-01-01
This article examines the way student teachers make sense of conceptual tools when writing cases. In order to understand the problem-solving process, an analysis of the interactions is conducted. The findings show that transforming practical experiences into theoretical reflection is not a straightforward matter. To be able to elaborate on the…
ERIC Educational Resources Information Center
Derry, Sharon; And Others
This study examined ways in which two independent variables, peer collaboration and the use of a specific tool (the TAPS interface), work together and individually to shape students' problem-solving processes. More specifically, the researchers were interested in determining how collaboration and TAPS use cause metacognitive processes to differ…
Classification and assessment tools for structural motif discovery algorithms.
Badr, Ghada; Al-Turaiki, Isra; Mathkour, Hassan
2013-01-01
Motif discovery is the problem of finding recurring patterns in biological data. Patterns can be sequential, mainly when discovered in DNA sequences. They can also be structural (e.g. when discovering RNA motifs). Finding common structural patterns helps to gain a better understanding of the mechanism of action (e.g. post-transcriptional regulation). Unlike DNA motifs, which are sequentially conserved, RNA motifs exhibit conservation in structure, which may be common even if the sequences are different. Over the past few years, hundreds of algorithms have been developed to solve the sequential motif discovery problem, while less work has been done for the structural case. In this paper, we survey, classify, and compare different algorithms that solve the structural motif discovery problem, where the underlying sequences may be different. We highlight their strengths and weaknesses. We start by proposing a benchmark dataset and a measurement tool that can be used to evaluate different motif discovery approaches. Then, we proceed by proposing our experimental setup. Finally, results are obtained using the proposed benchmark to compare available tools. To the best of our knowledge, this is the first attempt to compare tools solely designed for structural motif discovery. Results show that the accuracy of discovered motifs is relatively low. The results also suggest a complementary behavior among tools where some tools perform well on simple structures, while other tools are better for complex structures. We have classified and evaluated the performance of available structural motif discovery tools. In addition, we have proposed a benchmark dataset with tools that can be used to evaluate newly developed tools.
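The benchmark comparison above hinges on a measurement tool for scoring discovered motifs against a reference. One common site-level scheme (a generic sketch, not necessarily the authors' exact measure) counts a predicted site as correct when it overlaps a known site, yielding sensitivity and positive predictive value:

```python
def site_scores(reference, predicted, min_overlap=1):
    """Site-level sensitivity and PPV for motif discovery.

    reference, predicted: lists of (start, end) half-open intervals on the
    same sequence. A predicted site counts as a true positive if it overlaps
    some reference site by at least `min_overlap` positions.
    """
    def overlaps(a, b):
        return min(a[1], b[1]) - max(a[0], b[0]) >= min_overlap

    tp = sum(1 for p in predicted if any(overlaps(p, r) for r in reference))
    hit = sum(1 for r in reference if any(overlaps(r, p) for p in predicted))
    sensitivity = hit / len(reference) if reference else 0.0
    ppv = tp / len(predicted) if predicted else 0.0
    return sensitivity, ppv
```

For example, with reference sites at (10, 20) and (50, 60) and predictions at (12, 22) and (80, 90), one reference site is recovered and one of two predictions is correct, so both scores are 0.5.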
Are multidose drug dispensing systems initiated for the appropriate patients?
Mertens, Bram J; Kwint, H F; van Marum, Rob J; Bouvy, Marcel L
2018-05-16
It is unknown if multidose drug dispensing (MDD) systems are initiated for the appropriate patients. Therefore, the objective of this study was to compare the medication management problems of patients who were about to start with an MDD system (MDD patients) and patients who continued manually dispensed medication (non-MDD users), in order to identify whether the appropriate patients receive an MDD system. Patient interviews (semi-structured) were conducted by 44 community pharmacists at the patient's home. Patients over 65 years of age, home dwelling and using at least five chronic drugs were eligible for the study. An assessment tool was developed including 22 potential medication management problems, covering four domains: functional (7), organizational (7), medication adherence (6), and medication knowledge (2). Median scores were calculated with the interquartile range. Additionally, cognitive function was assessed with the Mini-Cog and frailty with the Groningen Frailty Indicator. One hundred eighty-eight MDD users and 230 non-MDD users were interviewed. MDD users were older, more often female, and using more drugs. Forty-two percent of the MDD users were possibly cognitively impaired and 63% were assessed as frail, compared to 20 and 27% respectively of the non-MDD users. MDD users had more potential organizational problems (3 vs. 1; p < 0.01), functional problems (2 vs. 1; p < 0.01), medication adherence problems (1 vs. 0; p < 0.01), and medication knowledge problems (1 vs. 0; p < 0.01) than non-MDD users. Seventy percent of the MDD users scored six or more potential medication management problems, compared with 22% of non-MDD users. The majority of MDD systems were initiated for patients who experienced multiple potential medication management problems, suggesting a decreased medication management capacity.
Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.
Lee, Leng-Feng; Umberger, Brian R
2016-01-01
Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. 
This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility.
PMID:26835184
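The direct collocation idea described above can be illustrated independently of OpenSim. The sketch below (a toy single-integrator problem solved with SciPy's SLSQP rather than IPOPT; all names are illustrative) transcribes states and controls into one nonlinear program in which the dynamics appear as algebraic trapezoidal defect constraints, so the whole trajectory is solved at once instead of repeatedly shooting forward in time:

```python
import numpy as np
from scipy.optimize import minimize

# Toy transcription: drive a single integrator x' = u from x(0)=0 to x(T)=1
# while minimizing the control effort  integral of u^2 dt  (optimum: u = 1).
N, T = 21, 1.0
h = T / (N - 1)

def unpack(z):
    return z[:N], z[N:]                 # states x, controls u

def cost(z):
    _, u = unpack(z)
    return h * np.sum((u[1:] ** 2 + u[:-1] ** 2) / 2.0)   # trapezoidal quadrature

def defects(z):
    x, u = unpack(z)
    # trapezoidal collocation: x[k+1] - x[k] = h/2 * (u[k] + u[k+1])
    return x[1:] - x[:-1] - 0.5 * h * (u[1:] + u[:-1])

cons = [{"type": "eq", "fun": defects},
        {"type": "eq", "fun": lambda z: [z[0], z[N - 1] - 1.0]}]  # boundaries
res = minimize(cost, np.zeros(2 * N), method="SLSQP", constraints=cons)
x_opt, u_opt = unpack(res.x)
```

The defect constraints are banded, which is exactly the Jacobian sparsity that the abstract credits for IPOPT's performance advantage on larger problems.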
Sisler, Laurel; Omofoye, Oluwaseun; Paci, Karina; Hadar, Eldad; Goldstein, Adam O; Ripley-Moffitt, Carol
2017-12-01
Health care providers routinely undertreat tobacco dependence, indicating a need for innovative ways to increase delivery of evidence-based care. Lean, a set of quality improvement (QI) tools used increasingly in health care, can help streamline processes, create buy-in for use of evidence-based practices, and lead to the identification of solutions on the basis of a problem's root causes. To date, no published research has examined the use of Lean tools in tobacco dependence. A 12-month QI project using Lean tools was conducted to increase delivery of evidence-based tobacco use treatment (TUT) to hospitalized neurosurgical patients. The study team developed a nicotine replacement therapy (NRT) and counseling protocol for neurosurgery inpatients who indicated current tobacco use and used Lean tools to increase protocol adherence. Rates of NRT prescription, referrals to counseling, and follow-up phone calls were compared pre- and postintervention. Secondary measures included patient satisfaction with intervention, quit rates, and reduction rates at 4 weeks postdischarge. Referrals to counseling doubled from 31.7% at baseline to 62.0% after implementation of the intervention, and rates of nicotine replacement therapy (NRT) prescriptions during hospitalization and at discharge increased from 15.3% to 28.5% and 9.0% to 19.3%, respectively. Follow-up phone call rates also dramatically increased. The majority of satisfaction survey respondents indicated that counseling had a positive or neutral impact on stress level and overall satisfaction. Lean tools can dramatically increase use of evidence-based TUT in hospitalized patients. This project is easily replicable by professionals seeking to improve delivery of tobacco treatment. These findings may be particularly helpful to inpatient surgical departments that have traditionally been reticent to prescribe NRT. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.
Agampodi, Thilini Chanchala; Agampodi, Suneth Buddhika; Glozier, Nicholas; Siribaddana, Sisira
2015-03-01
Social capital is a neglected determinant of health in low and middle income countries. To date, the majority of evidence syntheses on social capital and health are based upon high income countries. We conducted this systematic review to identify the methods used to measure social capital in low and middle-income countries and to evaluate their relative strengths and weaknesses. An electronic search was conducted using Pubmed, Science citation index expanded, Social science citation index expanded, Web of knowledge, Cochrane, Trip, Google scholar and selected grey literature sources. We aimed to include all studies conducted in low and middle-income countries, published in English, that measured any aspect of social capital in relation to health, from 1980 to January 2013. We extracted data using a data extraction form and performed narrative synthesis, as the measures were heterogeneous. Of the 472 articles retrieved, 46 articles were selected for the review. The review included 32 studies from middle income countries and seven studies from low income countries. Seven were cross-national studies. Most studies were descriptive cross-sectional in design (n = 39). Only two randomized controlled trials were included. Among the studies conducted using primary data (n = 32), we identified 18 purpose-built tools that measured various dimensions of social capital. Validity (n = 11) and reliability (n = 8) of the tools were assessed in only a few studies. Cognitive constructs of social capital, namely trust, social cohesion and sense of belonging, were positively associated with the measured health outcome in the majority of the studies. While most studies measured social capital at the individual/micro level (n = 32), group level measurements were obtained by aggregation of individual measures. As many tools originate in high income contexts, cultural adaptation, validation and reliability assessment are mandatory when adapting a tool to the study setting. 
Evidence on causality and assessing predictive validity is a problem due to the scarcity of prospective study designs. We recommend Harpham et al.'s Adapted Social Capital Assessment Tool (A-SCAT), Hurtado et al.'s six-item tool and Elgar et al.'s World Value Survey Social Capital Scale for assessment of social capital in low and middle income countries. Copyright © 2015 Elsevier Ltd. All rights reserved.
Problem-Solving Environments (PSEs) to Support Innovation Clustering
NASA Technical Reports Server (NTRS)
Gill, Zann
1999-01-01
This paper argues that there is need for high level concepts to inform the development of Problem-Solving Environment (PSE) capability. A traditional approach to PSE implementation is to: (1) assemble a collection of tools; (2) integrate the tools; and (3) assume that collaborative work begins after the PSE is assembled. I argue for the need to start from the opposite premise, that promoting human collaboration and observing that process comes first, followed by the development of supporting tools, and finally evolution of PSE capability through input from collaborating project teams.
Web Tools: The Second Generation
ERIC Educational Resources Information Center
Pascopella, Angela
2008-01-01
Web 2.0 tools and technologies, or second generation tools, help districts to save time and money, and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, which falls under the 21st-century skills. The second-generation tools are growing in popularity…
Laplace Transform Based Radiative Transfer Studies
NASA Astrophysics Data System (ADS)
Hu, Y.; Lin, B.; Ng, T.; Yang, P.; Wiscombe, W.; Herath, J.; Duffy, D.
2006-12-01
Multiple scattering is the major uncertainty in data analysis of space-based lidar measurements. Until now, accurate quantitative lidar data analysis has been limited to very thin objects that are dominated by single scattering, where photons from the laser beam scatter only once with particles in the atmosphere before reaching the receiver and a simple linear relationship between physical properties and the lidar signal exists. In reality, multiple scattering is always a factor in space-based lidar measurement, and it dominates space-based lidar returns from clouds, dust aerosols, vegetation canopy and phytoplankton. While multiple scattering returns are clear signals, the lack of a fast-enough lidar multiple scattering computation tool forces us to treat the signal as unwanted "noise" and use simple multiple scattering correction schemes to remove them. Such multiple scattering treatments waste the multiple scattering signals and may cause orders-of-magnitude errors in retrieved physical properties. Thus the lack of fast and accurate time-dependent radiative transfer tools significantly limits lidar remote sensing capabilities. Analyzing lidar multiple scattering signals requires fast and accurate time-dependent radiative transfer computations. Currently, multiple scattering is computed with Monte Carlo simulations, which take minutes to hours, are too slow for interactive satellite data analysis, and can only be used to help system/algorithm design and error assessment. We present an innovative physics approach to solve the time-dependent radiative transfer problem. The technique utilizes FPGA-based reconfigurable computing hardware. The approach is as follows: 1. Physics solution: Perform a Laplace transform on the time and spatial dimensions and a Fourier transform on the viewing azimuth dimension, converting the solution of the radiative transfer differential equation into a fast matrix inversion problem. 
The majority of the radiative transfer computation goes to matrix inversion processes, FFT and inverse Laplace transforms. 2. Hardware solution: Perform the well-defined matrix inversion, FFT and Laplace transforms on highly parallel, reconfigurable computing hardware. This physics-based computational tool leads to accurate quantitative analysis of space-based lidar signals and improves the data quality of current lidar missions such as CALIPSO. This presentation will introduce the basic idea of this approach, preliminary results based on SRC's FPGA-based Mapstation, and how we may apply it to CALIPSO data analysis.
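The core trick above is that a Laplace transform turns a time-dependent differential problem into an algebraic one. A toy illustration (unrelated to the paper's FPGA pipeline; the scalar decay equation stands in for the full multi-stream system, where the algebra becomes a matrix inversion): for dI/dt = -a·I with I(0) = I0, transforming gives s·Ī(s) - I0 = -a·Ī(s), hence Ī(s) = I0/(s + a), which we can verify numerically:

```python
import numpy as np

a, I0 = 2.0, 3.0
t = np.linspace(0.0, 40.0, 400_001)     # integrand decays to ~0 well before t = 40
I = I0 * np.exp(-a * t)                 # time-domain solution of dI/dt = -a*I

def laplace_numeric(s):
    """L{I}(s) by trapezoidal quadrature of exp(-s*t) * I(t)."""
    f = np.exp(-s * t) * I
    return float(np.sum((f[1:] + f[:-1]) / 2.0 * np.diff(t)))

# The quadrature matches the algebraic answer I0 / (s + a):
for s in (0.5, 1.0, 4.0):
    assert abs(laplace_numeric(s) - I0 / (s + a)) < 1e-6
```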
Tracking wakefulness as it fades: Micro-measures of alertness.
Jagannathan, Sridhar R; Ezquerro-Nassar, Alejandro; Jachs, Barbara; Pustovaya, Olga V; Bareham, Corinne A; Bekinschtein, Tristan A
2018-08-01
A major problem in psychology and physiology experiments is drowsiness: around a third of participants show decreased wakefulness despite being instructed to stay alert. In some non-visual experiments participants keep their eyes closed throughout the task, thus promoting the occurrence of such periods of varying alertness. These wakefulness changes contribute to systematic noise in data and measures of interest. To account for this omnipresent problem in data acquisition we defined criteria and code to allow researchers to detect and control for varying alertness in electroencephalography (EEG) experiments under eyes-closed settings. We first revise a visual-scoring method developed for detection and characterization of the sleep-onset process, and adapt the same for detection of alertness levels. Furthermore, we show the major issues preventing the practical use of this method, and overcome these issues by developing an automated method (micro-measures algorithm) based on frequency and sleep graphoelements, which are capable of detecting micro variations in alertness. The validity of the micro-measures algorithm was verified by training and testing using a dataset where participants are known to fall asleep. In addition, we tested generalisability by independent validation on another dataset. The methods developed constitute a unique tool to assess micro variations in levels of alertness and control trial-by-trial retrospectively or prospectively in every experiment performed with EEG in cognitive neuroscience under eyes-closed settings. Copyright © 2018. Published by Elsevier Inc.
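As a rough illustration of a frequency-based alertness measure (this is the generic alpha/theta band-power ratio, a crude eyes-closed proxy, not the micro-measures algorithm itself, which also uses sleep graphoelements):

```python
import numpy as np
from scipy.signal import welch

def alpha_theta_ratio(eeg, fs=250.0):
    """Alpha (8-12 Hz) to theta (4-7 Hz) band-power ratio from a Welch
    periodogram; the ratio tends to fall as a participant drifts toward
    sleep onset."""
    f, pxx = welch(eeg, fs=fs, nperseg=int(2 * fs))
    def band(lo, hi):
        return pxx[(f >= lo) & (f <= hi)].sum()
    return band(8.0, 12.0) / band(4.0, 7.0)

# Synthetic demo: a 10 Hz (alpha-dominated) trace vs. a 5 Hz (theta) one.
fs = 250.0
t = np.arange(0, 30, 1 / fs)
noise = 0.1 * np.random.default_rng(0).standard_normal(t.size)
alert = np.sin(2 * np.pi * 10 * t) + noise
drowsy = np.sin(2 * np.pi * 5 * t) + noise
```

On these synthetic traces the ratio is well above 1 for the alert signal and well below 1 for the drowsy one; real EEG would of course need the fuller criteria the paper develops.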
Evaluation of natural language processing systems: Issues and approaches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guida, G.; Mauri, G.
This paper encompasses two main topics: a broad and general analysis of the issue of performance evaluation of NLP systems, and a report on a specific approach developed by the authors and tested on a sample case. More precisely, it first presents a brief survey of the major works in the area of NLP systems evaluation. Then, after introducing the notion of the life cycle of an NLP system, it focuses on the concept of performance evaluation and analyzes the scope and the major problems of the investigation. The tools generally used within computer science to assess the quality of a software system are briefly reviewed, and their applicability to the task of evaluation of NLP systems is discussed. Particular attention is devoted to the concepts of efficiency, correctness, reliability, and adequacy, and how all of them basically fail to capture the peculiar features of performance evaluation of an NLP system is discussed. Two main approaches to performance evaluation are later introduced, namely black-box- and model-based, and their most important characteristics are presented. Finally, a specific model for performance evaluation proposed by the authors is illustrated, and the results of an experiment with a sample application are reported. The paper concludes with a discussion of research perspectives, open problems, and the importance of performance evaluation to industrial applications.
Competitive control of cognition in rhesus monkeys.
Kowaguchi, Mayuka; Patel, Nirali P; Bunnell, Megan E; Kralik, Jerald D
2016-12-01
The brain has evolved different approaches to solve problems, but the mechanisms that determine which approach to take remain unclear. One possibility is that control progresses from simpler processes, such as associative learning, to more complex ones, such as relational reasoning, when the simpler ones prove inadequate. Alternatively, control could be based on competition between the processes. To test between these possibilities, we posed the support problem to rhesus monkeys using a tool-use paradigm, in which subjects could pull an object (the tool) toward themselves to obtain an otherwise out-of-reach goal item. We initially provided one problem exemplar as a choice: for the correct option, a food item placed on the support tool; for the incorrect option, the food item placed off the tool. Perceptual cues were also correlated with outcome: e.g., red, triangular tool correct; blue, rectangular tool incorrect. Although the monkeys simply needed to touch the tool to register a response, they immediately pulled it, reflecting a relational reasoning process between themselves and another object (R_self-other), rather than an associative one between the arbitrary touch response and reward (A_resp-reward). Probe testing then showed that all four monkeys used a conjunction of perceptual features to select the correct option, reflecting an associative process between stimuli and reward (A_stim-reward). We then added a second problem exemplar, and subsequent testing revealed that the monkeys switched to using the on/off relationship, reflecting a relational reasoning process between two objects (R_other-other). Because behavior appeared to reflect R_self-other rather than A_resp-reward, and A_stim-reward prior to R_other-other, our results suggest that cognitive processes are selected via competitive control dynamics. Copyright © 2016 Elsevier B.V. All rights reserved.
Conceptual Comparison of Population Based Metaheuristics for Engineering Problems
Adekanmbi, Oluwole; Green, Paul
2015-01-01
Metaheuristic algorithms are well-known optimization tools which have been employed for solving a wide range of optimization problems. Several extensions of differential evolution have been adopted in solving constrained and nonconstrained multiobjective optimization problems, but in this study, the third version of generalized differential evolution (GDE) is used for solving practical engineering problems. GDE3 metaheuristic modifies the selection process of the basic differential evolution and extends DE/rand/1/bin strategy in solving practical applications. The performance of the metaheuristic is investigated through engineering design optimization problems and the results are reported. The comparison of the numerical results with those of other metaheuristic techniques demonstrates the promising performance of the algorithm as a robust optimization tool for practical purposes. PMID:25874265
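The DE/rand/1/bin strategy that GDE3 extends is compact enough to sketch. Below is a minimal single-objective version on a toy sphere function (parameter values and names are illustrative, not those of the study; GDE3 itself replaces the greedy selection with a Pareto-based rule for constrained and multiobjective problems):

```python
import random

def de_rand_1_bin(f, dim, bounds, pop_size=30, F=0.5, CR=0.9, gens=300, seed=1):
    """Minimal DE/rand/1/bin: rand base vector, one difference, binomial crossover."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            r1, r2, r3 = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)          # guarantees >= 1 mutated gene
            trial = list(pop[i])
            for j in range(dim):
                if rng.random() < CR or j == jrand:   # binomial crossover
                    v = pop[r1][j] + F * (pop[r2][j] - pop[r3][j])
                    trial[j] = min(max(v, lo), hi)    # clamp to bounds
            ft = f(trial)
            if ft <= fit[i]:                          # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

sphere = lambda x: sum(v * v for v in x)
x_best, f_best = de_rand_1_bin(sphere, dim=5, bounds=(-5.0, 5.0))
```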
A Python tool to set up relative free energy calculations in GROMACS.
Klimovich, Pavel V; Mobley, David L
2015-11-01
Free energy calculations based on molecular dynamics (MD) simulations have seen a tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of the present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with Lead Optimization Mapper (LOMAP; Liu et al. in J Comput Aided Mol Des 27(9):755-770, 2013), recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge (Mobley et al. in J Comput Aided Mol Des 28(4):135-150, 2014). Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations.
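For context, the relative binding free energy such setups target comes from a thermodynamic cycle: alchemically mutate ligand A into B both in the binding site and in solvent, and take the difference of the two legs. The bookkeeping is a one-liner (numbers below are hypothetical; the real leg values come from the GROMACS runs that alchemical-setup.py prepares):

```python
def ddg_bind(dg_mut_complex, dg_mut_solvent):
    """ddG_bind(A->B) = dG_mutate(complex) - dG_mutate(solvent), in kcal/mol.
    Negative means B binds more tightly than A."""
    return dg_mut_complex - dg_mut_solvent

# e.g. a mutation costing 2.1 kcal/mol in the complex and 3.4 in solvent
# gives ddG_bind of about -1.3 kcal/mol: B is the tighter binder.
example = ddg_bind(2.1, 3.4)
```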
The present status of geography education in boys' intermediate schools of Riyadh, Saudi Arabia
NASA Astrophysics Data System (ADS)
Al-Gahtany, Abdulrahman Mohammed
The purpose of this study was to describe the present status of geography education in boys' intermediate schools in Riyadh, Saudi Arabia as perceived by geography teachers and supervisors; that is, to investigate the objectives, content, methods of teaching, tools and resources that are available and used in classrooms, evaluation techniques, and problems encountered in the teaching of geography. To collect data from this representative sample population, a questionnaire was developed by the researcher specifically for this study. Questionnaire data was collected from 19 social studies supervisors and 213 geography teachers. Percentages, frequencies, means, and standard deviations were computed for each questionnaire item. Chi-square tests were applied to determine if any significant differences could be identified between the observed and expected responses of supervisors and teachers. Major findings of the study indicated that both supervisors and teachers tend to strongly support the identified geography objectives. Most teachers and supervisors also indicated that the current geography curriculum contains enough information about Saudi Arabia, the Arabic world, and the Islamic world. In addition, they also indicated that geography content promotes a sense of patriotism and cultural pride. Responses indicated that educators see deficiencies in the content: it does not focus sufficiently on current events nor on developing student skills such as research and technical skills like drawing maps. Lecture and discussion are the most commonly used strategies in the teaching of geography. Field trips, role-playing, scientific competitions, scientific games, problem solving, and individual learning are less commonly used. The teaching tools most commonly used are wall maps and earth globes, whereas the use of geographical transparencies, models, and instruments is not common. Most of the teachers do not use computers in their teaching. 
Evaluation techniques depend completely on traditional examinations to evaluate the performance of the students. Chi-square test shows that there are significant differences in observed and expected frequencies between teachers and supervisors with respect to geography learning objectives, geography content, methods of teaching, tools and resources, and the problems that geography teachers encounter in their teaching.
NASA Astrophysics Data System (ADS)
Docktor, Jennifer L.; Dornfeld, Jay; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Jackson, Koblar Alan; Mason, Andrew; Ryan, Qing X.; Yang, Jie
2016-06-01
Problem solving is a complex process valuable in everyday life and crucial for learning in the STEM fields. To support the development of problem-solving skills it is important for researchers and curriculum developers to have practical tools that can measure the difference between novice and expert problem-solving performance in authentic classroom work. It is also useful if such tools can be employed by instructors to guide their pedagogy. We describe the design, development, and testing of a simple rubric to assess written solutions to problems given in undergraduate introductory physics courses. In particular, we present evidence for the validity, reliability, and utility of the instrument. The rubric identifies five general problem-solving processes and defines the criteria to attain a score in each: organizing problem information into a Useful Description, selecting appropriate principles (Physics Approach), applying those principles to the specific conditions in the problem (Specific Application of Physics), using Mathematical Procedures appropriately, and displaying evidence of an organized reasoning pattern (Logical Progression).
Morris, Gerwyn; Berk, Michael; Puri, Basant K
2018-04-01
There is copious evidence of abnormalities in resting-state functional network connectivity states, grey and white matter pathology and impaired cerebral perfusion in patients afforded a diagnosis of multiple sclerosis, major depression or chronic fatigue syndrome (CFS) (myalgic encephalomyelitis). Systemic inflammation may well be a major element explaining such findings. Inter-patient and inter-illness variations in neuroimaging findings may arise at least in part from regional genetic, epigenetic and environmental variations in the functions of microglia and astrocytes. Regional differences in neuronal resistance to oxidative and inflammatory insults and in the performance of antioxidant defences in the central nervous system may also play a role. Importantly, replicated experimental findings suggest that the use of high-resolution SPECT imaging may have the capacity to differentiate patients afforded a diagnosis of CFS from those with a diagnosis of depression. Further research involving this form of neuroimaging appears warranted in an attempt to overcome the problem of aetiologically heterogeneous cohorts which probably explain conflicting findings produced by investigative teams active in this field. However, the ionising radiation and relative lack of sensitivity involved probably preclude its use as a routine diagnostic tool.
NASA Technical Reports Server (NTRS)
Wisdom, Jack
2002-01-01
In these 18 years, the research has touched every major dynamical problem in the solar system, including: the effect of chaotic zones on the distribution of asteroids, the delivery of meteorites along chaotic pathways, the chaotic motion of Pluto, the chaotic motion of the outer planets and that of the whole solar system, the delivery of short-period comets from the Kuiper belt, the tidal evolution of the Uranian and Galilean satellites, the chaotic tumbling of Hyperion and other irregular satellites, the large chaotic variations of the obliquity of Mars, the evolution of the Earth-Moon system, and the resonant core-mantle dynamics of Earth and Venus. It has introduced new analytical and numerical tools that are in widespread use. Today, nearly every long-term integration of our solar system, its subsystems, and other solar systems uses algorithms that were invented in this research. This research has been primarily supported by this sequence of PGG NASA grants. During this period, major investigations of the tidal evolution of the Earth-Moon system and of the passage of the Earth and Venus through non-linear core-mantle resonances were completed and published. A major innovation in symplectic algorithms was also published: the symplectic corrector. A paper was completed on non-perturbative hydrostatic equilibrium.
ERIC Educational Resources Information Center
Latta, Raymond F.; Downey, Carolyn J.
This book presents a wide array of sophisticated problem-solving tools and shows how to use them in a humanizing way that involves all stakeholders in the process. Chapter 1 develops the rationale for educational stakeholders to consider quality tools. Chapter 2 highlights three quality group-process tools--brainstorming, the nominal group…
Primary school children's communication experiences with Twitter: a case study from Turkey.
Gunuc, Selim; Misirli, Ozge; Odabasi, H Ferhan
2013-06-01
This case study examines the utilization of Twitter as a communication channel among primary school children. This study tries to answer the following questions: "What are the cases for primary school children's use of Twitter for communication?" and "What are primary school children's experiences of utilizing Twitter for communication?" Participants were 7th grade students (17 female, 34 male; age 13 years) studying in a private primary school in Turkey within the 2011-12 academic year. A questionnaire, semi-structured interviews, document analysis, and open-ended questions were used as data collection tools. The children were invited and encouraged to use Twitter for communication. Whilst participants had some minor difficulties getting accustomed to Twitter, they managed to use Twitter for communication, a conclusion drawn from the children's responses and tweets within the study. However, the majority of children did not consider Twitter a communication tool, and were observed to quit using Twitter once the study had ended. They found Twitter unproductive and restrictive for communication. Furthermore, Twitter's low popularity among adolescents was also a problem. This study suggests that social networking tools favored by children should be integrated into educational settings in order to maximize instructional benefits for primary school children and adolescents.
The use of strain gauge platform and virtual reality tool for patient stability examination
NASA Astrophysics Data System (ADS)
Walendziuk, Wojciech; Wysk, Lukasz; Skoczylas, Marcin
2016-09-01
Virtual reality is one of the fastest growing information technologies. This paper is only a prelude to a larger study on the use of virtual reality tools in analysing the bony labyrinth and the sense of balance. Problems with the functioning of these areas of the body remain a controversial topic among specialists, and the persistence of unresolved imbalance treatments is reflected in the steady number of people reporting this type of ailment. Considering the above, the authors created a system and application comprising a model of a virtual environment and a tool for modifying obstacles in 3D space. Preliminary studies of patients from a test group aged 22-49 years were also carried out, in which behaviour and sense of balance were analysed in relation to the horizontal curvature of the virtual world around the patient. Experiments carried out on the test group showed that the shape of the curvature of the virtual world space and the age of the patient have a major impact on the sense of balance. The data obtained can be linked with actual disorders of the bony labyrinth and human behaviour at the time of their occurrence. Another important achievement, which will be the subject of further work, is the possible use of a modified version of the software for rehabilitation purposes.
Implementation Science to Accelerate Clean Cooking for Public Health
Rosenthal, Joshua; Balakrishnan, Kalpana; Bruce, Nigel; Chambers, David; Graham, Jay; Jack, Darby; Kline, Lydia; Masera, Omar; Mehta, Sumi; Mercado, Ilse Ruiz; Neta, Gila; Pattanayak, Subhrendu; Puzzolo, Elisa; Petach, Helen; Punturieri, Antonello; Rubinstein, Adolfo; Sage, Michael; Sturke, Rachel; Shankar, Anita; Sherr, Kenny; Smith, Kirk; Yadama, Gautam
2017-01-01
Summary: Clean cooking has emerged as a major concern for global health and development because of the enormous burden of disease caused by traditional cookstoves and fires. The World Health Organization has developed new indoor air quality guidelines that few homes will be able to achieve without replacing traditional methods with modern clean cooking technologies, including fuels and stoves. However, decades of experience with improved stove programs indicate that the challenge of modernizing cooking in impoverished communities includes a complex, multi-sectoral set of problems that require implementation research. The National Institutes of Health, in partnership with several government agencies and the Global Alliance for Clean Cookstoves, has launched the Clean Cooking Implementation Science Network that aims to address this issue. In this article, our focus is on building a knowledge base to accelerate scale-up and sustained use of the cleanest technologies in low- and middle-income countries. Implementation science provides a variety of analytical and planning tools to enhance effectiveness of clinical and public health interventions. These tools are being integrated with a growing body of knowledge and new research projects to yield new methods, consensus tools, and an evidence base to accelerate improvements in health promised by the renewed agenda of clean cooking. PMID:28055947
Using Online Digital Tools and Video to Support International Problem-Based Learning
ERIC Educational Resources Information Center
Lajoie, Susanne P.; Hmelo-Silver, Cindy; Wiseman, Jeffrey; Chan, Lap Ki; Lu, Jingyan; Khurana, Chesta; Cruz-Panesso, Ilian; Poitras, Eric; Kazemitabar, Maedeh
2014-01-01
The goal of this study is to examine how to facilitate cross-cultural groups in problem-based learning (PBL) using online digital tools and videos. The PBL consisted of two video-based cases used to trigger student-learning issues about giving bad news to HIV-positive patients. Mixed groups of medical students from Canada and Hong Kong worked with…
Old Tools for New Problems: Modifying Master Gardener Training to Improve Food Access in Rural Areas
ERIC Educational Resources Information Center
Randle, Anne
2015-01-01
Extension faces ever-changing problems, which can be addressed by modifying successful tools rather than inventing new ones. The Master Gardener program has proven its effectiveness, but the cost and time commitment can make it inaccessible to rural, low-income communities, where training in home gardening may address issues of food access and…
R-U-Typing-2-Me? Evolving a Chat Tool to Increase Understanding in Learning Activities
ERIC Educational Resources Information Center
Fuks, Hugo; Pimentel, Mariano; Lucena, Carlos Jose Pereira de
2006-01-01
Very often, when using a chat tool where more than one participant is talking simultaneously, it is difficult to follow the conversation, read all the different messages and work out who is talking to whom about what. This problem has been dubbed "Chat Confusion." This article investigates this problem in debate sessions in an online university…
ERIC Educational Resources Information Center
Bazo, Plácido; Rodríguez, Romén; Fumero, Dácil
2016-01-01
In this paper, we will introduce an innovative software platform that can be especially useful in a Content and Language Integrated Learning (CLIL) context. This tool is called Vocabulary Notebook, and has been developed to solve all the problems that traditional (paper) vocabulary notebooks have. This tool keeps focus on the personalisation of…
Informatics tools to improve clinical research study implementation.
Brandt, Cynthia A; Argraves, Stephanie; Money, Roy; Ananth, Gowri; Trocky, Nina M; Nadkarni, Prakash M
2006-04-01
There are numerous potential sources of problems when performing complex clinical research trials. These issues are compounded when studies are multi-site and multiple personnel from different sites are responsible for varying actions from case report form design to primary data collection and data entry. We describe an approach that emphasizes the use of a variety of informatics tools that can facilitate study coordination, training, data checks and early identification and correction of faulty procedures and data problems. The paper focuses on informatics tools that can help in case report form design, procedures and training and data management. Informatics tools can be used to facilitate study coordination and implementation of clinical research trials.
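One of the informatics tools the abstract mentions is automated data checks that surface faulty procedures and data problems early. A minimal sketch of such a check, in Python, might flag out-of-range or missing values in collected case report form data; the field names and permissible ranges below are illustrative assumptions, not the authors' actual specification.

```python
# Hypothetical range checks of the kind used in multi-site trial data
# management: each field has an allowed interval, and records are
# validated before they enter the study database.
RANGES = {"age": (0, 120), "systolic_bp": (60, 250)}

def check_record(record: dict) -> list:
    """Return a list of human-readable problems found in one record."""
    errors = []
    for field, (lo, hi) in RANGES.items():
        value = record.get(field)
        if value is None:
            errors.append(f"{field}: missing")
        elif not lo <= value <= hi:
            errors.append(f"{field}: {value} out of range [{lo}, {hi}]")
    return errors

# An implausible age is flagged; a clean record yields no errors.
print(check_record({"age": 130, "systolic_bp": 120}))
print(check_record({"age": 45, "systolic_bp": 120}))
```

Running such checks at data-entry time, rather than at analysis time, is what enables the early identification and correction the abstract describes.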
The Prisoner Problem--A Generalization.
ERIC Educational Resources Information Center
Gannon, Gerald E.; Martelli, Mario U.
2000-01-01
Presents a generalization to the classic prisoner problem, which is inherently interesting and has a solution within the reach of most high school mathematics students. Suggests the problem as a way to emphasize to students the final step in a problem-solver's tool kit, considering possible generalizations when a particular problem has been…
Dwyer, Robyn; Fraser, Suzanne
2017-06-01
It is widely accepted that alcohol and other drug consumption is profoundly gendered. Just where this gendering is occurring, however, remains the subject of debate. We contend that one important and overlooked site where the gendering of substance consumption and addiction is taking place is through AOD research itself: in particular, through the addiction screening and diagnostic tools designed to measure and track substance consumption and problems within populations. These tools establish key criteria and set numerical threshold scores for the identification of problems. In many of these tools, separate threshold scores for women and men are established or recommended. Drawing on Karen Barad's concept of post-humanist performativity, in this article we examine the ways in which gender itself is being materialised by these apparatuses of measurement. We focus primarily on the Drug Use Disorders Identification Test (DUDIT) tool as an exemplar of gendering processes that operate across addiction tools more broadly. We consider gendering processes operating through the tools' questions themselves, and we also examine the quantification and legitimation processes used in establishing gender difference and the implications these have for women. We find that these tools rely on and reproduce narrow and marginalising assumptions about women as essentially fragile and vulnerable, and simultaneously reinforce normative expectations that women sacrifice pleasure. The seemingly objective and neutral quantification processes operating in these tools naturalise gender as they enact it. Copyright © 2017 Elsevier B.V. All rights reserved.
Orbital thermal analysis of lattice structured spacecraft using color video display techniques
NASA Technical Reports Server (NTRS)
Wright, R. L.; Deryder, D. D.; Palmer, M. T.
1983-01-01
A color video display technique is demonstrated as a tool for rapid determination of thermal problems during the preliminary design of complex space systems. A thermal analysis is presented for the lattice-structured Earth Observation Satellite (EOS) spacecraft at 32 points in a baseline non-Sun-synchronous (60 deg inclination) orbit. Large temperature variations (on the order of 150 K) were observed on the majority of the members. A gradual decrease in temperature was observed as the spacecraft traversed the Earth's shadow, followed by a sudden rise in temperature (100 K) as the spacecraft exited the shadow. Heating rate and temperature histories of selected members and color graphic displays of temperatures on the spacecraft are presented.
Assessing The Single-Parent Family
Christie-Seely, Janet; Talbot, Yves
1985-01-01
The increase in single-parent families brings an increase in psychosocial problems and stress-associated illness. Divorce, separation, and lone parenting have now surpassed death as causes of single-parent families. They are major life events, and the family physician who helps anticipate them and facilitates the family's adaptation can help prevent associated morbidity and mortality. A non-judgmental approach and an understanding of systems theory help in assessing the single-parent family and its stresses. As in other medical areas, diagnosis precedes treatment: appropriate assessment indicates management strategies. The acronym ‘PRACTICE’ describes an assessment tool for the areas likely to be problematic in single-parent families. The differences between the divorced, the widowed, and the never-married, and their respective coping strategies, are described. PMID:21274172
Kay, J D; Nurse, D
1999-01-01
We have used internet-standard tools to provide access for clinicians to the components of the electronic patient record held on multiple remote disparate systems. Through the same interface we have provided access to multiple knowledgebases, some written locally and others published elsewhere. We have developed linkage between these two types of information which removes the need for the user to drill down into each knowledgebase to search for relevant information. This approach may help in the implementation of evidence-based practice. The major problems appear to be semantic rather than technological. The intranet was developed at low cost and is now in routine use. This approach appears to be transferable across systems and organisations.
microRNAs as cancer therapeutics: A step closer to clinical application.
Catela Ivkovic, Tina; Voss, Gjendine; Cornella, Helena; Ceder, Yvonne
2017-10-28
During the last decades, basic and translational research has enabled great improvements in the clinical management of cancer. However, the scarcity of complete remissions and many drug-induced toxicities remain major problems in the clinic. Recently, microRNAs (miRNAs) have emerged as promising therapeutic targets due to their involvement in cancer development and progression. Their extraordinary regulatory potential, which enables regulation of entire signalling networks within the cells, makes them an interesting tool for the development of cancer therapeutics. In this review we focus on miRNAs with experimentally proven therapeutic potential, and discuss recent advances in the technical development and clinical evaluation of miRNA-based therapeutic agents. Copyright © 2017 Elsevier B.V. All rights reserved.
Database usage and performance for the Fermilab Run II experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonham, D.; Box, D.; Gallas, E.
2004-12-01
The Run II experiments at Fermilab, CDF and D0, have extensive database needs covering many areas of their online and offline operations. Delivering data to users and processing farms worldwide has represented a major challenge to both experiments. The range of applications employing databases includes calibration (conditions), trigger information, run configuration, run quality, luminosity, data management, and others. Oracle is the primary database product being used for these applications at Fermilab, and some of its advanced features have been employed, such as table partitioning and replication. There is also experience with open source database products such as MySQL for secondary databases used, for example, in monitoring. Tools employed for monitoring the operation and diagnosing problems are also described.
Measuring occupational stress: development of the pressure management indicator.
Williams, S; Cooper, C L
1998-10-01
The study of occupational stress is hindered by the lack of compact and comprehensive standardized measurement tools. The Pressure Management Indicator (PMI) is a 120-item self-report questionnaire developed from the Occupational Stress Indicator (OSI). The PMI is more reliable, more comprehensive, and shorter than the OSI. It provides an integrated measure of the major dimensions of occupational stress. The outcome scales measure job satisfaction, organizational satisfaction, organizational security, organizational commitment, anxiety--depression, resilience, worry, physical symptoms, and exhaustion. The stressor scales cover pressure from workload, relationships, career development, managerial responsibility, personal responsibility, home demands, and daily hassles. The moderator variables measure drive, impatience, control, decision latitude, and the coping strategies of problem focus, life work balance, and social support.
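The PMI's scale structure, as listed in the abstract, lends itself to a simple grouped representation. The sketch below organizes the named scales in Python; the grouping labels follow the abstract, but the dictionary layout and any counts derived from it are illustrative, not the instrument's published item structure.

```python
# Grouping the PMI scales named in the abstract. The three group labels
# (outcome, stressor, moderator) come from the text; this layout is a
# convenience for tabulation, not the questionnaire's own format.
PMI_SCALES = {
    "outcome": [
        "job satisfaction", "organizational satisfaction",
        "organizational security", "organizational commitment",
        "anxiety-depression", "resilience", "worry",
        "physical symptoms", "exhaustion",
    ],
    "stressor": [
        "workload", "relationships", "career development",
        "managerial responsibility", "personal responsibility",
        "home demands", "daily hassles",
    ],
    "moderator": [
        "drive", "impatience", "control", "decision latitude",
        "problem focus", "life work balance", "social support",
    ],
}

# Count the scales per group and in total.
for group, scales in PMI_SCALES.items():
    print(group, len(scales))
print("total", sum(len(s) for s in PMI_SCALES.values()))
```

A researcher scoring the 120-item questionnaire would map item responses onto these scales before computing group-level summaries.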
SigmaCLIPSE = presentation management + NASA CLIPS + SQL
NASA Technical Reports Server (NTRS)
Weiss, Bernard P., Jr.
1990-01-01
SigmaCLIPSE provides an expert systems and 'intelligent' database development program for diverse systems integration environments that require support for automated reasoning and expert systems technology, presentation management, and access to 'intelligent' SQL databases. The SigmaCLIPSE technology and its integrated ability to access 4th-generation application development and decision support tools through a portable SQL interface comprise a sophisticated software development environment for solving knowledge engineering and expert systems development problems in information-intensive commercial environments -- financial services, health care, and distributed process control -- where the expert system must be extendable, a major architectural advantage of NASA CLIPS. SigmaCLIPSE is a research effort intended to test the viability of merging SQL databases with expert systems technology.
GIS based solid waste management information system for Nagpur, India.
Vijay, Ritesh; Jain, Preeti; Sharma, N; Bhattacharyya, J K; Vaidya, A N; Sohony, R A
2013-01-01
Solid waste management is one of the major problems of today's world and needs to be addressed by proper utilization of technologies and the design of an effective, flexible and structured information system. Therefore, the objective of this paper was to design and develop a GIS based solid waste management information system as a decision making and planning tool for regulatory and municipal authorities. The system integrates geo-spatial features of the city and a database of existing solid waste management. The GIS based information system provides modules for visualization, query interface, statistical analysis, report generation and database modification. It also provides modules for solid waste estimation, collection, transportation and disposal details. The information system is user-friendly, standalone and platform independent.
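The query-interface module described above can be illustrated with a minimal sketch: filtering geo-tagged collection points by attribute. Everything in the snippet below (field names, wards, fill levels, function names) is a hypothetical illustration, not data or code from the Nagpur system.

```python
# Hypothetical sketch of an attribute query over geo-tagged waste bins,
# of the kind a GIS query-interface module supports. All records and
# field names are invented for illustration.
bins = [
    {"id": 1, "ward": "North", "capacity_kg": 500, "fill_pct": 85},
    {"id": 2, "ward": "North", "capacity_kg": 300, "fill_pct": 40},
    {"id": 3, "ward": "South", "capacity_kg": 500, "fill_pct": 90},
]

def bins_needing_collection(records, ward=None, threshold=80):
    """Return bins at or above a fill threshold, optionally by ward."""
    return [
        b for b in records
        if b["fill_pct"] >= threshold and (ward is None or b["ward"] == ward)
    ]

# City-wide query, then a ward-restricted one.
print([b["id"] for b in bins_needing_collection(bins)])
print([b["id"] for b in bins_needing_collection(bins, ward="South")])
```

In a full GIS implementation the same query would also return geometry, so results could be rendered on the city map or fed to the transportation-planning module.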
Electronic publishing and information handling: Plenty of roses, but also some thorns
NASA Astrophysics Data System (ADS)
Heck, André
The current dramatic evolution in information technology is bringing major modifications in the way scientists communicate. The concept of 'electronic publishing' is too restrictive and has often different, sometimes conflicting, interpretations. It is giving way to the broader notion of 'electronic information handling' encompassing the diverse types of information, the different media, as well as the various communication methodologies and technologies. New problems and challenges result also from this new information culture, especially on legal, ethical, and educational grounds. The procedures for validating 'published material' and for evaluating scientific activities will have to be adjusted too. 'Fluid' information is becoming an omnipresent reality. Electronic publishing cannot be conceived without link to knowledge bases and information resources, nor without intelligent information retrieval tools.
Fazeli, Mohammad Sadegh; Keramati, Mohammad Reza
2015-01-01
Rectal cancer is the second most common cancer of the large intestine. Its prevalence and the number of young patients diagnosed with it have made rectal cancer one of the major health problems in the world. With improved access to and use of modern screening tools, a number of new cases are diagnosed each year. Considering the location of the rectum and its adjacent organs, management and treatment of rectal tumors differ from those of tumors located in other parts of the gastrointestinal tract or even the colon. In this article, we review the current updates on rectal cancer including epidemiology, risk factors, clinical presentations, screening, and staging. Diagnostic methods and the latest treatment modalities and approaches are also discussed in detail. PMID:26034724
A Comparative Study of Involvement and Motivation among Casino Gamblers
Lee, Choong-Ki; Lee, BongKoo; Bernhard, Bo Jason
2009-01-01
Objective The purpose of this paper is to investigate three different types of gamblers (which we label "non-problem", "some problem", and "probable pathological gamblers") to determine differences in involvement and motivation, as well as differences in demographic and behavioral variables. Methods The analysis takes advantage of a unique opportunity to sample on-site at a major casino in South Korea, and the resulting purposive sample yielded 180 completed questionnaires in each of the three groups, for a total number of 540. Factor analysis, analysis of variance (ANOVA) and Duncan tests, and Chi-square tests are employed to analyze the data collected from the survey. Results Findings from ANOVA tests indicate that involvement factors of importance/self-expression, pleasure/interest, and centrality derived from the factor analysis were significantly different among these three types of gamblers. The "probable pathological" and "some problem" gamblers were found to have similar degrees of involvement, and higher degrees of involvement than the non-problem gamblers. The tests also reveal that motivational factors of escape, socialization, winning, and exploring scenery were significantly different among these three types of gamblers. When looking at motivations to visit the casino, "probable pathological" gamblers were more likely to seek winning, the "some problem" group appeared to be more likely to seek escape, and the "non-problem" gamblers indicate that their motivations to visit centered around explorations of scenery and culture in the surrounding casino area. Conclusion The tools for exploring motivations and involvements of gambling provide valuable and discerning information about the entire spectrum of gamblers. PMID:20046388