Basic Reference Tools for Nursing Research. A Workbook with Explanations and Examples.
ERIC Educational Resources Information Center
Smalley, Topsy N.
This workbook is designed to introduce nursing students to basic concepts and skills needed for searching the literatures of medicine, nursing, and allied health areas for materials relevant to specific information needs. The workbook introduces the following research tools: (1) the National Library of Medicine's MEDLINE searches, including a…
Assessment of Department of Defense Basic Research
2005-01-01
National Academy of Sciences, National Academy of Engineering, Institute of Medicine, and National Research Council. Committee on Department of Defense Basic Research. Available from the National Academies Press: http://www.nap.edu/catalog/11177.html
Roadsigns from Research. BASICS: Bridging Vocational and Academic Skills.
ERIC Educational Resources Information Center
Sechler, Judith A.; Crowe, Michael R.
This document responds to the need for integration of basic skills into vocational education by providing a summary of research findings, implications, and practical suggestions for teachers. The six sections and four complementary posters are intended as tools for staff development of teachers engaged in teaching basic skills. Sections can also…
Cruising through Research: Library Skills for Young Adults.
ERIC Educational Resources Information Center
Volkman, John D.
This book presents an approach for school librarians to use to introduce basic research tools to students in grades 7-12. Twelve "Excursions" (i.e., library research projects) are described. Excursions 1 and 2 provide an introduction to reference books, and Excursions 3 and 4 explore note-taking and basic organization of research papers. The…
Treating for Two: Medicine and Pregnancy
CDC resource on medicine use during pregnancy, with research findings by health condition and multimedia materials (podcasts, videos, infographics, posters, and fact sheets).
Tools for computer graphics applications
NASA Technical Reports Server (NTRS)
Phillips, R. L.
1976-01-01
Extensive research in computer graphics has produced a collection of basic algorithms and procedures whose utility spans many disciplines. These tools are described in terms of their fundamental aspects, implementations, applications, and availability. Programs which are discussed include basic data plotting, curve smoothing, and depiction of three dimensional surfaces. As an aid to potential users of these tools, particular attention is given to discussing their availability and, where applicable, their cost.
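The curve-smoothing utilities surveyed above can be pictured with a minimal sketch. The moving-average filter below is an illustrative assumption about one common smoothing approach, not a reconstruction of the surveyed programs; the function name and window parameter are hypothetical:

```python
def smooth(points, window=3):
    """Moving-average curve smoothing: each output value is the mean
    of the input values inside a window centered on that point,
    clipped at the ends of the series."""
    half = window // 2
    out = []
    for i in range(len(points)):
        lo = max(0, i - half)
        hi = min(len(points), i + half + 1)
        out.append(sum(points[lo:hi]) / (hi - lo))
    return out

data = [1.0, 4.0, 2.0, 8.0, 3.0]
print(smooth(data))
```

Larger windows trade fidelity to the data for a smoother curve, the basic design choice in any such plotting utility.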
Climate Action Planning Tool | NREL
NREL's Climate Action Planning Tool provides a quick, basic estimate of how various technology options can contribute to an overall climate action plan for your research campus. Use the tool to
2017-01-01
The founding years of Operations Research (OR) are the years just before and during the Second World War. The table of contents includes sections on Operations Research, subfields of Operations Research (Teilgebiete des Operations Research), and models of Operations Research (Modelle des Operations Research).
Are Online Quizzes an Effective Tool for Mastering Basic Algebra?
ERIC Educational Resources Information Center
Read, Wayne; Higgins, Patrick
2012-01-01
Online quizzes are used to help first-year university mathematics students identify weaknesses in their basic skills and improve them. Quizzes developed as a formative tool have been used at JCU [James Cook University] for eight years. However, before this research no one had questioned the effectiveness of quizzes for this task. We present a…
Integrated Measurements and Characterization | Photovoltaic Research | NREL
NREL's Integrated Measurements and Characterization cluster tool offers powerful capabilities through its integrated tools. Basic cluster tool capabilities include sample handling: through ultra-high-vacuum connections, samples can be interchanged between tools, such as the Copper Indium Gallium Diselenide cluster tool.
SPARSKIT: A basic tool kit for sparse matrix computations
NASA Technical Reports Server (NTRS)
Saad, Youcef
1990-01-01
Presented here are the main features of a tool package for manipulating and working with sparse matrices. One of the goals of the package is to provide basic tools to facilitate the exchange of software and data between researchers in sparse matrix computations. The starting point is the Harwell/Boeing collection of matrices for which the authors provide a number of tools. Among other things, the package provides programs for converting data structures, printing simple statistics on a matrix, plotting a matrix profile, and performing linear algebra operations with sparse matrices.
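SPARSKIT itself is a Fortran package; the sketch below merely illustrates, in Python, the kind of data-structure conversion it provides, here from coordinate (COO) triplets to compressed sparse row (CSR) arrays. All names are hypothetical:

```python
def coo_to_csr(n_rows, rows, cols, vals):
    """Convert COO triplets to CSR arrays (indptr, indices, data).
    indptr[r]..indptr[r+1] delimits row r's entries."""
    counts = [0] * n_rows          # nonzeros per row
    for r in rows:
        counts[r] += 1
    indptr = [0]
    for c in counts:               # cumulative row pointers
        indptr.append(indptr[-1] + c)
    indices = [0] * len(vals)
    data = [0.0] * len(vals)
    next_slot = indptr[:-1][:]     # next free slot in each row
    for r, c, v in zip(rows, cols, vals):
        k = next_slot[r]
        indices[k] = c
        data[k] = v
        next_slot[r] += 1
    return indptr, indices, data

# 3x3 matrix with entries (0,0)=1, (1,2)=2, (2,1)=3, (0,2)=4
indptr, indices, data = coo_to_csr(3, [0, 1, 2, 0], [0, 2, 1, 2],
                                   [1.0, 2.0, 3.0, 4.0])
print(indptr)  # → [0, 2, 3, 4]
```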
SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel
Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari
2009-01-01
Background: Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer facile access to the results, which may require conversions between data formats. First-hand SNP data is often entered or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool for basic genetic and epidemiological analysis and data conversion in MS-Excel is needed. Findings: The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and basic statistical analysis requirements). Conclusion: Our implementation for Microsoft Excel 2000-2007 on Microsoft Windows 2000, XP, Vista, and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software. PMID:19852806
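SNP_tools is a VBA add-in; as a rough illustration of the kind of basic genetic conversion such a tool performs, the sketch below turns two-letter genotype strings into reference-allele counts. The data format and function name are illustrative assumptions, not the add-in's actual interface:

```python
def genotype_to_counts(genotypes, ref_allele):
    """Count copies of the reference allele in each two-letter
    genotype string (e.g. 'AG'), the additive coding commonly
    used in genetic association analysis."""
    return [g.upper().count(ref_allele.upper()) for g in genotypes]

print(genotype_to_counts(["AA", "AG", "GG"], "A"))  # → [2, 1, 0]
```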
Ma, Bin; Xu, Jia-Ke; Wu, Wen-Jing; Liu, Hong-Yan; Kou, Cheng-Kun; Liu, Na; Zhao, Lulu
2017-01-01
To investigate the awareness and use of the Systematic Review Center for Laboratory Animal Experimentation (SYRCLE) risk-of-bias tool, the Animal Research: Reporting of In Vivo Experiments (ARRIVE) reporting guidelines, and the Gold Standard Publication Checklist (GSPC) among basic medical researchers conducting animal experimental studies in China. A national questionnaire-based survey targeting basic medical researchers was carried out in China to investigate basic information and awareness of SYRCLE's risk-of-bias tool, the ARRIVE guidelines, the GSPC, and animal experimental bias risk control factors. The EpiData 3.1 software was used for data entry, and Microsoft Excel 2013 was used for statistical analysis. The number of cases (n) and percentage (%) of classified information were statistically described, and comparison between groups (i.e., current students vs. research staff) was performed using the chi-square test. A total of 298 questionnaires were distributed and 272 responses were received, which included 266 valid questionnaires (from 118 current students and 148 research staff). Among the 266 survey participants, only 15.8% were aware of SYRCLE's risk-of-bias tool, with a significant difference between the two groups (P = 0.003), and the awareness rates of the ARRIVE guidelines and the GSPC were only 9.4% and 9.0%, respectively; 58.6% of survey participants believed that the reports of animal experimental studies in Chinese literature were inadequate, with a significant difference between the two groups (P = 0.004). In addition, only approximately one third of the survey participants had read systematic reviews and meta-analysis reports of animal experimental studies; only 16/266 (6.0%) had carried out or participated in, and 11/266 (4.1%) had published, systematic reviews/meta-analyses of animal experimental studies. The awareness and use rates of SYRCLE's risk-of-bias tool, the ARRIVE guidelines, and the GSPC were low among Chinese basic medical researchers. Therefore, specific measures are necessary to promote and popularize these standards, and to introduce them into the guidelines of Chinese domestic journals as soon as possible, so as to raise awareness and use among researchers and journal editors, thereby improving the quality of animal experimental methods and reports.
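The group comparison reported above used a chi-square test. As a sketch of that calculation, the function below computes the Pearson chi-square statistic for a 2x2 contingency table; the counts in the example are hypothetical, not the survey's actual cell counts:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table
    [[a, b], [c, d]], via the shortcut formula
    n*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# hypothetical aware/unaware counts for two groups of respondents
stat = chi_square_2x2(10, 108, 32, 116)
print(round(stat, 2))
```

The statistic would then be compared against the chi-square distribution with one degree of freedom to obtain a P value.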
ERIC Educational Resources Information Center
Cassel, Russell N.
This paper relates educational and psychological statistics to certain "Research Statistical Tools" (RSTs) necessary to accomplish and understand general research in the behavioral sciences. Emphasis is placed on acquiring an effective understanding of the RSTs, and to this end they are ordered on a continuum scale in terms of individual…
Ballabeni, Andrea; Boggio, Andrea; Hemenway, David
2014-01-01
Basic research in the biomedical field generates both knowledge that has a value per se regardless of its possible practical outcome and knowledge that has the potential to produce more practical benefits. Policies can increase the benefit potential to society of basic biomedical research by offering various kinds of incentives to basic researchers. In this paper we argue that soft incentives or “nudges” are particularly promising. However, to be well designed, these incentives must take into account the motivations, goals and views of the basic scientists. In the paper we present the results of an investigation that involved more than 300 scientists at Harvard Medical School and affiliated institutes. The results of this study suggest that some soft incentives could be valuable tools to increase the transformative value of fundamental investigations without affecting the spirit of the basic research and scientists’ work satisfaction. After discussing the findings, we discuss a few examples of nudges for basic researchers in the biomedical fields. PMID:24795807
Monnier, Stéphanie; Cox, David G; Albion, Tim; Canzian, Federico
2005-01-01
Background: Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research. The TaqMan technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results: We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, and it allows anyone to access these tools without any programming skills and with basic computer requirements. Conclusion: We have created useful tools focused on the management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system, such as a LIMS. PMID:16221298
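The package's file-parsing step can be pictured with a short sketch. The tab-separated column layout below is hypothetical (real TaqMan exports vary by instrument and software version), and the original tools are Visual Basic rather than Python:

```python
import csv
from io import StringIO

def parse_results(text):
    """Parse a tab-separated genotyping export into a
    sample -> genotype-call mapping. Column names are
    illustrative assumptions, not a real instrument format."""
    reader = csv.DictReader(StringIO(text), delimiter="\t")
    return {row["Sample"]: row["Call"] for row in reader}

demo = "Well\tSample\tCall\nA1\tS001\tAA\nA2\tS002\tAG\n"
print(parse_results(demo))  # → {'S001': 'AA', 'S002': 'AG'}
```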
Tissue fluid pressures - From basic research tools to clinical applications
NASA Technical Reports Server (NTRS)
Hargens, Alan R.; Akeson, Wayne H.; Mubarak, Scott J.; Owen, Charles A.; Gershuni, David H.
1989-01-01
This paper describes clinical applications of two basic research tools developed and refined in the past 20 years: the wick catheter (for measuring tissue fluid pressure) and the colloid osmometer (for measuring osmotic pressure). Applications of the osmometer include estimations of the reduced osmotic pressure of sickle-cell hemoglobin with deoxygenation, and of reduced swelling pressure of human nucleus pulposus with hydration or upon action of certain enzymes. Clinical uses of the wick-catheter technique include an improvement of diagnosis and treatment of acute and chronic compartment syndromes, the elucidation of the tissue pressure thresholds for neuromuscular dysfunction, and the development of a better tourniquet for orthopedics.
Many worlds, one ethic: design and development of a global research ethics training curriculum.
Rivera, Roberto; Borasky, David; Rice, Robert; Carayon, Florence
2005-05-01
The demand for basic research ethics training has grown considerably in the past few years. Research and education organizations face the challenge of providing this training with limited resources and training tools available. To meet this need, Family Health International (FHI), a U.S.-based international research organization, recently developed a Research Ethics Training Curriculum (RETC). It was designed as a practical, user-friendly tool that provides basic, up-to-date, standardized training on the ethics of human research. The curriculum can easily be adapted to different audiences and training requirements. The RETC was reviewed by a group of international experts and field tested in five countries. It is available in English, French, and Spanish as a three-ring binder and CD-ROM, as well as on the Web. It may be used as either an interactive self-study program or for group training.
Basics and applications of genome editing technology.
Yamamoto, Takashi; Sakamoto, Naoaki
2016-01-01
Genome editing with programmable site-specific nucleases is an emerging technology that enables the manipulation of targeted genes in many organisms and cell lines. Since the development of the CRISPR-Cas9 system in 2012, genome editing has rapidly become an indispensable technology for all life science researchers, applicable in various fields. In this seminar, we will introduce the basics of genome editing and focus on the recent development of genome editing tools and technologies for the modification of various organisms and discuss future directions of the genome editing research field, from basic to medical applications.
Open Source for Knowledge and Learning Management: Strategies beyond Tools
ERIC Educational Resources Information Center
Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.
2007-01-01
In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies.…
Microfluidic tools for cell biological research
Velve-Casquillas, Guilhem; Le Berre, Maël; Piel, Matthieu; Tran, Phong T.
2010-01-01
Summary Microfluidic technology is creating powerful tools for cell biologists to control the complete cellular microenvironment, leading to new questions and new discoveries. We review here the basic concepts and methodologies in designing microfluidic devices, and their diverse cell biological applications. PMID:21152269
NASA Technical Reports Server (NTRS)
Tahmasebi, Farhad; Pearce, Robert
2016-01-01
A tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is described. For efficiency and speed, the tool takes advantage of a function developed in Excel's Visual Basic for Applications. The strategic planning process for determining the community Outcomes is also briefly discussed. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples of using the tool are also presented.
Potential Tools for Phenotyping for Physical Characteristics of Plants, Pods, and Seed
USDA-ARS?s Scientific Manuscript database
Advances in phenotyping are a key factor for success in modern breeding as well as for basic plant research. Phenotyping provides a critical means to understand morphological, biochemical, physiological principles in the control of basic plant functions as well as for selecting superior genotypes in...
Nucleic acids-based tools for ballast water surveillance, monitoring, and research
Understanding the risks of biological invasion posed by ballast water—whether in the context of compliance testing, routine monitoring, or basic research—is fundamentally an exercise in biodiversity assessment, and as such should take advantage of the best tools avail...
Genome elimination: translating basic research into a future tool for plant breeding.
Comai, Luca
2014-06-01
During the course of our history, humankind has been through different periods of agricultural improvement aimed at enhancing our food supply and the performance of food crops. In recent years, it has become apparent that future crop improvement efforts will require new approaches to address the local challenges of farmers while empowering discovery across industry and academia. New plant breeding approaches are needed to meet this challenge to help feed a growing world population. Here I discuss how a basic research discovery is being translated into a potential future tool for plant breeding, and share the story of researcher Simon Chan, who recognized the potential application of this new approach--genome elimination--for the breeding of staple food crops in Africa and South America.
Current, Short Term, Future and Star Wars Research Projects for Ornamental Crops
USDA-ARS?s Scientific Manuscript database
The USDA-ARS Greenhouse Production Research Group is involved in fundamental and developmental plant research aimed at developing tools for early stress detection and efficient agrochemical utilization for protected horticulture crops. The group conducts basic plant biology research with the goal o...
Glycan Arrays: From Basic Biochemical Research to Bioanalytical and Biomedical Applications
NASA Astrophysics Data System (ADS)
Geissner, Andreas; Seeberger, Peter H.
2016-06-01
A major branch of glycobiology and glycan-focused biomedicine studies the interaction between carbohydrates and other biopolymers, most importantly, glycan-binding proteins. Today, this research into glycan-biopolymer interaction is unthinkable without glycan arrays, tools that enable high-throughput analysis of carbohydrate interaction partners. Glycan arrays offer many applications in basic biochemical research, for example, defining the specificity of glycosyltransferases and lectins such as immune receptors. Biomedical applications include the characterization and surveillance of influenza strains, identification of biomarkers for cancer and infection, and profiling of immune responses to vaccines. Here, we review major applications of glycan arrays both in basic and applied research. Given the dynamic nature of this rapidly developing field, we focus on recent findings.
Addiction Studies with Positron Emission Tomography
Joanna Fowler
2017-12-09
Brookhaven scientist Joanna Fowler describes Positron Emission Tomography (PET) research at BNL, which for the past 30 years has focused on the integration of basic research in radiotracer chemistry with the tools of neuroscience to develop new scientific
Nuclear medicine and imaging research (Instrumentation and quantitative methods of evaluation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1989-09-01
This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility.
NASA Technical Reports Server (NTRS)
Strauss, Alvin M.; Peterson, Steven W.; Main, John A.; Dickenson, Rueben D.; Shields, Bobby L.; Lorenz, Christine H.
1992-01-01
The goal of the basic research portion of the extravehicular activity (EVA) glove research program is to gain a greater understanding of the kinematics of the hand, the characteristics of the pressurized EVA glove, and the interaction of the two. Examination of the literature showed that there existed no acceptable, non-invasive method of obtaining accurate biomechanical data on the hand. For this reason a project was initiated to develop magnetic resonance imaging as a tool for biomechanical data acquisition and visualization. Literature reviews also revealed a lack of practical modeling methods for fabric structures, so a basic science research program was also initiated in this area.
Reinventing Biostatistics Education for Basic Scientists
Weissgerber, Tracey L.; Garovic, Vesna D.; Milin-Lazovic, Jelena S.; Winham, Stacey J.; Obradovic, Zoran; Trzeciakowski, Jerome P.; Milic, Natasa M.
2016-01-01
Numerous studies demonstrating that statistical errors are common in basic science publications have led to calls to improve statistical training for basic scientists. In this article, we sought to evaluate statistical requirements for PhD training and to identify opportunities for improving biostatistics education in the basic sciences. We provide recommendations for improving statistics training for basic biomedical scientists, including: 1. Encouraging departments to require statistics training, 2. Tailoring coursework to the students’ fields of research, and 3. Developing tools and strategies to promote education and dissemination of statistical knowledge. We also provide a list of statistical considerations that should be addressed in statistics education for basic scientists. PMID:27058055
Facts about Congenital Heart Defects
CDC resource providing basics, data and statistics, research findings, and multimedia materials about congenital heart defects.
ERIC Educational Resources Information Center
University of Southwestern Louisiana, Lafayette.
A student who plans to enter the field of technology education must be especially motivated to incorporate computer technology into the theories of learning. Evaluation prior to the learning process establishes a frame of reference for students. After preparing students with the basic concepts of resistors and the mental tools, the expert system…
[The grounded theory as a methodological alternative for nursing research].
dos Santos, Sérgio Ribeiro; da Nóbrega, Maria Miriam
2002-01-01
This study presents an interpretative and systematic research method applicable to the development of nursing studies called "the grounded theory", whose theoretical support is symbolic interactionism. The purpose of the paper is to describe grounded theory as an alternative methodology for the construction of knowledge in nursing. The study highlights four topics: the basic principle, the basic concepts, the trajectory of the method, and the process of data analysis. We conclude that the systematization of data and its interpretation, based on the experience of social actors, constitute a strong basis for generating theories through this research tool.
Authors’ response: what are emotions and how are they created in the brain?
Lindquist, Kristen A; Wager, Tor D; Bliss-Moreau, Eliza; Kober, Hedy; Barrett, Lisa Feldman
2012-06-01
In our response, we clarify important theoretical differences between basic emotion and psychological construction approaches. We evaluate the empirical status of the basic emotion approach, addressing whether it requires brain localization, whether localization can be observed with better analytic tools, and whether evidence for basic emotions exists in other types of measures. We then revisit the issue of whether the key hypotheses of psychological construction are supported by our meta-analytic findings. We close by elaborating on commentator suggestions for future research.
Basic Numeracy Abilities of Xhosa Reception Year Students in South Africa: Language Policy Issues
ERIC Educational Resources Information Center
Feza, Nosisi Nellie
2016-01-01
Language in mathematics learning and teaching has a significant role in influencing performance. Literature on language in mathematics learning has evolved from language as a barrier to language as a cultural tool, and recently more research has argued for use of home language as an instructional tool in mathematics classrooms. However, the…
Visualization: A Tool for Enhancing Students' Concept Images of Basic Object-Oriented Concepts
ERIC Educational Resources Information Center
Cetin, Ibrahim
2013-01-01
The purpose of this study was twofold: to investigate students' concept images about class, object, and their relationship and to help them enhance their learning of these notions with a visualization tool. Fifty-six second-year university students participated in the study. To investigate his/her concept images, the researcher developed a survey…
School Bonding in Early Adolescence: Psychometrics of the Brief Survey of School Bonding
ERIC Educational Resources Information Center
Whiteside-Mansell, Leanne; Weber, Judith L.; Moore, Page C.; Johnson, Danya; Williams, Ed R.; Ward, Wendy L.; Robbins, James M.; Phillips, B. Allyson
2015-01-01
The comprehensive assessment of middle school student bonding is important for basic research and to evaluate interventions. In this study, the psychometric properties of three assessment tools found in the literature were examined individually and then combined to create a shorter survey. With minor modifications, the tools were found to be…
Listening to Community Voices: Community-Based Research, a First Step in Partnership and Outreach
ERIC Educational Resources Information Center
Heffner, Gail Gunst; Zandee, Gail Landheer; Schwander, Lissa
2003-01-01
This paper offers some historical perspective on alternative research traditions and discusses some of the basic principles of community-based research as a tool for partnership development. The authors then describe an example of how Calvin College, a Christian comprehensive liberal arts college has used a multi-disciplinary approach in…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1988-06-01
This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 58 refs., 15 figs., 4 tabs.
Optical Imaging and Control of Neurons
NASA Astrophysics Data System (ADS)
Song, Yoon-Kyu
Although remarkable progress has been made in our understanding of the function, organization, and development of the brain through various approaches of modern science and technology, how the brain performs its marvelous functions remains unsolved or incompletely understood. This is mainly attributed to the insufficient capability of currently available research tools and conceptual frameworks to deal with the enormous complexity of the brain. Hence, in the last couple of decades, a significant effort has been made to crack the complexity of the brain by utilizing research tools from diverse scientific areas. These research tools include optical neurotechnology, which incorporates the exquisite characteristics of optics, such as multi-parallel access and non-invasiveness, in sensing and stimulating the excitable membrane of a neuron, the basic functional unit of the brain. This chapter aims to serve as a short introduction to optical neurotechnology for those who wish to use optical techniques as one of their brain research tools.
Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool
NASA Astrophysics Data System (ADS)
Torlapati, Jagadish; Prabhakar Clement, T.
2013-01-01
We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
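RT1D itself is a VBA code; as a minimal sketch of the transport core such a tool solves, the Python fragment below advances a 1D advection-dispersion equation with an explicit upwind scheme (RT1D additionally couples biochemical and geochemical reaction terms). The parameter values are illustrative only:

```python
def advect_disperse(c, v, D, dx, dt, steps):
    """Explicit finite-difference march for 1D advection-dispersion,
    dc/dt = D d2c/dx2 - v dc/dx, holding both boundary cells fixed.
    Stability requires small enough dt (v*dt/dx <= 1 and
    D*dt/dx**2 <= 0.5)."""
    c = list(c)
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            diff = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
            adv = -v * (c[i] - c[i - 1]) / dx  # first-order upwind
            new[i] = c[i] + dt * (diff + adv)
        c = new
    return c

# unit concentration held at the inlet, clean column downstream
profile = advect_disperse([1.0] + [0.0] * 9, v=1.0, D=0.01,
                          dx=1.0, dt=0.5, steps=4)
print([round(x, 3) for x in profile])
```

Benchmarking a code like RT1D amounts to comparing such numerical profiles against analytical or literature solutions of increasing reaction complexity.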
Microgravity: A New Tool for Basic and Applied Research in Space
NASA Technical Reports Server (NTRS)
1985-01-01
This brochure highlights selected aspects of the NASA Microgravity Science and Applications program. So that we can expand our understanding and control of physical processes, this program supports basic and applied research in electronic materials, metals, glasses and ceramics, biological materials, combustion, and fluids and chemicals. NASA facilities that provide weightless environments on the ground, in the air, and in space are available to U.S. and foreign investigators representing the academic and industrial communities. After a brief history of microgravity research, the text explains the advantages and methods of performing microgravity research. Illustrations follow of equipment used and experiments performed aboard the Shuttle, and of prospects for future research. The brochure concludes by describing the program goals and the opportunities for participation.
ERIC Educational Resources Information Center
Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai
2016-01-01
A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…
Fraga, Hilda Carolina de Jesus Rios; Fukutani, Kiyoshi Ferreira; Celes, Fabiana Santana; Barral, Aldina Maria Prado; Oliveira, Camila Indiani de
2012-01-01
To evaluate the process of implementing a quality management system in a basic research laboratory of a public institution, particularly considering the feasibility and impacts of this improvement. This was a prospective and qualitative study. We employed the norm "NIT DICLA 035--Princípios das Boas Práticas de Laboratório (BPL)" and auxiliary documents of the Organisation for Economic Co-operation and Development to complement the planning and implementation of a quality system in a basic research laboratory. In parallel, we used the PDCA tool to define the goals of each phase of the implementation process. This study enabled the laboratory to comply with the NIT DICLA 035 norm and to implement this norm during execution of a research study. Accordingly, documents were prepared and routines were established, such as the registration of non-conformities, traceability of research data, and equipment calibration. The implementation of a quality system in the setting of a laboratory focused on basic research is feasible once certain structural changes are made. Importantly, impacts were noticed during the process, which could be related to several improvements in the laboratory routine.
The Lake Tahoe Basin Land Use Simulation Model
Forney, William M.; Oldham, I. Benson
2011-01-01
This U.S. Geological Survey Open-File Report describes the final modeling product for the Tahoe Decision Support System project for the Lake Tahoe Basin funded by the Southern Nevada Public Land Management Act and the U.S. Geological Survey's Geographic Analysis and Monitoring Program. This research was conducted by the U.S. Geological Survey Western Geographic Science Center. The purpose of this report is to describe the basic elements of the novel Lake Tahoe Basin Land Use Simulation Model, publish samples of the data inputs, basic outputs of the model, and the details of the Python code. The results of this report include a basic description of the Land Use Simulation Model, descriptions and summary statistics of model inputs, two figures showing the graphical user interface from the web-based tool, samples of the two input files, seven tables of basic output results from the web-based tool and descriptions of their parameters, and the fully functional Python code.
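The report's Python code is not reproduced in this summary. As a hedged sketch of one basic mechanism that land-use simulation models commonly implement, the fragment below advances parcel counts in three classes using an annual transition-probability matrix; the classes and probabilities are illustrative assumptions, not values from the Lake Tahoe model.

```python
# Illustrative parcel counts: [forest, residential, commercial].
state = [800.0, 150.0, 50.0]

# Hypothetical annual transition probabilities; entry T[i][j] is the
# fraction of class-i parcels that become class j each year.
T = [
    [0.98, 0.015, 0.005],  # forest mostly persists
    [0.00, 0.97,  0.03],   # some residential converts to commercial
    [0.00, 0.00,  1.00],   # commercial is treated as absorbing
]

def step(state, T):
    """One simulated year: redistribute parcel counts according to T."""
    n = len(state)
    return [sum(state[i] * T[i][j] for i in range(n)) for j in range(n)]

for _ in range(10):  # simulate a decade
    state = step(state, T)

print(state)  # totals are conserved because each row of T sums to 1
```

A real model would condition these probabilities on spatial constraints and policy scenarios; the update loop above is only the core bookkeeping.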
Wu, Huey-Min; Li, Cheng-Hsaun; Kuo, Bor-Chen; Yang, Yu-Mao; Lin, Chin-Kai; Wan, Wei-Hsiang
2017-08-01
Morphological awareness is the foundation for the important developmental skills involved with vocabulary, as well as understanding the meaning of words, orthographic knowledge, reading, and writing. Visual perception of the spatial arrangement of radicals in the two-dimensional structure of Chinese characters is very important in identifying Chinese characters. This research investigated the important predictive variables of spatial and visual perception in Chinese character identification using a growth model. The assessment tool is the "Computerized Visual Perception Assessment Tool for Chinese Characters Structures" developed by this study. It has two constructs, basic stroke and character structure. The basic stroke construct has three subtests covering one, two, and more than three strokes. The character structure construct has three subtests covering single-component characters, horizontal-compound characters, and vertical-compound characters. This study used purposive sampling. In the first year, 551 children 4-6 years old participated in the study and were monitored for one year. In the second year, 388 children remained in the study, a successful follow-up rate of 70.4%. This study used a two-wave cross-lagged panel design to validate the growth model of the basic stroke and the character structure. There was a significant correlation between the basic stroke and the character structure at different time points. Abilities in the basic stroke and the character structure developed steadily over time for preschool children. Children's knowledge of the basic stroke effectively predicted their later knowledge of the basic stroke and the character structure. Copyright © 2017 Elsevier Ltd. All rights reserved.
Spec Tool: an online education and research resource
NASA Astrophysics Data System (ADS)
Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.
2016-06-01
Education and public outreach (EPO) activities related to remote sensing, space, planetary, and geophysics sciences have been developed widely at the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geoscientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students, and educators. Because suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs, and it enables a better understanding of the theory of spectroscopy and imaging spectroscopy in a 'hands-on' activity. The tool is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping, and linear unmixing. It enables users to visualize spectral signatures from the USGS spectral library as well as additional spectra collected in the EPIF, such as spectra of dunes in southern Israel and from Turkmenistan. For researchers and educators, the tool allows loading locally collected samples for further analysis.
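Spectral angle mapping, one of the algorithms listed, scores the match between a pixel spectrum and a reference spectrum by the angle between them as vectors, which makes the match insensitive to overall brightness. A minimal Python sketch, using illustrative spectra rather than actual USGS library entries:

```python
import math

def spectral_angle(a, b):
    """Angle (radians) between two spectra; invariant to overall brightness."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    # Clamp guards against floating-point values just outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

reference = [0.2, 0.4, 0.6, 0.8]  # illustrative library spectrum
bright    = [0.4, 0.8, 1.2, 1.6]  # same material, twice as bright
other     = [0.8, 0.6, 0.4, 0.2]  # different spectral shape

print(spectral_angle(reference, bright))       # ~0.0: scaling leaves the angle unchanged
print(spectral_angle(reference, other) > 0.3)  # True: distinct materials
```

In practice each pixel would be compared against every library spectrum and assigned the class with the smallest angle.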
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooper, M.D.; Beck, R.N.
1988-06-01
This document describes several years of research to improve PET imaging and diagnostic techniques in man. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. The reports in the study were processed separately for the data bases. (TEM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1990-09-01
This report summarizes goals and accomplishments of the research program supported under DOE Grant No. FG02-86ER60418 entitled Instrumentation and Quantitative Methods of Evaluation, with R. Beck, P.I. and M. Cooper, Co-P.I., during the period January 15, 1990 through September 1, 1990. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 7 figs.
Bajorath, Jurgen
2012-01-01
We have generated a number of compound data sets and programs for different types of applications in pharmaceutical research. These data sets and programs were originally designed for our research projects and are made publicly available. Without consulting original literature sources, it is difficult to understand specific features of data sets and software tools, basic ideas underlying their design, and applicability domains. Currently, 30 different entries are available for download from our website. In this data article, we provide an overview of the data and tools we make available and designate the areas of research for which they should be useful. For selected data sets and methods/programs, detailed descriptions are given. This article should help interested readers to select data and tools for specific computational investigations. PMID:24358818
The Individual Basic Facts Assessment Tool
ERIC Educational Resources Information Center
Tait-McCutcheon, Sandi; Drake, Michael
2015-01-01
There is an identified and growing need for a levelled diagnostic basic facts assessment tool that provides teachers with formative information about students' mastery of a broad range of basic fact sets. The Individual Basic Facts Assessment tool has been iteratively and cumulatively developed, trialled, and refined with input from teachers and…
Cancer Imaging Phenomics Toolkit (CaPTK) | Informatics Technology for Cancer Research (ITCR)
CaPTk is a tool that facilitates translation of highly sophisticated methods that help us gain a comprehensive understanding of the underlying mechanisms of cancer from medical imaging research to the clinic. It replicates basic interactive functionalities of radiological workstations and is distributed under a BSD-style license.
Southwestern Native American Studies: A Selected Bibliography.
ERIC Educational Resources Information Center
Stabler, Karen, Comp.
Conducting research in the field of Native American studies requires the use of many different materials in the library. This guide provides a bibliography of useful tools as well as a basic strategy to follow when researching the topic. The types of documents listed include: dictionaries and encyclopedias, guides and handbooks, journal articles,…
Using Calibrated Peer Review to Teach Basic Research Skills
ERIC Educational Resources Information Center
Bracke, Marianne S.; Graveel, John G.
2014-01-01
Calibrated Peer Review (CPR) is an online tool used in the class Introduction to Agriculture (AGR 10100) at Purdue University to integrate a writing and research component (http://cpr.molsci.ucla.edu/Home.aspx). Calibrated Peer Review combines the ability to create writing-intensive assignments with an introduction to the peer-review…
ERIC Educational Resources Information Center
Spaziano, Vincent T.; Gibbons, Judith L.
1986-01-01
Describes an interdisciplinary course providing basic background in behavior, pharmacology, neuroanatomy, neurotransmitters, drugs, and specific brain disorders. Provides rationale, goals, and operational details. Discusses a research project as a tool to improve critical evaluation of science reporting and writing skills. (JM)
Testing basic ecological, evolutionary, and biogeographical principles using invasive species
Cynthia D. Huebner
2006-01-01
Sax et al. argue in Species Invasions: Insights into Ecology, Evolution, and Biogeography that species invasion is an ongoing experiment and a research tool with which to test fundamental tenets of ecology, evolution, and biogeography.
Systematic reviews in the field of nutrition
USDA-ARS?s Scientific Manuscript database
Systematic reviews are valuable tools for staying abreast of evolving nutrition and aging -related topics, formulating dietary guidelines, establishing nutrient reference intakes, formulating clinical practice guidance, evaluating health claims, and setting research agendas. Basic steps of conductin...
Genome Editing Redefines Precision Medicine in the Cardiovascular Field
Lahm, Harald; Dreßen, Martina; Lange, Rüdiger; Wu, Sean M.; Krane, Markus
2018-01-01
Genome editing is a powerful tool to study the function of specific genes and proteins important for development or disease. Recent technologies, especially CRISPR/Cas9 which is characterized by convenient handling and high precision, revolutionized the field of genome editing. Such tools have enormous potential for basic science as well as for regenerative medicine. Nevertheless, there are still several hurdles that have to be overcome, but patient-tailored therapies, termed precision medicine, seem to be within reach. In this review, we focus on the achievements and limitations of genome editing in the cardiovascular field. We explore different areas of cardiac research and highlight the most important developments: (1) the potential of genome editing in human pluripotent stem cells in basic research for disease modelling, drug screening, or reprogramming approaches and (2) the potential and remaining challenges of genome editing for regenerative therapies. Finally, we discuss social and ethical implications of these new technologies. PMID:29731778
Development of wavelet analysis tools for turbulence
NASA Technical Reports Server (NTRS)
Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.
1992-01-01
Presented here is the general framework and the initial results of a joint effort to derive novel research tools and easy to use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.
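The wavelet software described here is not publicly listed, but the basic wavelet property the summary alludes to, splitting a signal into coarse averages and fine-scale details with no loss of information, can be sketched in a few lines of Python using one level of the Haar transform (the signal values are illustrative):

```python
import math

def haar_step(signal):
    """One level of the Haar wavelet transform: averages and details."""
    s = 1.0 / math.sqrt(2.0)
    approx = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Invert one Haar level; together with haar_step this is lossless."""
    s = 1.0 / math.sqrt(2.0)
    out = []
    for ap, de in zip(approx, detail):
        out += [s * (ap + de), s * (ap - de)]
    return out

x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
a, d = haar_step(x)
print([round(v, 6) for v in haar_inverse(a, d)])  # recovers x up to rounding
```

Repeating haar_step on the approximation coefficients yields the usual multilevel decomposition used when analyzing scale-dependent structure in turbulence data.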
Algorithm Building and Learning Programming Languages Using a New Educational Paradigm
NASA Astrophysics Data System (ADS)
Jain, Anshul K.; Singhal, Manik; Gupta, Manu Sheel
2011-08-01
This research paper presents a new concept of using a single tool to associate syntax of various programming languages, algorithms and basic coding techniques. A simple framework has been programmed in Python that helps students learn skills to develop algorithms, and implement them in various programming languages. The tool provides an innovative and a unified graphical user interface for development of multimedia objects, educational games and applications. It also aids collaborative learning amongst students and teachers through an integrated mechanism based on Remote Procedure Calls. The paper also elucidates an innovative method for code generation to enable students to learn the basics of programming languages using drag-n-drop methods for image objects.
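The paper's framework itself is not available in this summary. As a hedged sketch of the single-tool idea, one abstract loop description rendered into the concrete syntax of several languages, the Python fragment below emits the same counting loop from per-language templates; the template set and language list are illustrative assumptions, not the paper's implementation.

```python
# One abstract loop description rendered into several concrete syntaxes.
LOOP_TEMPLATES = {
    "python": "for {var} in range({n}):\n    print({var})",
    "c":      "for (int {var} = 0; {var} < {n}; {var}++) {{\n    printf(\"%d\\n\", {var});\n}}",
    "basic":  "FOR {var} = 0 TO {n} - 1\n    PRINT {var}\nNEXT {var}",
}

def render_loop(language, var="i", n=5):
    """Emit the same counting loop in the requested language's syntax."""
    return LOOP_TEMPLATES[language].format(var=var, n=n)

print(render_loop("python"))
print(render_loop("c"))
```

A drag-and-drop interface like the one described would sit on top of such a renderer, letting students assemble the abstract description visually and inspect each language's output side by side.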
Using Self-Reflection To Increase Science Process Skills in the General Chemistry Laboratory
NASA Astrophysics Data System (ADS)
Veal, William R.; Taylor, Dawne; Rogers, Amy L.
2009-03-01
Self-reflection is a tool of instruction that has been used in the science classroom. Research has shown great promise in using video as a learning tool in the classroom. However, the integration of self-reflective practice using video in the general chemistry laboratory to help students develop process skills has not been done. Immediate video feedback and direct instruction were employed in a general chemistry laboratory course to improve students' mastery and understanding of basic and advanced process skills. Qualitative results and statistical analysis of quantitative data showed that self-reflection significantly helped students develop basic and advanced process skills, yet did not appear to influence their general understanding of the science content.
Web-based platform for collaborative medical imaging research
NASA Astrophysics Data System (ADS)
Rittner, Leticia; Bento, Mariana P.; Costa, André L.; Souza, Roberto M.; Machado, Rubens C.; Lotufo, Roberto A.
2015-03-01
Medical imaging research depends fundamentally on the availability of large image collections, image processing and analysis algorithms, hardware, and a multidisciplinary research team. It has to be reproducible, free of errors, fast, accessible through a large variety of devices spread around research centers, and conducted simultaneously by a multidisciplinary team. We therefore propose a collaborative research environment, named Adessowiki, in which tools and datasets are integrated and readily available on the Internet through a web browser. Moreover, processing history and all intermediate results are stored and displayed in automatically generated web pages for each object in the research project or clinical study. It requires no installation or configuration on the client side and offers centralized tools and specialized hardware resources, since processing takes place in the cloud.
Lindbo, John A; Falk, Bryce W
2017-06-01
Worldwide, plant viruses cause serious reductions in marketable crop yield and in some cases even plant death. In most cases, the most effective way to control virus diseases is through genetically controlled resistance. However, developing virus-resistant (VR) crops through traditional breeding can take many years, and in some cases is not even possible. Because of this, the demonstration of the first VR transgenic plants in 1985 generated much attention. This seminal report served as an inflection point for research in both basic and applied plant pathology, the results of which have dramatically changed both basic research and in a few cases, commercial crop production. The typical review article on this topic has focused on only basic or only applied research results stemming from this seminal discovery. This can make it difficult for the reader to appreciate the full impact of research on transgenic virus resistance, and the contributions from fundamental research that led to translational applications of this technology. In this review, we take a global view of this topic highlighting the significant changes to both basic and applied plant pathology research and commercial food production that have accumulated in the last 30 plus years. We present these milestones in the historical context of some of the scientific, economic, and environmental drivers for developing specific VR crops. The intent of this review is to provide a single document that adequately records the significant accomplishments of researchers in both basic and applied plant pathology research on this topic and how they relate to each other. We hope this review therefore serves as both an instructional tool for students new to the topic, as well as a source of conversation and discussion for how the technology of engineered virus resistance could be applied in the future.
The poverty-related neglected diseases: Why basic research matters.
Hotez, Peter J
2017-11-01
Together, malaria and the neglected tropical diseases (NTDs) kill more than 800,000 people annually, while creating long-term disability in millions more. International support for mass drug administration, bed nets, and other preventive measures has resulted in huge public health gains, while support for translational research is leading to the development of some new neglected disease drugs, diagnostics, and vaccines. However, funding for basic science research has not kept up, such that we are missing opportunities to create a more innovative pipeline of control tools for parasitic and related diseases. There is an urgent need to expand basic science approaches for neglected diseases, especially in the areas of systems biology and immunology; ecology, evolution, and mathematical biology; functional and comparative OMICs; gene editing; expanded use of model organisms; and a new single-cell combinatorial indexing RNA sequencing approach. The world's poor deserve access to innovation for neglected diseases. It should be considered a fundamental human right.
Wireless Mapping, GIS, and Learning about the Digital Divide: A Classroom Experience
ERIC Educational Resources Information Center
Giordano, Alberto; Lu, Yongmei; Anderson, Sharolyn; Fonstad, Mark
2007-01-01
The purpose of this article is to describe a capstone course in undergraduate student geographical research in which GIS and other geospatial tools were used to teach undergraduate students basic geographical principles. The course uses the "cooperative learning" pedagogical approach to address one of a number of client-supplied research projects,…
The Plant Protoplast: A Useful Tool for Plant Research and Student Instruction
ERIC Educational Resources Information Center
Wagner, George J.; And Others
1978-01-01
A plant protoplast is basically a plant cell that lacks a cell wall. This article outlines some of the ways in which protoplasts may be used to advance understanding of plant cell biology in research and student instruction. Topics include high efficiency experimental virus infection, organelle isolation, and osmotic effects. (Author/MA)
OXlearn: a new MATLAB-based simulation tool for connectionist models.
Ruh, Nicolas; Westermann, Gert
2009-11-01
OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.
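OXlearn itself is a MATLAB toolbox. As a hedged, language-independent sketch of the kind of standard connectionist model such tools set up, the Python fragment below trains a single-layer network on logical OR with a simple error-correction rule; the architecture, patterns, and learning rate are illustrative assumptions, not OXlearn's defaults.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Logical OR as a two-input, one-output training set.
patterns = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

w = [0.0, 0.0]  # connection weights
b = 0.0         # bias
lr = 0.5        # illustrative learning rate

for _ in range(2000):  # epochs of error-correction training
    for inputs, target in patterns:
        out = sigmoid(sum(wi * xi for wi, xi in zip(w, inputs)) + b)
        err = target - out
        w = [wi + lr * err * xi for wi, xi in zip(w, inputs)]
        b += lr * err

predictions = [
    round(sigmoid(sum(wi * xi for wi, xi in zip(w, inputs)) + b))
    for inputs, _ in patterns
]
print(predictions)  # [0, 1, 1, 1] once training has separated the patterns
```

A toolbox like OXlearn wraps exactly this loop, setup, training, and analysis, behind a graphical interface while keeping the underlying code inspectable.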
ERIC Educational Resources Information Center
Wilson, Paul
The numerous forms filed with the Federal Communications Commission (FCC) provide information about a variety of topics. Basic licensing information that is available concerns engineering, ownership, and equal employment opportunity. The FCC's broadcast bureau collects information about programming, the ascertainment of community needs, public…
USDA-ARS?s Scientific Manuscript database
A Visual Basic agro-climate application by climatologists at the International Center for Agricultural Research in the Dry Areas and the U.S. Department of Agriculture is described here. The database from which the application derives climate information consists of weather generator parameters der...
USDA-ARS?s Scientific Manuscript database
A Visual Basic agro-climate application developed by climatologists at the International Center for Agricultural Research in the Dry Areas and the U.S. Department of Agriculture is described here. The database from which the application derives climate information consists of weather generator param...
Introducing FNCS: Framework for Network Co-Simulation
None
2018-06-07
This video provides a basic overview of the PNNL Future Power Grid Initiative-developed Framework for Network Co-Simulation (FNCS). It discusses the increasing amounts of data coming from the power grid, and the need for a tool like FNCS that brings together data, transmission and distribution simulators. Included is a description of the FNCS architecture, and the advantages this new open source tool can bring to grid research and development efforts.
FOSS Tools for Research Infrastructures - A Success Story?
NASA Astrophysics Data System (ADS)
Stender, V.; Schroeder, M.; Wächter, J.
2015-12-01
Established initiatives and mandated organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. The basic idea behind these infrastructures is the provision of services that support scientists in searching, visualizing, and accessing data, in collaborating and exchanging information, and in publishing data and other results. The management of research data in particular is gaining more and more importance. In the geosciences, these developments have to be merged with the enhanced data management approaches of Spatial Data Infrastructures (SDI). The Centre for GeoInformationTechnology (CeGIT) at the GFZ German Research Centre for Geosciences has the objective of establishing concepts and standards of SDIs as an integral part of research infrastructure architectures. In different projects, solutions to manage research data for land and water management or environmental monitoring have been developed based on a framework consisting of Free and Open Source Software (FOSS) components. The framework provides basic components supporting the import and storage of data, discovery and visualization, as well as data documentation (metadata). In our contribution, we present our data management solutions developed in three projects, Central Asian Water (CAWa), Sustainable Management of River Oases (SuMaRiO), and Terrestrial Environmental Observatories (TERENO), in which FOSS components form the backbone of the data management platform. The multiple use and validation of these tools helped establish a standardized architectural blueprint serving as a contribution to research infrastructures. We examine the question of whether FOSS tools are really a sustainable choice and whether the increased maintenance effort is justified.
Finally, it should help answer the question of whether the use of FOSS for research infrastructures is a success story.
Lustigman, Sara; Geldhof, Peter; Grant, Warwick N; Osei-Atweneboana, Mike Y; Sripa, Banchob; Basáñez, María-Gloria
2012-01-01
Successful and sustainable intervention against human helminthiases depends on optimal utilisation of available control measures and development of new tools and strategies, as well as an understanding of the evolutionary implications of prolonged intervention on parasite populations and those of their hosts and vectors. This will depend largely on updated knowledge of relevant and fundamental parasite biology. There is a need, therefore, to exploit and apply new knowledge and techniques in order to make significant and novel gains in combating helminthiases and supporting the sustainability of current and successful mass drug administration (MDA) programmes. Among the fields of basic research that are likely to yield improved control tools, the Disease Reference Group on Helminth Infections (DRG4) has identified four broad areas that stand out as central to the development of the next generation of helminth control measures: 1) parasite genetics, genomics, and functional genomics; 2) parasite immunology; 3) (vertebrate) host-parasite interactions and immunopathology; and 4) (invertebrate) host-parasite interactions and transmission biology. The DRG4 was established in 2009 by the Special Programme for Research and Training in Tropical Diseases (TDR). The Group was given the mandate to undertake a comprehensive review of recent advances in helminthiases research in order to identify notable gaps and highlight priority areas. This paper summarises recent advances and discusses challenges in the investigation of the fundamental biology of those helminth parasites under the DRG4 Group's remit according to the identified priorities, and presents a research and development agenda for basic parasite research and enabling technologies that will help support control and elimination efforts against human helminthiases.
Atmospheric and Space Sciences: Ionospheres and Plasma Environments
NASA Astrophysics Data System (ADS)
Yiğit, Erdal
2018-01-01
The SpringerBriefs on Atmospheric and Space Sciences present, in two volumes, a concise and interdisciplinary introduction to the basic theory, observation, and modeling of atmospheric and ionospheric coupling processes on Earth. The goal is to contribute toward bridging the gap between meteorology, aeronomy, and planetary science. In addition, recent progress in several related research topics, such as atmospheric wave coupling and variability, is discussed. Volume 1 focuses on the atmosphere, while Volume 2 presents the ionospheres and the plasma environments. Volume 2 is aimed primarily at (research) students and young researchers who would like to gain quick insight into the basics of space sciences and current research. In combination with the first volume, it is also a useful tool for professors who would like to develop a course in atmospheric and space physics.
Visualization: a tool for enhancing students' concept images of basic object-oriented concepts
NASA Astrophysics Data System (ADS)
Cetin, Ibrahim
2013-03-01
The purpose of this study was twofold: to investigate students' concept images of class, object, and their relationship, and to help students enhance their learning of these notions with a visualization tool. Fifty-six second-year university students participated in the study. To investigate their concept images, the researcher developed a survey of open-ended questions, which was administered to the participants. Follow-up interviews with 12 randomly selected students were conducted to explore their survey answers in depth. The results of the first part of the research were used to construct visualization scenarios. The students used these scenarios to develop animations using Flash software. The study found that most of the students experienced difficulties in learning object-oriented notions. Overdependence on code-writing practice and examples, and incorrectly learned analogies, were determined to be the sources of their difficulties. Moreover, visualization was found to be a promising approach for facilitating students' concept images of basic object-oriented notions. The results of this study have implications for researchers and practitioners when designing programming instruction.
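The study's visualization scenarios were built in Flash; as a language-neutral illustration of the class-object relationship the students struggled with, here is a minimal Python example of one class (the blueprint) and two distinct objects (instances), each with its own state. The class name is, of course, purely illustrative.

```python
class Counter:
    """A class is a blueprint; each object instantiated from it has its own state."""

    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1

a = Counter()  # two distinct objects
b = Counter()  # created from one class
a.increment()
a.increment()
b.increment()

print(a.value, b.value)    # 2 1  -> instances do not share instance state
print(type(a) is type(b))  # True -> both are instances of the same class
```

The common misconception the survey probes is treating the class itself as the thing that holds data; the example shows the data lives in each instance while the class only defines the structure and behavior.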
The VIDA Framework as an Education Tool: Leveraging Volcanology Data for Educational Purposes
NASA Astrophysics Data System (ADS)
Faied, D.; Sanchez, A.
2009-04-01
While numerous global initiatives exist to address the potential hazards posed by volcanic eruption events and to assess their impacts from a civil security viewpoint, there is not yet a single, unified, international system for early warning and hazard tracking of eruptions. Numerous gaps exist in the risk reduction cycle, from data collection, to data processing, to the dissemination of salient information to relevant parties. As part of the International Space University's 2008 Space Studies Program, a detailed gap analysis of the state of volcano disaster risk reduction was undertaken, and this paper presents the principal results. The gap analysis considered current sensor technologies, data processing algorithms, and the utilization of data products by various international organizations. Recommendations for strategies to minimize or eliminate certain gaps are also provided. The effort to address these gaps produced a system-level framework. This framework, known as VIDA, is both a tool for developing user requirements for civil security in hazardous contexts and a candidate system concept for a detailed design phase. While the basic intention of VIDA is to support disaster risk reduction efforts, there are several ways to leverage its raw science data to support education across a wide demographic. Basic geophysical data could be used to educate school children about the characteristics of volcanoes, satellite mappings could support informed growth and development of societies in at-risk areas, and raw sensor data could contribute to a wide range of university-level research projects. Satellite maps, basic geophysical data, and raw sensor data are combined and made accessible in a way that allows the relationships between these data types to be explored and used in a training environment.
Such a resource naturally lends itself to research efforts in the subject but also research in operational tools, system architecture, and human/machine interaction in civil protection or emergency scenarios.
Static Verification for Code Contracts
NASA Astrophysics Data System (ADS)
Fähndrich, Manuel
The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and Visual Basic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.
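Code Contracts itself is a .NET library, but the runtime-checking idea it provides is language-neutral. As a rough illustration only, here is a minimal Python sketch of pre- and postcondition enforcement; the `contract` decorator and function names are invented for this example and are not part of Code Contracts:

```python
# Minimal runtime contract checking, loosely analogous to Code Contracts'
# Contract.Requires (precondition) and Contract.Ensures (postcondition).
def contract(requires=None, ensures=None):
    def decorate(fn):
        def wrapper(*args, **kwargs):
            if requires and not requires(*args, **kwargs):
                raise AssertionError(f"precondition violated in {fn.__name__}")
            result = fn(*args, **kwargs)
            if ensures and not ensures(result):
                raise AssertionError(f"postcondition violated in {fn.__name__}")
            return result
        return wrapper
    return decorate

@contract(requires=lambda x: x >= 0, ensures=lambda r: r >= 0)
def sqrt_floor(x):
    return int(x ** 0.5)

print(sqrt_floor(10))  # 3
```

A static verifier, by contrast, tries to discharge the same `requires`/`ensures` obligations at compile time, so violations are reported without ever running the program.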
Applied Linguistics Project: Student-Led Computer Assisted Research in High School EAL/EAP
ERIC Educational Resources Information Center
Bohát, Róbert; Rödlingová, Beata; Horáková, Nina
2015-01-01
The Applied Linguistics Project (ALP) started at the International School of Prague (ISP) in 2013. Every year, Grade 9 English as an Additional Language (EAL) students identify an area of learning in need of improvement and design a research method followed by data collection and analysis using basic computer software tools or online corpora.…
Applications of Nuclear and Particle Physics Technology: Particles & Detection — A Brief Overview
NASA Astrophysics Data System (ADS)
Weisenberger, Andrew G.
A brief overview of the technology applications with significant societal benefit that have their origins in nuclear and particle physics research is presented. It is shown through representative examples that applications of nuclear physics can be classified into two basic areas: 1) applying the results of experimental nuclear physics and 2) applying the tools of experimental nuclear physics. Examples of the application of the tools of experimental nuclear and particle physics research are provided in the fields of accelerator and detector based technologies namely synchrotron light sources, nuclear medicine, ion implantation and radiation therapy.
Laser beam machining of polycrystalline diamond for cutting tool manufacturing
NASA Astrophysics Data System (ADS)
Wyszyński, Dominik; Ostrowski, Robert; Zwolak, Marek; Bryk, Witold
2017-10-01
The paper concerns the application of a DPSS Nd:YAG 532 nm pulsed laser source to the machining of polycrystalline, WC-based diamond (PCD) inserts. The goal of the research was to determine optimal laser cutting parameters for cutting tool shaping. The basic criteria were cutting edge quality (minimization of finishing operations), material removal rate (time and cost efficiency), and the choice of laser beam characteristics (polarization, power, focused beam diameter). The research was planned, realised, and analysed according to design-of-experiments (DOE) principles. The analysis of the cutting edge was performed with the Alicona Infinite Focus measurement system.
Freiburg RNA tools: a central online resource for RNA-focused research and teaching.
Raden, Martin; Ali, Syed M; Alkhnbashi, Omer S; Busch, Anke; Costa, Fabrizio; Davis, Jason A; Eggenhofer, Florian; Gelhausen, Rick; Georg, Jens; Heyne, Steffen; Hiller, Michael; Kundu, Kousik; Kleinkauf, Robert; Lott, Steffen C; Mohamed, Mostafa M; Mattheis, Alexander; Miladi, Milad; Richter, Andreas S; Will, Sebastian; Wolff, Joachim; Wright, Patrick R; Backofen, Rolf
2018-05-21
The Freiburg RNA tools webserver is a well-established online resource for RNA-focused research. It provides a unified user interface and comprehensive result visualization for efficient command-line tools. The webserver includes RNA-RNA interaction prediction (IntaRNA, CopraNA, metaMIR), sRNA homology search (GLASSgo), sequence-structure alignments (LocARNA, MARNA, CARNA, ExpaRNA), CRISPR repeat classification (CRISPRmap), sequence design (antaRNA, INFO-RNA, SECISDesign), structure aberration evaluation of point mutations (RaSE), and RNA/protein-family model visualization (CMV), among other methods. Open education resources offer interactive visualizations of RNA structure and RNA-RNA interaction prediction as well as basic and advanced sequence alignment algorithms. The services are freely available at http://rna.informatik.uni-freiburg.de.
Lessons from 25 years of genetic mapping in onion: where next?
USDA-ARS?s Scientific Manuscript database
Genetic maps are useful tools for both basic research and plant improvement. Close association of genetic markers with genes controlling economically important traits allows for indirect selection, avoiding often time-consuming and expensive phenotypic evaluations. As a result, detailed genetic maps...
The mathematical and computer modeling of the worm tool shaping
NASA Astrophysics Data System (ADS)
Panchuk, K. L.; Lyashkov, A. A.; Ayusheev, T. V.
2017-06-01
Traditionally, mathematical profiling of the worm tool is carried out using T. Olivier's first method, known from the theory of gearing, which requires an intermediate surface of the generating rack. This complicates the profiling process and its realization by means of computer 3D modeling. The purpose of this work is to improve the mathematical model of profiling and to realize it using 3D modeling methods. The research problems are: deriving a mathematical model of profiling that excludes the generating rack; realizing the resulting model by means of wireframe and surface modeling; and developing and testing a solid modeling technology for solving the profiling problem. The kinematic method for studying mutually enveloping surfaces is adopted as the basic approach. The computer research is performed in CAD systems based on 3D modeling methods. We have developed a mathematical model of profiling of the worm tool; wireframe, surface, and solid models of the shaping of the mutually enveloping surfaces of the part and the tool have been obtained. The proposed mathematical models and 3D modeling technologies represent tools for theoretical and experimental profiling of the worm tool. The results of this research can be used in the design of metal-cutting tools.
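The kinematic (envelope) method mentioned above rests on the classical meshing condition from the theory of gearing: a point belongs to the enveloping surface when the common surface normal is perpendicular to the relative velocity of the two bodies. In standard notation (not taken from the paper itself):

```latex
% Equation of meshing: at a contact point, the common normal \mathbf{n}
% is orthogonal to the relative sliding velocity \mathbf{v}^{(12)}
% of tool and workpiece.
\mathbf{n} \cdot \mathbf{v}^{(12)} = 0
```

Solving this condition together with the parametric equation of the given surface yields the contact lines from which the enveloping (tool) surface is built.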
NASA Astrophysics Data System (ADS)
Puig, Albert; LHCb Starterkit Team
2017-10-01
The vast majority of high-energy physicists use and produce software every day. Software skills are usually acquired “on the go,” and dedicated training courses are rare. The LHCb Starterkit is a new training format for getting LHCb collaborators started in effectively using software to perform their research. The course focuses on teaching basic skills for research computing. Unlike traditional tutorials, we start with the basics, present all the material live with a high degree of interactivity, and give priority to understanding the tools as opposed to handing out recipes that work “as if by magic”. The LHCb Starterkit was started by two young members of the collaboration inspired by the principles of Software Carpentry, and the material is created in a collaborative fashion using the tools we teach. Three successful entry-level workshops, as well as an advanced one, have taken place since the start of the initiative in 2015, and were taught largely by PhD students to other PhD students.
New trends in articular cartilage repair.
Cucchiarini, Magali; Henrionnet, Christel; Mainard, Didier; Pinzano, Astrid; Madry, Henning
2015-12-01
Damage to the articular cartilage is an important, prevalent, and unsolved clinical issue for the orthopaedic surgeon. This review summarizes innovative basic research approaches that may improve the current understanding of cartilage repair processes and lead to novel therapeutic options. In this regard, new aspects of cartilage tissue engineering, with a focus on the choice of the best-suited cell source, are presented. The importance of non-destructive cartilage imaging is highlighted with the recent availability of adapted experimental tools such as Second Harmonic Generation (SHG) imaging. Novel insights into cartilage pathophysiology based on the involvement of the infrapatellar fat pad in osteoarthritis are also described. In addition, recombinant adeno-associated viral vectors are discussed as clinically adapted, efficient tools for potential gene-based medicines in a variety of articular cartilage disorders. Taken as a whole, such advances in basic research in diverse fields of articular cartilage repair may lead to the development of improved therapies in the clinic for the effective treatment of cartilage lesions in the near future.
Basic Hand Tools for Bricklaying and Cement Masonry [and] Basic Hand Tools of the Carpenter.
ERIC Educational Resources Information Center
Texas A and M Univ., College Station. Vocational Instructional Services.
Intended for student use, this unit discusses and illustrates the tools used in brick and masonry and carpentry. Contents of the brick and masonry section include informative materials on bricklaying tools (brick trowels, joint tools, levels, squares, line and accessories, rules, hammers and chisels, tool kits) and cement masonry tools (tampers,…
Issues in Biomedical Research Data Management and Analysis: Needs and Barriers
Anderson, Nicholas R.; Lee, E. Sally; Brockenbrough, J. Scott; Minie, Mark E.; Fuller, Sherrilynne; Brinkley, James; Tarczy-Hornoch, Peter
2007-01-01
Objectives A. Identify the current state of data management needs of academic biomedical researchers. B. Explore their anticipated data management and analysis needs. C. Identify barriers to addressing those needs. Design A multimodal needs analysis was conducted using a combination of an online survey and in-depth one-on-one semi-structured interviews. Subjects were recruited via an e-mail list representing a wide range of academic biomedical researchers in the Pacific Northwest. Measurements The results from 286 survey respondents were used to provide triangulation of the qualitative analysis of data gathered from 15 semi-structured in-depth interviews. Results Three major themes were identified: 1) there continues to be widespread use of basic general-purpose applications for core data management; 2) there is broad perceived need for additional support in managing and analyzing large datasets; and 3) the barriers to acquiring currently available tools are most commonly related to financial burdens on small labs and unmet expectations of institutional support. Conclusion Themes identified in this study suggest that at least some common data management needs will best be served by improving access to basic level tools such that researchers can solve their own problems. Additionally, institutions and informaticians should focus on three components: 1) facilitate and encourage the use of modern data exchange models and standards, enabling researchers to leverage a common layer of interoperability and analysis; 2) improve the ability of researchers to maintain provenance of data and models as they evolve over time through tools and the leveraging of standards; and 3) develop and support information management service cores that could assist in these previous components while providing researchers with unique data analysis and information design support within a spectrum of informatics capabilities. PMID:17460139
How medicine has become a science?
Zieliński, Andrzej
2014-01-01
This historical review of medical activities draws attention to how late in medicine's very long history therapies of proven effectiveness were introduced. The author attributes this to the late development of methods capable of determining causal relations, which would scientifically justify identifying the causes and risk factors of diseases as well as checking the effectiveness of preventive and therapeutic procedures. Among the fundamental tools for scientific knowledge of the causes and mechanisms of diseases, the author indicates the achievements of basic science and the development of epidemiological methods used to study causal relationships. In the author's opinion, the results of basic research are an essential source of variables among which, with increased likelihood, the causes and risk factors of the studied conditions, including diseases, can be found. The author also stresses the role of medical technology, which is the primary source of potential medicines, other therapeutic procedures, and diagnostic methods whose effectiveness is tested in experimental epidemiological studies. Medical technologies also create tools for the development of the basic sciences.
Basic Emotions, Natural Kinds, Emotion Schemas, and a New Paradigm.
Izard, Carroll E
2007-09-01
Research on emotion flourishes in many disciplines and specialties, yet experts cannot agree on its definition. Theorists and researchers use the term emotion in ways that imply different processes and meanings. Debate continues about the nature of emotions, their functions, their relations to broad affective dimensions, the processes that activate them, and their role in our daily activities and pursuits. I will address these issues here, specifically in terms of basic emotions as natural kinds, the nature of emotion schemas, the development of emotion-cognition relations that lead to emotion schemas, and discrete emotions in relation to affective dimensions. Finally, I propose a new paradigm that assumes continual emotion as a factor in organizing consciousness and as an influence on mind and behavior. The evidence reviewed suggests that a theory that builds on concepts of both basic emotions and emotion schemas provides a viable research tool and is compatible with more holistic or dimensional approaches. © 2007 Association for Psychological Science.
NASA Astrophysics Data System (ADS)
Madaras, Gary S.
2002-05-01
The use of computer modeling as a marketing, diagnosis, design, and research tool in the practice of acoustical consulting is discussed. From the time it is obtained, the software can be used as an effective marketing tool. It is not until the software basics are learned and some amount of testing and verification occurs that the software can be used as a tool for diagnosing the acoustics of existing rooms. A greater understanding of the output types and formats as well as experience in interpreting the results is required before the software can be used as an efficient design tool. Lastly, it is only after repetitive use as a design tool that the software can be used as a cost-effective means of conducting research in practice. The discussion is supplemented with specific examples of actual projects provided by various consultants within multiple firms. Focus is placed on the use of CATT-Acoustic software and predicting the room acoustics of large performing arts halls as well as other public assembly spaces.
Industrial applications of the microgravity environment
NASA Technical Reports Server (NTRS)
1988-01-01
Opportunities for commercialization of the microgravity environment will depend upon the success of basic research projects performed in space. Significant demands for manufacturing opportunities are unlikely in the near term. The microgravity environment is to be considered primarily as a tool for research and secondarily as a manufacturing site. This research tool is unique, valuable, and presently available to U.S. investigators only through resources provided by NASA. The United States has an obligation to facilitate corporate research, maintain a flexible international policy, foster use of and assure access to a wide variety of facilities, and develop a posture of national and international leadership in and stewardship of research and materials processing in the microgravity environment. The National Research Council's Committee on Industrial Applications of the Microgravity Environment recommends six actions that strengthen this posture, including the formation of an authoritative organization to oversee the implementation of a program of microgravity research and its industrial applications.
What are the priorities in basic asthma research? A United Kingdom perspective.
Hallsworth, Matthew P; Major, Philippa J; Barnes, Jack; Lee, Tak H
2003-02-01
The National Asthma Campaign (in the United Kingdom) has recently completed a strategic review of priorities for basic asthma research over the next 5 to 10 years. Leading asthma experts and representatives of the main funding agencies were involved in a nationwide consultation. Discussions were carried out in 7 thematic areas: Genetics of asthma, early-life events, environmental influences, immunology and immunotherapy, inflammation and anti-inflammation, airway remodeling, and the interface between academia and industry. Discussions were not restricted by considerations of financial affordability but were driven by vision and science. The consultation highlighted a number of generic issues pertaining to the organization of basic asthma research. Phenotypes of asthma require more robust characterization, particularly for genetic studies. Emphasis on longitudinal studies should be encouraged, and more information can still be gained from existing well-characterized asthma cohorts, though this requires some coordination. Human research is particularly strong and should continue, and the use of human tissue is vital to our understanding of the disease at the cellular and molecular levels. Animal models of asthma remain an important tool with which to dissect disease mechanisms, but they must be improved and refined. The consultation covered a wide range of issues and highlighted the need for collaboration at all levels between research groups and with industry and also between funding agencies. The recommendations made have relevance to everyone involved in basic asthma research. This article describes the recommendations and reviews the specific research issues relating to each of the 7 thematic areas.
Nanobody-derived nanobiotechnology tool kits for diverse biomedical and biotechnology applications.
Wang, Yongzhong; Fan, Zhen; Shao, Lei; Kong, Xiaowei; Hou, Xianjuan; Tian, Dongrui; Sun, Ying; Xiao, Yazhong; Yu, Li
2016-01-01
Owing to the peculiar properties of the nanobody, including its nanoscale size, robust structure, stable and soluble behavior in aqueous solution, reversible refolding, high affinity and specificity for a single cognate target, superior access to cryptic clefts, and deep tissue penetration, as well as a sustainable source, it has been an ideal research tool for the development of sophisticated nanobiotechnologies. Currently, the nanobody has evolved into versatile research and application tool kits for diverse biomedical and biotechnology applications. Various nanobody-derived formats, including the nanobody itself, radionuclide- or fluorescent-labeled nanobodies, nanobody homo- or heteromultimers, nanobody-coated nanoparticles, and nanobody-displayed bacteriophages, have been successfully demonstrated as powerful nanobiotechnological tool kits for basic biomedical research, targeted drug delivery and therapy, disease diagnosis, bioimaging, and agricultural and plant protection. These applications indicate a special advantage of these nanobody-derived technologies, already surpassing the "me-too" products of other equivalent binders, such as full-length antibodies, single-chain variable fragments, antigen-binding fragments, targeting peptides, and DNA-based aptamers. In this review, we summarize the current state of the art in nanobody research, focusing on nanobody structural features, nanobody production approaches, nanobody-derived nanobiotechnology tool kits, and the potentially diverse applications in biomedicine and biotechnology. The future trends, challenges, and limitations of the nanobody-derived nanobiotechnology tool kits are also discussed.
School bonding in early adolescents: Psychometrics of the brief survey of school bonding
USDA-ARS?s Scientific Manuscript database
The comprehensive assessment of middle school student bonding is important for basic research and to evaluate interventions. Of over 30 tools reviewed, this study examined the psychometric properties of three that together assessed the four constructs identified by Hirschi as key elements. With some...
ERIC Educational Resources Information Center
Dikli, Semire
2006-01-01
The impacts of computers on writing have been widely studied for three decades. Even basic computers functions, i.e. word processing, have been of great assistance to writers in modifying their essays. The research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali,…
WestuRe: U.S. Pacific Coast estuary/watershed data and R tools
There are about 350 estuaries along the U.S. Pacific Coast. Basic descriptive data for these estuaries, such as their size and watershed area, are important for coastal-scale research and conservation planning. However, this information is spread among many sources and can be dif...
Advances in Language Planning.
ERIC Educational Resources Information Center
Fishman, Joshua A.
This volume is an attempt to provide the sociology of language with the basic teaching-learning tools needed to facilitate its academic growth and consolidation. It provides students and specialists in language planning with a comprehensive anthology of articles dealing with this area of research in the sociology of language. The…
ERIC Educational Resources Information Center
National Inst. of General Medical Sciences (NIH), Bethesda, MD.
This booklet, geared toward an advanced high school or early college-level audience, describes how basic chemistry and biochemistry research can spur a better understanding of human health. It reveals how networks of chemical reactions keep our bodies running smoothly. Some of the tools and technologies used to explore these reactions are…
Cognitive Foundry v. 3.0 (OSS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basilico, Justin; Dixon, Kevin; McClain, Jonathan
2009-11-18
The Cognitive Foundry is a unified collection of tools designed for research and applications that use cognitive modeling, machine learning, or pattern recognition. The software library contains design patterns, interface definitions, and default implementations of reusable software components and algorithms designed to support a wide variety of research and development needs. The library contains three main software packages: the Common package, which contains basic utilities and linear algebraic methods; the Cognitive Framework package, which contains tools to assist in implementing and analyzing theories of cognition; and the Machine Learning package, which provides general algorithms and methods for populating Cognitive Framework components from domain-relevant data.
Research on the Intensity Analysis and Result Visualization of Construction Land in Urban Planning
NASA Astrophysics Data System (ADS)
Cui, J.; Dong, B.; Li, J.; Li, L.
2017-09-01
As a fundamental task in urban planning, the intensity analysis of construction land involves many repetitive data processing steps that are prone to errors or loss of data precision, and current urban planning practice lacks efficient methods and tools for visualizing the analysis results. In this research a portable tool is developed using the Model Builder technique embedded in ArcGIS to provide automatic data processing and rapid result visualization for these tasks. A series of basic modules provided by ArcGIS are linked together to form a complete data processing chain in the tool. Once the required data are imported, the analysis results and related maps and graphs, including the intensity values and zoning map, the skyline analysis map, etc., are produced automatically. Finally, the tool is installation-free and can be dispatched quickly between planning teams.
Ruggiero, Rafael N; Rossignoli, Matheus T; De Ross, Jana B; Hallak, Jaime E C; Leite, Joao P; Bueno-Junior, Lezio S
2017-01-01
Much of our knowledge of the endocannabinoid system in schizophrenia comes from behavioral measures in rodents, like prepulse inhibition of the acoustic startle and open-field locomotion, which are commonly used along with neurochemical approaches or drug challenge designs. Such methods continue to map fundamental mechanisms of sensorimotor gating, hyperlocomotion, social interaction, and underlying monoaminergic, glutamatergic, and GABAergic disturbances. These strategies will require, however, a greater use of neurophysiological tools to better inform clinical research. In this sense, electrophysiology and viral vector-based circuit dissection, like optogenetics, can further elucidate how exogenous cannabinoids worsen (e.g., tetrahydrocannabinol, THC) or ameliorate (e.g., cannabidiol, CBD) schizophrenia symptoms, like hallucinations, delusions, and cognitive deficits. Also, recent studies point to a complex endocannabinoid-endovanilloid interplay, including the influence of anandamide (endogenous CB1 and TRPV1 agonist) on cognitive variables, such as aversive memory extinction. In fact, growing interest has been devoted to TRPV1 receptors as promising therapeutic targets. Here, these issues are reviewed with an emphasis on the neurophysiological evidence. First, we contextualize imaging and electrographic findings in humans. Then, we present a comprehensive review on rodent electrophysiology. Finally, we discuss how basic research will benefit from further combining psychopharmacological and neurophysiological tools.
ERIC Educational Resources Information Center
Lutzker, Marilyn
This introductory guide to basic library research tools in the field of criminal justice was compiled for use by students at the John Jay College of Criminal Justice as part of the Library Instruction Program. Included are chapters on devising a search strategy; the use of the card catalog; encyclopedia and dictionaries; indexes and abstracts;…
ERIC Educational Resources Information Center
Musawi, Ali Al; Ambusaidi, Abdullah; Al-Balushi, Sulaiman; Al-Sinani, Mohamed; Al-Balushi, Kholoud
2017-01-01
This paper aims to measure the effectiveness of the 3DL on Omani students' acquisition of practical abilities and skills. It examines the effectiveness of the 3D-lab in science education and scientific thinking acquisition as part of a national project funded by The Research Council. Four research tools in a Pre-Post Test Control Group Design,…
General-Purpose Electronic System Tests Aircraft
NASA Technical Reports Server (NTRS)
Glover, Richard D.
1989-01-01
Versatile digital equipment supports research, development, and maintenance. Extended aircraft interrogation and display system is general-purpose assembly of digital electronic equipment on ground for testing of digital electronic systems on advanced aircraft. Many advanced features, including multiple 16-bit microprocessors, pipeline data-flow architecture, advanced operating system, and resident software-development tools. Basic collection of software includes program for handling many types of data and for displays in various formats. User easily extends basic software library. Hardware and software interfaces to subsystems provided by user designed for flexibility in configuration to meet user's requirements.
SimVascular: An Open Source Pipeline for Cardiovascular Simulation.
Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C
2017-03-01
Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.
Mild Traumatic Brain Injury Pocket Guide (CONUS)
2010-01-01
Contents: TBI Basics (DoD definition); VA/DoD CPG Management of Headaches; Management of Other Symptoms; ICD-9 Coding; Cognitive Rehab; Driving Following TBI; Patient Education; Clinical Tools and Resources.
Behavioral Health Program Element
NASA Technical Reports Server (NTRS)
Leveton, Lauren B.
2006-01-01
The project goal is to develop a behavioral health prevention and maintenance system for continued crew health, safety, and performance during exploration missions. The basic scope includes: a) operationally-relevant research related to the clinical, cognitive, and behavioral health of crewmembers; b) ground-based studies using analog environments (Antarctic, NEEMO, simulations, and other testbeds); c) ISS studies (ISSMP) focusing on operational issues related to behavioral health outcomes and standards; d) technology development activities for monitoring and diagnostic tools; and e) cross-disciplinary research (e.g., human factors and habitability research, skeletal muscle, radiation).
Development of a research ethics knowledge and analytical skills assessment tool.
Taylor, Holly A; Kass, Nancy E; Ali, Joseph; Sisson, Stephen; Bertram, Amanda; Bhan, Anant
2012-04-01
The goal of this project was to develop and validate a new tool to evaluate learners' knowledge and skills related to research ethics. A core set of 50 questions from existing computer-based online teaching modules was identified, refined and supplemented to create a set of 74 multiple-choice, true/false and short-answer questions. The questions were pilot-tested and item discrimination was calculated for each question. Poorly performing items were eliminated or refined. Two comparable assessment tools were created. These assessment tools were administered as a pre-test and post-test to a cohort of 58 Indian junior health research investigators before and after exposure to a new course on research ethics. Half of the investigators were exposed to the course online, the other half in person. Item discrimination was calculated for each question and Cronbach's α for each assessment tool. A final version of the assessment tool, the Research Ethics Knowledge and Analytical Skills Assessment (REKASA), incorporating the best questions from the pre-/post-test phase, was used to assess retention of research ethics knowledge and skills 3 months after course delivery. The final REKASA includes 41 items and had a Cronbach's α of 0.837. The results illustrate, in one sample of learners, the successful, systematic development and use of a knowledge and skills assessment tool in research ethics, capable not only of measuring basic knowledge of research ethics and oversight but also of assessing learners' ability to apply that knowledge to the analytical task of reasoning through research ethics cases, without reliance on essay- or discussion-based examination. These promising preliminary findings should be confirmed with additional groups of learners.
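The reliability statistic reported above, Cronbach's α, is straightforward to compute directly. A minimal sketch in Python, using an invented 5×4 score matrix for illustration rather than the study's data:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# The scores below are invented illustration data, not the REKASA results.

def cronbach_alpha(scores):
    """scores: list of respondent rows, one column per item (0/1 here)."""
    k = len(scores[0])                      # number of items

    def variance(xs):                       # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(round(cronbach_alpha(scores), 3))     # 0.8 for this illustration matrix
```

Values near 0.8 or above, like the 0.837 reported for the final REKASA, are conventionally read as good internal consistency.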
The Human Ageing Genomic Resources: online databases and tools for biogerontologists
de Magalhães, João Pedro; Budovsky, Arie; Lehmann, Gilad; Costa, Joana; Li, Yang; Fraifeld, Vadim; Church, George M.
2009-01-01
Summary Ageing is a complex, challenging phenomenon that will require multiple, interdisciplinary approaches to unravel its puzzles. To assist basic research on ageing, we developed the Human Ageing Genomic Resources (HAGR). This work provides an overview of the databases and tools in HAGR and describes how the gerontology research community can employ them. Several recent changes and improvements to HAGR are also presented. The two centrepieces in HAGR are GenAge and AnAge. GenAge is a gene database featuring genes associated with ageing and longevity in model organisms, a curated database of genes potentially associated with human ageing, and a list of genes tested for their association with human longevity. A myriad of biological data and information is included for hundreds of genes, making GenAge a reference for research that reflects our current understanding of the genetic basis of ageing. GenAge can also serve as a platform for the systems biology of ageing, and tools for the visualization of protein-protein interactions are also included. AnAge is a database of ageing in animals, featuring over 4,000 species, primarily assembled as a resource for comparative and evolutionary studies of ageing. Longevity records, developmental and reproductive traits, taxonomic information, basic metabolic characteristics, and key observations related to ageing are included in AnAge. Software is also available to aid researchers in the form of Perl modules to automate numerous tasks and as an SPSS script to analyse demographic mortality data. The Human Ageing Genomic Resources are available online at http://genomics.senescence.info. PMID:18986374
Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei
2012-01-01
Summary: The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Availability and implementation: Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl. Contact: peter@biomatters.com PMID:22543367
ERIC Educational Resources Information Center
Minishi-Majanja, Mabel K.
2003-01-01
Information and communication technologies (ICTs) have become basic ingredients of, and competitive tools in, the information-intensive tertiary/higher education sector. Their increased and specialised use in teaching and learning, research, academic administration, institutional management and information provision translates into greater access…
2008-06-01
make better technology investment decisions. C. FOLLOW-ON RESEARCH POTENTIAL: Like most assets, knowledge is only valuable if it can be transmuted…
Teaching Markov Chain Monte Carlo: Revealing the Basic Ideas behind the Algorithm
ERIC Educational Resources Information Center
Stewart, Wayne; Stewart, Sepideh
2014-01-01
For many scientists, researchers and students Markov chain Monte Carlo (MCMC) simulation is an important and necessary tool to perform Bayesian analyses. The simulation is often presented as a mathematical algorithm and then translated into an appropriate computer program. However, this can result in overlooking the fundamental and deeper…
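The basic idea the article aims to reveal can be shown in a few lines. Below is a minimal random-walk Metropolis sampler for a standard normal target; the function names and tuning values are invented for this illustration, not taken from the article:

```python
# Random-walk Metropolis: propose a nearby point, accept it with
# probability min(1, target(proposal)/target(current)), else stay put.
import math
import random

def metropolis(log_target, x0, steps, scale=1.0, seed=0):
    rng = random.Random(seed)
    x, chain = x0, []
    for _ in range(steps):
        proposal = x + rng.gauss(0.0, scale)          # symmetric proposal
        # accept/reject on the log scale for numerical stability
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        chain.append(x)
    return chain

# Target: standard normal, log-density up to an additive constant.
chain = metropolis(lambda x: -0.5 * x * x, x0=5.0, steps=20000)
burned = chain[2000:]                                 # discard burn-in
mean = sum(burned) / len(burned)                      # close to 0 here
```

The deliberately bad starting point (x0=5.0) and the discarded burn-in make the two teaching points the abstract alludes to visible in the output.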
Technology Acceptance of Electronic Medical Records by Nurses
ERIC Educational Resources Information Center
Stocker, Gary
2010-01-01
The purpose of this study was to evaluate the Technology Acceptance Model's (TAM) relevance of the intention of nurses to use electronic medical records in acute health care settings. The basic technology acceptance research of Davis (1989) was applied to the specific technology tool of electronic medical records (EMR) in a specific setting…
Little, A
1992-01-01
Hospital survival requires adaptation, adaptation requires understanding, and understanding requires information. These are the basic equations behind hospital strategic marketing, and one of the answers may lie in hospitals' own patient-data systems. Marketers' and administrators' enlightened application of case-mix information could become one more hospital survival tool.
ERIC Educational Resources Information Center
Ruller, Roberto; Silva-Rocha, Rafael; Silva, Artur; Schneider, Maria Paula Cruz; Ward, Richard John
2011-01-01
Protein engineering is a powerful tool, which correlates protein structure with specific functions, both in applied biotechnology and in basic research. Here, we present a practical teaching course for engineering the green fluorescent protein (GFP) from "Aequorea victoria" by a random mutagenesis strategy using error-prone polymerase…
Nuclear medicine and imaging research (quantitative studies in radiopharmaceutical science)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooper, M.D.; Beck, R.N.
1990-09-01
This is a report of progress in Year Two (January 1, 1990--December 31, 1990) of Grant FG02-86ER60438, "Quantitative Studies in Radiopharmaceutical Science," awarded for the three-year period January 1, 1989--December 31, 1991 as a competitive renewal following a site visit in the fall of 1988. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 25 refs., 13 figs., 1 tab.
Effectiveness of a Technology-Based Intervention to Teach Evidence-Based Practice: The EBR Tool.
Long, JoAnn D; Gannaway, Paula; Ford, Cindy; Doumit, Rita; Zeeni, Nadine; Sukkarieh-Haraty, Ola; Milane, Aline; Byers, Beverly; Harrison, LaNell; Hatch, Daniel; Brown, Justin; Proper, Sharlan; White, Patricia; Song, Huaxin
2016-02-01
As the world becomes increasingly digital, advances in technology have changed how students access evidence-based information. Research suggests that students overestimate their ability to locate quality online research and lack the skills needed to evaluate the scientific literature. Clinical nurses report relying on personal experience to answer clinical questions rather than searching evidence-based sources. To address the problem, a web-based, evidence-based research (EBR) tool that is usable from a computer, smartphone, or iPad was developed and tested. The purpose of the EBR tool is to guide students through the basic steps needed to locate and critically appraise the online scientific literature while linking users to quality electronic resources to support evidence-based practice (EBP). Testing of the tool took place in a mixed-method, quasi-experimental, and two-population randomized controlled trial (RCT) design at a U.S. and a Middle Eastern university. A statistically significant improvement in overall research skills was supported in the quasi-experimental nursing student group and the RCT nutrition student group using the EBR tool. A statistically significant proportional difference was supported in the RCT nutrition and PharmD intervention groups in participants' ability to distinguish the credibility of online source materials compared with controls. The majority of participants could correctly apply PICOTS to a case study when using the tool. The data from this preliminary study suggest that the EBR tool enhanced students' overall research skills and selected EBP skills while generating data for assessment of learning outcomes. The EBR tool places evidence-based resources at the fingertips of users by addressing some of the most commonly cited barriers to research utilization while exposing users to information and online literacy standards of practice, meeting a growing need within nursing curricula. © 2016 Sigma Theta Tau International.
Basic mapping principles for visualizing cancer data using Geographic Information Systems (GIS).
Brewer, Cynthia A
2006-02-01
Maps and other data graphics may play a role in generating ideas and hypotheses at the beginning of a project. They are useful as part of analyses for evaluating model results and then at the end of a project when researchers present their results and conclusions to varied audiences, such as their local research group, decision makers, or a concerned public. Cancer researchers are gaining skill with geographic information system (GIS) mapping as one of their many tools and are broadening the symbolization approaches they use for investigating and illustrating their data. A single map is one of many possible representations of the data, so making multiple maps is often part of a complete mapping effort. Symbol types, color choices, and data classing each affect the information revealed by a map and are best tailored to the specific characteristics of data. Related data can be examined in series with coordinated classing and can also be compared using multivariate symbols that build on the basic rules of symbol design. Informative legend wording and setting suitable map projections are also basic to skilled mapmaking.
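Data classing, one of the symbolization choices the abstract highlights, can be made concrete with a small sketch. A minimal quantile (equal-count) classing routine in Python; the rates are invented illustration values, not cancer data:

```python
# Quantile classing for a choropleth map: sort the values and place class
# breaks so each class holds roughly the same number of map units.

def quantile_breaks(values, n_classes):
    """Return the upper break value of each class."""
    ordered = sorted(values)
    n = len(ordered)
    return [ordered[(n * (i + 1)) // n_classes - 1] for i in range(n_classes)]

def classify(value, breaks):
    """Map a value to its class index (0 = lowest class)."""
    for k, upper in enumerate(breaks):
        if value <= upper:
            return k
    return len(breaks) - 1

rates = [12.1, 45.3, 22.8, 9.4, 33.0, 27.5, 41.2, 18.6, 30.9, 15.0]
breaks = quantile_breaks(rates, 4)        # [12.1, 22.8, 30.9, 45.3]
classes = [classify(r, breaks) for r in rates]
```

Swapping this routine for equal-interval or natural-breaks classing on the same data changes which patterns the map reveals, which is the abstract's point about making multiple maps.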
Silicon photonics cloud (SiCloud)
NASA Astrophysics Data System (ADS)
DeVore, Peter T. S.; Jiang, Yunshan; Lynch, Michael; Miyatake, Taira; Carmona, Christopher; Chan, Andrew C.; Muniam, Kuhan; Jalali, Bahram
2015-02-01
We present SiCloud (Silicon Photonics Cloud), the first free, instructional web-based research and education tool for silicon photonics. SiCloud's vision is to provide a host of instructional and research web-based tools. Such interactive learning tools enhance traditional teaching methods by extending access to a very large audience, resulting in very high impact. Interactive tools engage the brain differently from merely reading, and so enhance and reinforce the learning experience. Understanding silicon photonics is challenging because the topic spans a wide range of disciplines, including materials science, semiconductor physics, electronics and waveguide optics. This web-based calculator is an interactive analysis tool for the optical properties of silicon and related materials (SiO2, Si3N4, Al2O3, etc.). It is designed to be a one-stop resource for students, researchers and design engineers. The first and most basic aspect of silicon photonics is the Material Parameters layer, which provides the foundation for the Device, Sub-System and System levels. SiCloud includes the common dielectrics and semiconductors for waveguide core, cladding, and photodetection, as well as metals for electrical contacts. SiCloud is a work in progress and its capability is being expanded. SiCloud is being developed at UCLA with funding from the National Science Foundation's Center for Integrated Access Networks (CIAN) Engineering Research Center.
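A material-parameter calculation of the kind such a tool provides can be sketched with the Sellmeier dispersion equation. The coefficients below are the widely quoted Malitson (1965) fit for fused silica (SiO2); treat the exact values, and the function name, as assumptions of this illustration rather than SiCloud's implementation:

```python
# Sellmeier equation: n^2 = 1 + sum_i B_i * L^2 / (L^2 - C_i),
# with L the wavelength in micrometres. Coefficients: Malitson fit for SiO2.
import math

B = (0.6961663, 0.4079426, 0.8974794)
C = (0.0684043 ** 2, 0.1162414 ** 2, 9.896161 ** 2)   # micrometres squared

def n_silica(wavelength_um):
    """Refractive index of fused silica at the given wavelength (um)."""
    L2 = wavelength_um ** 2
    return math.sqrt(1 + sum(b * L2 / (L2 - c) for b, c in zip(B, C)))

print(round(n_silica(1.55), 3))   # ~1.444 at the 1.55 um telecom wavelength
```

The same three-term form, with different coefficients, covers the other cladding and core materials the abstract lists.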
Open Science: a first step towards Science Communication
NASA Astrophysics Data System (ADS)
Grigorov, Ivo; Tuddenham, Peter
2015-04-01
As Earth Science communicators gear up to adopt new tools and captivating approaches to engage citizen scientists, budding entrepreneurs, policy makers and the public in general, researchers have the responsibility, and the opportunity, to fully adopt Open Science principles and capitalize on their full societal impact and engagement. Open Science is about removing all barriers to basic research, whatever its format, so that it can be freely used, re-used and re-hashed, thus fueling discourse and accelerating the generation of innovative ideas. The concept is central to the EU's Responsible Research and Innovation philosophy, and removing barriers to basic research measurably contributes to engaging citizen scientists in the research process, sets the scene for co-creation of solutions to societal challenges, and raises the general science literacy of the public. Despite this potential, only 50% of today's basic research is freely available. Open Science can be the first, passive step of communicating marine research outside academia. Full and unrestricted access to our knowledge, including data, software code and scientific publications, is not just an ethical obligation, but also gives solid credibility to a more sophisticated communication strategy for engaging society. The presentation will demonstrate how Open Science perfectly complements a coherent communication strategy for placing marine research in societal context, and how it underpins an effective integration of Ocean & Earth Literacy principles in standard education, as well as mobilizing citizen marine scientists, thus making marine science Open Science.
Solar Observations as Educational Tools (P8)
NASA Astrophysics Data System (ADS)
Shylaja, B. S.
2006-11-01
Solar observations are very handy tools to expose students to the joy of research. In this presentation I briefly discuss the various experiments already done here with a small 6" Coude refractor. These include simple experiments like eclipse observations, rotation measurements, variation in the angular size of the Sun through the year as well as sunspot size variations, Doppler measurements, identification of elements from the solar spectrum (from a published high-resolution spectrum), limb-darkening measurements, and deriving the curve of growth (from published data). I also describe the theoretical implications of the experiments and future plans to develop this as a platform for motivating students towards a career in basic science research.
Madzak, Catherine
2018-06-25
Yarrowia lipolytica is an oleaginous saccharomycetous yeast with a long history of industrial use. It aroused interest several decades ago as a host for heterologous protein production. Thanks to the development of numerous molecular and genetic tools, Y. lipolytica is now a recognized system for expressing heterologous genes and secreting the corresponding proteins of interest. As genomic and transcriptomic tools have increased our basic knowledge of this yeast, we can now envision engineering its metabolic pathways for use as a whole-cell factory in various bioconversion processes. Y. lipolytica is currently being developed as a workhorse for biotechnology, notably for single-cell oil production and the upgrading of industrial wastes into valuable products. As it becomes more and more difficult to keep up with an ever-increasing literature on Y. lipolytica engineering technology, this article aims to provide basic and up-to-date knowledge on this research area. The most useful reviews on Y. lipolytica biology, use, and safety are highlighted, together with a summary of the engineering tools available in this yeast. This mini-review then focuses on recently developed tools and engineering strategies, with a particular emphasis on promoter tuning, metabolic pathway assembly, and genome-editing technologies.
Software Tools to Support Research on Airport Departure Planning
NASA Technical Reports Server (NTRS)
Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul
2003-01-01
A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.
Advanced genetic tools for plant biotechnology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, WS; Yuan, JS; Stewart, CN
2013-10-09
Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.
Deciphering Phosphotyrosine-Dependent Signaling Networks in Cancer by SH2 Profiling
Machida, Kazuya; Khenkhar, Malik
2012-01-01
It has been a decade since the introduction of SH2 profiling, a modular domain-based molecular diagnostics tool. This review covers the original concept of SH2 profiling, different analytical platforms, and their applications, from the detailed analysis of single proteins to broad screening in translational research. Illustrated by practical examples, we discuss the uniqueness and advantages of the approach as well as its limitations and challenges. We provide guidance for basic researchers and oncologists who may consider SH2 profiling in their respective cancer research, especially for those focusing on tyrosine phosphoproteomics. SH2 profiling can serve as an alternative phosphoproteomics tool to dissect aberrant tyrosine kinase pathways responsible for individual malignancies, with the goal of facilitating personalized diagnostics for the treatment of cancer. PMID:23226573
Crockett, Elahé T
2014-09-24
The National Institutes of Health has recognized a compelling need to train highly qualified individuals and promote diversity in the biomedical/clinical sciences research workforce. In response, we have developed a research-training program known as REPID (Research Education Program to Increase Diversity among Health Researchers) to prepare students/learners to pursue research careers in these fields and address the lack of diversity and health disparities. By including students/learners from minority and diverse backgrounds, the REPID program aims to provide a research training and enrichment experience through team mentoring to inspire students/learners to pursue research careers in biomedical and health-related fields. Students/learners are recruited from the University campus from a diverse population of undergraduates, graduates, health professionals, and lifelong learners. Our recruits first enroll in an innovative online introductory course in Basics and Methods in Biomedical Research that uses a laboratory Tool-Kit (a lab in a box called the My Dr. ET Lab Tool-Kit) to deliver the standard basics of research education, e.g., research skills and lab techniques. The students/learners also learn about the responsible conduct of research, research concept/design, data recording/analysis, and scientific writing/presentation. The course is followed by a 12-week hands-on research experience during the summer. The students/learners also attend workshops and seminars/conferences. The students/learners receive a scholarship to cover stipends and research-related expenses, and to attend a scientific conference. The scholarship allows the students/learners to gain knowledge and seize opportunities in biomedical and health-related careers. This is an ongoing program, and during the first three years of the program, fifty-one (51) students/learners have been recruited.
Thirty-six (36) have completed their research training, and eighty percent (80%) of them have continued their research experiences beyond the program. The combination of carefully provided standard basics of research education and mentorship has been successful and instrumental in training these students/learners and in their success in finding biomedical/health-related jobs and/or pursuing graduate/medical studies. All experiences have been positive, and the program has been highly promoted. This approach has the potential to train a highly qualified workforce, change lives, enhance biomedical research, and, by extension, improve national health care.
Construction of databases: advances and significance in clinical research.
Long, Erping; Huang, Bingjie; Wang, Liming; Lin, Xiaoyu; Lin, Haotian
2015-12-01
Widely used in clinical research, the database is an automated data-management technology and the most efficient tool for managing research data. In this article, we first explain basic concepts such as the definition, classification, and establishment of databases. Afterward, the workflow for establishing databases, inputting data, verifying data, and managing databases is presented. Meanwhile, by discussing the application of databases in clinical research, we illuminate the important role of databases in clinical research practice. Lastly, we introduce the reanalysis of randomized controlled trials (RCTs) and cloud computing techniques, showing the most recent advances in databases for clinical research.
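The workflow the abstract outlines (establishing a database, inputting data, verifying data) can be sketched with Python's built-in sqlite3 module; the table and column names here are invented for illustration, not taken from the article:

```python
# Establish a database, input rows, and verify data with a CHECK constraint.
import sqlite3

conn = sqlite3.connect(":memory:")          # establish the database
conn.execute("""
    CREATE TABLE patients (
        patient_id INTEGER PRIMARY KEY,
        enrolled   TEXT NOT NULL,           -- ISO date string
        systolic   INTEGER CHECK (systolic BETWEEN 60 AND 260)
    )""")

# Input data; the CHECK constraint is one simple form of data verification.
rows = [(1, "2015-03-01", 128), (2, "2015-03-04", 141)]
conn.executemany("INSERT INTO patients VALUES (?, ?, ?)", rows)

# A row violating the constraint is rejected at input time.
try:
    conn.execute("INSERT INTO patients VALUES (3, '2015-03-05', 999)")
except sqlite3.IntegrityError:
    pass

count = conn.execute("SELECT COUNT(*) FROM patients").fetchone()[0]
print(count)  # 2 -- only the valid rows were stored
```

In practice a clinical database adds audit trails and double entry on top of such constraints, but the establish/input/verify loop is the same.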
Sajer, Sascha
2017-01-01
This Perspective discusses topics recently suggested by Prof. Helmut Kern, Vienna, Austria, to advance the research activities of his team, namely: Topic A, 10 years post RISE; Topic B, new research for new solutions to old research questions; and Topic C, working groups on nerve regeneration, training parameters of seniors of different ages, muscle adaptation, and studies of connective tissue and cartilage. This Perspective summarizes some of the basic concepts and evidence-based tools for developing further translational research activities. Clinically relevant results will require the continued interest of basic and applied myologists, and the support of public and private granting agencies over the next five to ten years. Together, these efforts will yield protocols, devices and multidisciplinary management approaches for persons suffering from muscle denervation, neuromuscular-related or non-related pain, and for the increasing population of old, older and oldest senior citizens in Europe and beyond. PMID:29299226
How to Design a Genetic Mating Scheme: A Basic Training Package for Drosophila Genetics
Roote, John; Prokop, Andreas
2013-01-01
Drosophila melanogaster is a powerful model organism for biological research. The essential and common instrument of fly research is genetics, the art of applying Mendelian rules in the specific context of Drosophila with its unique classical genetic tools and the breadth of modern genetic tools and strategies brought in by molecular biology, transgenic technologies and the use of recombinases. Training newcomers to fly genetics is a complex and time-consuming task but too important to be left to chance. Surprisingly, suitable training resources for beginners are currently not available. Here we provide a training package for basic Drosophila genetics, designed to ensure that basic knowledge of all key areas is covered while reducing the time invested by trainers. First, a manual introduces fly history, the rationale for mating schemes, fly handling, Mendelian rules in the fly, markers and balancers, mating scheme design, and transgenic technologies. Its self-study is followed by a practical training session on gender and marker selection, introducing real flies under the dissecting microscope. Next, through self-study of a PowerPoint presentation, trainees are guided step-by-step through a mating scheme. Finally, to consolidate knowledge, trainees are asked to design similar mating schemes reflecting routine tasks in a fly laboratory. This exercise requires individual feedback but also provides unique opportunities for trainers to spot the weaknesses and strengths of each trainee and take remedial action. This training package is being successfully applied at the Manchester fly facility and may serve as a model for further training resources covering other aspects of fly research. PMID:23390611
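The Mendelian bookkeeping behind such mating schemes can be sketched in a few lines of Python; the allele symbols are generic illustrations, not a specific Drosophila marker or balancer:

```python
# Enumerate offspring genotypes of a single-locus cross by pairing every
# allele of one parent with every allele of the other (a Punnett square).
from collections import Counter
from itertools import product

def cross(parent1, parent2):
    """Offspring genotype counts from two diploid genotypes, e.g. 'Aa' x 'Aa'."""
    offspring = Counter()
    for a1, a2 in product(parent1, parent2):
        # sort so 'Aa' and 'aA' count as the same genotype
        offspring["".join(sorted(a1 + a2))] += 1
    return offspring

result = cross("Aa", "Aa")
print(dict(result))   # {'AA': 1, 'Aa': 2, 'aa': 1} -- the classic 1:2:1 ratio
```

Real fly mating schemes layer markers and balancers on top of exactly this enumeration so that the desired genotype can be recognized by eye in the next generation.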
Mixed-methods research in pharmacy practice: basics and beyond (part 1).
Hadi, Muhammad Abdul; Alldred, David Phillip; Closs, S José; Briggs, Michelle
2013-10-01
This is the first of two papers which explore the use of mixed-methods research in pharmacy practice. In an era of evidence-based medicine and policy, high-quality research evidence is essential for the development of effective pharmacist-led services. Over the past decade, the use of mixed-methods research has become increasingly common in healthcare, although to date its use has been relatively limited in pharmacy practice research. In this article, the basic concepts of mixed-methods research including its definition, typologies and advantages in relation to pharmacy practice research are discussed. Mixed-methods research brings together qualitative and quantitative methodologies within a single study to answer or understand a research problem. There are a number of mixed-methods designs available, but the selection of an appropriate design must always be dictated by the research question. Importantly, mixed-methods research should not be seen as a 'tool' to collect qualitative and quantitative data, rather there should be some degree of 'integration' between the two data sets. If conducted appropriately, mixed-methods research has the potential to generate quality research evidence by combining strengths and overcoming the respective limitations of qualitative and quantitative methodologies. © 2012 Royal Pharmaceutical Society.
NASA Technical Reports Server (NTRS)
Smith, Jeffrey
2003-01-01
The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields, from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for the basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox, where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects.
The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time, physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications, including CAD development, procedure design and simulation of human-system interactions in a desktop-sized work volume.
Meeting the Challenge of Students' Understanding of Formulae in High-School Physics: A Learning Tool
ERIC Educational Resources Information Center
Bagno, Esther; Berger, Hana; Eylon, Bat-Sheva
2008-01-01
In this paper we describe a diagnostic study to investigate students' understanding of two basic formulae in physics. Based on the findings of the study, we have developed a classroom activity focused on the interpretation of formulae. The activity was developed cooperatively by physics education researchers and high-school physics teachers and…
ERIC Educational Resources Information Center
Hadjerrouit, Said
2015-01-01
This research study aims at evaluating the suitability of SimReal+ for effective use in teacher education. SimReal+ was originally developed to teach mathematics in universities, but it has recently been improved to include school mathematics. The basic idea of SimReal+ is that the visualization of mathematical concepts is a powerful technique…
ERIC Educational Resources Information Center
Wefer, Stephen H.
2003-01-01
"Name That Gene" is a simple classroom activity that incorporates bioinformatics (available biological information) into the classroom using the "Basic Local Alignment Search Tool (BLAST)." An excellent classroom activity involving bioinformatics and "BLAST" has been previously explored using sequences from bacteria, but it is tailored for college…
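BLAST itself is a heuristic, database-scale search, but the local-alignment scoring it accelerates can be sketched with the classic Smith-Waterman recurrence. The Python snippet below is an illustration only; the scoring values are hypothetical, not BLAST's defaults:

```python
def smith_waterman(a, b, match=1, mismatch=-1, gap=-1):
    """Best local-alignment score between sequences a and b."""
    # H[i][j] holds the best score of any alignment ending at a[i-1], b[j-1];
    # the 0 in the max() lets an alignment restart, which is what makes the
    # alignment *local* rather than global.
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

# The shared ACGT core is found despite unrelated flanking sequence
print(smith_waterman("xxxACGTyyy", "zzACGTzz"))  # 4
```

In a classroom setting this toy scorer helps explain why a BLAST hit reports a local region of similarity rather than requiring two sequences to match end to end.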
On the Latent Regression Model of Item Response Theory. Research Report. ETS RR-07-12
ERIC Educational Resources Information Center
Antal, Tamás
2007-01-01
A full account of the latent regression model for the National Assessment of Educational Progress is given. The treatment includes derivation of the EM algorithm, the Newton-Raphson method, and the asymptotic standard errors. The paper also features the use of the adaptive Gauss-Hermite numerical integration method as a basic tool to evaluate…
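The adaptive quadrature named above builds on standard Gauss-Hermite rules. As a hedged illustration (not the ETS implementation), the snippet below uses NumPy's Gauss-Hermite nodes and weights to evaluate an expectation under a standard normal distribution, the typical ability prior in such latent regression models:

```python
import numpy as np

# Gauss-Hermite nodes/weights integrate f(t)*exp(-t^2) exactly for
# polynomials up to degree 2n-1.  For an expectation under N(0,1),
# substitute x = sqrt(2)*t:  E[f(X)] ~ (1/sqrt(pi)) * sum_i w_i f(sqrt(2) t_i)
nodes, weights = np.polynomial.hermite.hermgauss(20)

def normal_expectation(f):
    """Approximate E[f(X)] for X ~ N(0,1) by 20-point Gauss-Hermite quadrature."""
    return weights @ f(np.sqrt(2.0) * nodes) / np.sqrt(np.pi)

print(normal_expectation(lambda x: x**2))  # ≈ 1.0 (the variance of N(0,1))
```

Because a 20-point rule is exact for polynomial integrands up to degree 39, the second moment is recovered to machine precision; adaptive variants recentre and rescale the nodes per examinee for better accuracy.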
Analysis of Analogy Use in Secondary Education Science Textbooks in Turkey
ERIC Educational Resources Information Center
Akçay, Süleyman
2016-01-01
Analogical reasoning is both an innate ability and a basic learning mechanism that can be improved. In classrooms, it is an important tool used by teachers, especially when explaining difficult or abstract issues. In addition to its use in all aspects of our lives, analogical reasoning is commonly used in textbooks. This research examines the…
Technology assessment of future intercity passenger transportation systems. Volume 1: Summary report
NASA Technical Reports Server (NTRS)
1976-01-01
Technical, economic, environmental, and sociopolitical issues associated with future intercity transportation system options were assessed. Technology assessment was used as a tool to assist in the identification of basic research and technology development tasks that should be undertaken. The emphasis was on domestic passenger transportation, but interfaces with freight and international transportation were considered.
Buying a Car: Using On-Line Tools. Technology Update.
ERIC Educational Resources Information Center
McCoy, Kimberly
This lesson plan was created to assist learners in adult basic and literacy education programs with the car-buying process. The goal for the lesson is to "effectively use the Internet to research necessary items before purchasing a car." These nine learning objectives are set: (1) determine what kind of car is needed; (2) determine how…
Towards open-source, low-cost haptics for surgery simulation.
Suwelack, Stefan; Sander, Christian; Schill, Julian; Serf, Manuel; Danz, Marcel; Asfour, Tamim; Burger, Wolfgang; Dillmann, Rüdiger; Speidel, Stefanie
2014-01-01
In minimally invasive surgery (MIS), virtual reality (VR) training systems have become a promising education tool. However, the adoption of these systems in research and clinical settings is still limited by the high costs of dedicated haptics hardware for MIS. In this paper, we present ongoing research towards an open-source, low-cost haptic interface for MIS simulation. We describe the basic mechanical design of the device and the sensor setup, as well as its software integration.
[Basic research during residency in Israel: is change needed?].
Fishbain, Dana; Shoenfeld, Yehuda; Ashkenazi, Shai
2013-10-01
A six-month research period is a mandatory part of the residency training program in most basic specialties in Israel and is named the "basic science period". This is the only period in an Israeli physician's medical career dedicated strictly to research, accentuating the importance of medical research to the quality of training and the level of medicine in Israel. One may argue, however, that in an era of physician shortage on the one hand and dizzying growth in medical knowledge on the other, every moment spent in residency training is precious, making the decision of whether to dedicate six months to research ever more relevant. This question has been raised for discussion once again by the Scientific Council of the Israeli Medical Association, which recently issued a call for comments, sent to all Israeli physicians, asking their opinion on several key questions regarding basic science research. Learning the public's opinion will serve as a background for the discussion. A total of 380 physicians responded to the call and specified their standpoint on the subject, among them heads of departments, units and clinics, senior physicians and residents. The findings pointed to strong support for maintaining the research period as part of residency training because of its importance to medical training and medicine, although half the respondents supported allowing various alternative formats for research alongside the existing format. These alternative format suggestions will be thoroughly reviewed. A smaller group of respondents supported allowing residents a choice between two tracks, with or without a research period, and only a few were in favor of canceling the research requirement altogether.
The writers maintain that the "basic science period" of research during residency training is vital, and that its contribution to the high level of specialists and of medicine requires its preservation. Nevertheless, alternative formats that might be suitable for some residents should be considered, and auxiliary tools should be constructed to help residents fulfill their potential in research and to raise the quality of written research papers.
Ferrer-Dufol, Ana; Menao-Guillen, Sebastian
2009-04-10
The relationship between basic research and its potential clinical applications is often a difficult subject. Clinical toxicology has always been very dependent on experimental research, whose usefulness has been impaired by huge inter- and intra-species differences in the toxicity expression of different substances, which make it difficult to predict clinical effects in humans. The new methods in molecular biology developed in recent decades are furnishing very useful tools to study some of the more relevant molecules involved in toxicokinetic and toxicodynamic processes. We aim to show some meaningful examples of how recent research developments with genes and proteins have clear applications for understanding significant clinical matters, such as inter-individual variations in susceptibility to chemicals, and other phenomena related to the way some substances act to induce variations in the expression and functionality of these targets.
Ruggiero, Rafael N.; Rossignoli, Matheus T.; De Ross, Jana B.; Hallak, Jaime E. C.; Leite, Joao P.; Bueno-Junior, Lezio S.
2017-01-01
Much of our knowledge of the endocannabinoid system in schizophrenia comes from behavioral measures in rodents, like prepulse inhibition of the acoustic startle and open-field locomotion, which are commonly used along with neurochemical approaches or drug challenge designs. Such methods continue to map fundamental mechanisms of sensorimotor gating, hyperlocomotion, social interaction, and underlying monoaminergic, glutamatergic, and GABAergic disturbances. These strategies will require, however, a greater use of neurophysiological tools to better inform clinical research. In this sense, electrophysiology and viral vector-based circuit dissection, like optogenetics, can further elucidate how exogenous cannabinoids worsen (e.g., tetrahydrocannabinol, THC) or ameliorate (e.g., cannabidiol, CBD) schizophrenia symptoms, like hallucinations, delusions, and cognitive deficits. Also, recent studies point to a complex endocannabinoid-endovanilloid interplay, including the influence of anandamide (endogenous CB1 and TRPV1 agonist) on cognitive variables, such as aversive memory extinction. In fact, growing interest has been devoted to TRPV1 receptors as promising therapeutic targets. Here, these issues are reviewed with an emphasis on the neurophysiological evidence. First, we contextualize imaging and electrographic findings in humans. Then, we present a comprehensive review on rodent electrophysiology. Finally, we discuss how basic research will benefit from further combining psychopharmacological and neurophysiological tools. PMID:28680405
Information resources at the National Center for Biotechnology Information.
Woodsmall, R M; Benson, D A
1993-01-01
The National Center for Biotechnology Information (NCBI), part of the National Library of Medicine, was established in 1988 to perform basic research in the field of computational molecular biology as well as to build and distribute molecular biology databases. The basic research has led to new algorithms and analysis tools for interpreting genomic data and has been instrumental in the discovery of human disease genes for neurofibromatosis and Kallmann syndrome. The principal database responsibility is the National Institutes of Health (NIH) genetic sequence database, GenBank. NCBI, in collaboration with international partners, builds, distributes, and provides online and CD-ROM access to over 112,000 DNA sequences. Another major program is the integration of multiple sequence databases and related bibliographic information, and the development of network-based retrieval systems for Internet access. PMID:8374583
Introduction to metabolomics and its applications in ophthalmology
Tan, S Z; Begley, P; Mullard, G; Hollywood, K A; Bishop, P N
2016-01-01
Metabolomics is the study of endogenous and exogenous metabolites in biological systems, which aims to provide comparative semi-quantitative information about all metabolites in the system. Metabolomics is an emerging and potentially powerful tool in ophthalmology research. It is therefore important for health professionals and researchers involved in the speciality to understand the basic principles of metabolomics experiments. This article provides an overview of the experimental workflow and examples of its use in ophthalmology research from the study of disease metabolism and pathogenesis to identification of biomarkers. PMID:26987591
Malaria Evolution in South Asia: Knowledge for Control and Elimination
Narayanasamy, Krishnamoorthy; Chery, Laura; Basu, Analabha; Duraisingh, Manoj T.; Escalante, Ananias; Fowble, Joseph; Guler, Jennifer L.; Herricks, Thurston; Kumar, Ashwani; Majumder, Partha; Maki, Jennifer; Mascarenhas, Anjali; Rodrigues, Janneth; Roy, Bikram; Sen, Somdutta; Shastri, Jayanthi; Smith, Joseph; Valecha, Neena; White, John; Rathod, Pradipsinh K.
2013-01-01
The study of malaria parasites on the Indian subcontinent should help us understand unexpected disease outbreaks and unpredictable disease presentations from Plasmodium falciparum and from Plasmodium vivax infections. The Malaria Evolution in South Asia (MESA) research program is one of ten International Centers of Excellence for Malaria Research (ICEMR) sponsored by the US National Institutes of Health. In this second of two reviews, we describe why population structures of Plasmodia in India will be characterized and how we will determine their consequences on disease presentation, outcome and patterns. Specific projects will determine if genetic diversity, possibly driven by parasites with higher genetic plasticity, plays a role in changing epidemiology, pathogenesis, vector competence of parasite populations, and whether innate human genetic traits protect Indians from malaria today. Deep local clinical knowledge of malaria in India will be supplemented by basic scientists who bring new research tools. Such tools will include whole genome sequencing and analysis methods; in vitro assays to measure genome plasticity, RBC cytoadhesion, invasion, and deformability; mosquito infectivity assays to evaluate changing parasite-vector compatibilities; and host genetics to understand protective traits in Indian populations. The MESA-ICEMR study sites span diagonally across India, including a mixture of very urban and rural hospitals, each with very different disease patterns and patient populations. Research partnerships include government-associated research institutes, private medical schools, city and state government hospitals, and hospitals with industry ties. Between 2012 and 2017, in addition to developing clinical research and basic science infrastructure at new clinical sites, our training workshops will engage new scientists and clinicians throughout South Asia in the malaria research field. PMID:22266213
Making Basic Science Studies in Glaucoma More Clinically Relevant: The Need for a Consensus.
Toris, Carol B; Gelfman, Claire; Whitlock, Andy; Sponsel, William E; Rowe-Rendleman, Cheryl L
2017-09-01
Glaucoma is a chronic, progressive, and debilitating optic neuropathy that causes retinal damage and visual defects. The pathophysiologic mechanisms of glaucoma remain ill-defined, and there is an indisputable need for contributions from basic science researchers in defining pathways for translational research. However, glaucoma researchers today face significant challenges due to the lack of a map of integrated pathways from bench to bedside and the lack of consensus statements to guide in choosing the right research questions, techniques, and model systems. Here, we present the case for the development of such maps and consensus statements, which are critical for faster development of the most efficacious glaucoma therapy. We underscore that interrogating the preclinical path of both successful and unsuccessful clinical programs is essential to defining future research. One aspect of this is evaluation of available preclinical research tools. To begin this process, we highlight the utility of currently available animal models for glaucoma and emphasize that there is a particular need for models of glaucoma with normal intraocular pressure. In addition, we outline a series of discoveries from cell-based, animal, and translational research that begin to reveal a map of glaucoma from cell biology to physiology to disease pathology. Completion of these maps requires input and consensus from the global glaucoma research community. This article sets the stage by outlining various approaches to such a consensus. Together, these efforts will help accelerate basic science research, leading to discoveries with significant clinical impact for people with glaucoma.
Data Standards for Flow Cytometry
SPIDLEN, JOSEF; GENTLEMAN, ROBERT C.; HAALAND, PERRY D.; LANGILLE, MORGAN; MEUR, NOLWENN LE; OCHS, MICHAEL F.; SCHMITT, CHARLES; SMITH, CLAYTON A.; TREISTER, ADAM S.; BRINKMAN, RYAN R.
2009-01-01
Flow cytometry (FCM) is an analytical tool widely used in cancer and HIV/AIDS research and treatment, stem cell manipulation, and the detection of microorganisms in environmental samples. Current data standards do not capture the full scope of FCM experiments, and there is a demand for software tools that can assist in the exploration and analysis of large FCM datasets. We are implementing a standardized approach to capturing, analyzing, and disseminating FCM data that will facilitate both more complex analyses and analysis of datasets that could not previously be efficiently studied. Initial work has focused on developing a community-based guideline for recording and reporting the details of FCM experiments. Open source software tools that implement this standard are being created, with an emphasis on facilitating reproducible and extensible data analyses. In addition, tools for electronic collaboration will assist the integrated access and comprehension of experiments to empower users to collaborate on FCM analyses. This coordinated, joint development of bioinformatics standards and software tools for FCM data analysis has the potential to greatly facilitate both basic and clinical research, impacting a notably diverse range of medical and environmental research areas. PMID:16901228
Recent developments in research and treatment for social phobia (social anxiety disorder).
Cottraux, Jean
2005-01-01
This review covers three themes of research that have brought fresh data useful for clinical practice in a handicapping anxiety disorder: social phobia. Recent findings derived from basic biological research, new forms of psychological therapy, and recent controlled psychopharmacology trials are reviewed. The basic neuroimaging research suggests that greater activation of the amygdala to novel versus familiar faces may be an underlying trait marker for social phobia. Social phobia may represent a phenotype that expresses a genetically driven trait of social withdrawal, which may be related to infantile inhibited temperament (Kagan's syndrome). The development of virtual reality therapy as a therapeutic tool for social phobia appeared promising in one controlled, but not randomized, study. A controlled study suggests that social phobia in children can be effectively treated with cognitive behavioural therapy; this represents an extension of the work done with adults. Venlafaxine appears to be an effective short-term treatment for social anxiety disorder in two controlled studies. A new compound, pregabalin, appeared clearly effective in a positive controlled study; this trial marks the advent of a new pharmacological lineage for social phobia. Both venlafaxine and pregabalin, however, have been examined only in short-term studies, and longer follow-up and relapse-prevention studies are warranted. Neuroimaging research points to a temperamental basis for social phobia. Virtual reality therapy is an emerging tool for carrying out exposure treatment. Group cognitive behavioural therapy can be extended successfully to children. Venlafaxine and pregabalin have proven short-term effectiveness in social phobia.
Evans-Agnew, Robin A; Postma, Julie; Camacho, Ariana Ochoa; Hershberg, Rachel M; Trujilio, Elsa; Tinajera, Maria
2018-01-01
Childhood marks the highest risk for allergic sensitization to asthma triggers. Hispanic/Latino children are at higher risk for hospitalization for asthma than non-Hispanic White children, and childcare providers lack knowledge about reducing asthma triggers. The purpose of this paper is to describe a community-based participatory research (CBPR) initiative aimed at developing and pilot testing a bilingual walk-through assessment tool for asthma-friendly childcare environments. Ten Latina mothers of children with asthma living in the Pacific Northwest collaborated with research partners to develop and pilot test a Childcare Environmental Health (CEH) assessment walk-through survey. Results and Lessons Learned: The mothers enhanced the survey with photography and with items examining stress and the provision of basic needs. The survey tool identified environmental threats to asthma in all three childcare facilities surveyed. Parents are well positioned to build trust with childcare providers, assess asthma triggers, and recommend practical mitigation strategies.
Towards early software reliability prediction for computer forensic tools (case study).
Abu Talib, Manar
2016-01-01
Versatility, flexibility and robustness are essential requirements for software forensic tools, and researchers and practitioners need to put more effort into assessing tools of this type. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete-time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
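As a hedged sketch of the underlying idea (the transition probabilities below are hypothetical, not taken from the FTK Imager case study), an architecture-based reliability figure can be computed from an absorbing Markov chain over the tool's components:

```python
import numpy as np

# Transient states: components C1 and C2.  Q[i][j] is the probability that
# control passes correctly from component i to component j; c[i] is the
# probability that component i terminates in the "success" absorbing state.
# All remaining probability mass falls into an implicit "failure" state.
Q = np.array([[0.0, 0.9],   # C1 hands off to C2 correctly with p = 0.9
              [0.0, 0.0]])
c = np.array([0.0, 0.95])   # C2 delivers the final result with p = 0.95

# Absorption probabilities of an absorbing Markov chain: R = (I - Q)^-1 c.
# R[0] is the system reliability when execution starts in C1.
R = np.linalg.solve(np.eye(2) - Q, c)
print(R[0])  # ≈ 0.855  (= 0.9 * 0.95 for this simple chain)
```

Each component either hands control to the next one correctly or drops into the implicit failure state; the linear solve yields the probability of eventually reaching the success state from each starting component.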
Translational research needs us to go back to basics and collaborate: interview with Lars Sundstrom.
Sundstrom, Lars
2016-09-01
Lars Sundstrom is Director of Enterprise and Translation at the West of England Academic Health Sciences Network (UK), a Professor of Practice in Translational Medicine and Co-Director of the Elizabeth Blackwell Institute for Health Research at Bristol University (UK), and an honorary Professor of Medicine at Cardiff University (UK). He has extensive experience in translational medicine and clinical neurosciences, holding positions at several eminent universities. He has also held executive and board-level positions at several SMEs, developing new therapeutics for neurological conditions and tools for drug discovery. He has also been an advisor to several UK and local government task forces and to the European Commission and the European Federation of Pharmaceutical Industry Associations. He was a founding member of the European Brain Council in Brussels, and set up the Severnside Alliance for Translational Research, developing a regional network partnership to link clinical and basic scientists. He was also involved in the creation of Health Research Wales.
Optical Coherence Tomography: Basic Concepts and Applications in Neuroscience Research
2017-01-01
Optical coherence tomography (OCT) is a micrometer-scale imaging modality that permits label-free, cross-sectional imaging of biological tissue microstructure using tissue backscattering properties. Since its invention in the 1990s, OCT has become widely used in several branches of neuroscience as well as other fields of biomedical science. This review provides an overview of OCT's applications in branches of neuroscience such as neuroimaging, neurology, neurosurgery, neuropathology, and neuroembryology. It briefly summarizes recent applications of OCT in neuroscience research, includes a comparison of these applications, and discusses the remaining challenges and opportunities as well as future directions. The chief aim of the review is to draw the attention of the broad neuroscience community in order to maximize the applications of OCT in other branches of neuroscience, and it may also serve as a benchmark for future OCT-based neuroscience research. Despite some limitations, OCT proves to be a useful imaging tool in both basic and clinical neuroscience research. PMID:29214158
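One of the basic concepts behind OCT's micrometer-scale resolution can be made concrete: for a source with a Gaussian spectrum, the free-space axial resolution follows the standard relation Δz = (2 ln 2/π) · λ0²/Δλ. The snippet below is an illustration with typical (hypothetical) source values, not figures taken from this review:

```python
import math

def axial_resolution(center_wavelength_m, bandwidth_m):
    """Free-space axial resolution of OCT for a Gaussian-spectrum source."""
    # Broader source bandwidth -> shorter coherence length -> finer resolution
    return (2 * math.log(2) / math.pi) * center_wavelength_m**2 / bandwidth_m

# A typical retinal-OCT source: 840 nm center wavelength, 50 nm bandwidth
print(axial_resolution(840e-9, 50e-9) * 1e6)  # ≈ 6.2 micrometres
```

The inverse dependence on bandwidth is why broadband or swept sources are central to high-resolution OCT systems.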
National Combustion Code: A Multidisciplinary Combustor Design System
NASA Technical Reports Server (NTRS)
Stubbs, Robert M.; Liu, Nan-Suey
1997-01-01
The Internal Fluid Mechanics Division conducts basic research and technology development, as well as systems technology research, for aerospace propulsion system components. The research within the division, which is both computational and experimental, is aimed at improving fundamental understanding of flow physics in inlets, ducts, nozzles, turbomachinery, and combustors. This article and the following three articles highlight some of the work accomplished in 1996. A multidisciplinary combustor design system is critical for optimizing the combustor design process. Such a system should include sophisticated computer-aided design (CAD) tools for geometry creation, advanced mesh generators for creating solid model representations, a common framework for fluid flow and structural analyses, modern postprocessing tools, and parallel processing. The goal of the present effort is to develop some of the enabling technologies and to demonstrate their overall performance in an integrated system called the National Combustion Code.
Goulding, F S; Stone, Y
1970-10-16
The past decade has seen the rapid development and exploitation of one of the most significant tools of nuclear physics, the semiconductor radiation detector. Applications of the device to the analysis of materials promise to be one of the major contributions of nuclear research to technology, and may even assist in some aspects of our environmental problems. In parallel with the development of these applications, further developments in detectors for nuclear research are taking place: the use of very thin detectors for heavy-ion identification, position-sensitive detectors for nuclear-reaction studies, and very pure germanium for making more satisfactory detectors for many applications suggest major future contributions to physics.
FRED 2: an immunoinformatics framework for Python
Schubert, Benjamin; Walzer, Mathias; Brachvogel, Hans-Philipp; Szolek, András; Mohr, Christopher; Kohlbacher, Oliver
2016-01-01
Summary: Immunoinformatics approaches are widely used in a variety of applications from basic immunological to applied biomedical research. Complex data integration is inevitable in immunological research and usually requires comprehensive pipelines including multiple tools and data sources. Non-standard input and output formats of immunoinformatics tools make the development of such applications difficult. Here we present FRED 2, an open-source immunoinformatics framework offering easy and unified access to methods for epitope prediction and other immunoinformatics applications. FRED 2 is implemented in Python and designed to be extendable and flexible to allow rapid prototyping of complex applications. Availability and implementation: FRED 2 is available at http://fred-2.github.io Contact: schubert@informatik.uni-tuebingen.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153717
Combining Induced Pluripotent Stem Cells and Genome Editing Technologies for Clinical Applications.
Chang, Chia-Yu; Ting, Hsiao-Chien; Su, Hong-Lin; Jeng, Jing-Ren
2018-01-01
In this review, we introduce current developments in induced pluripotent stem cells (iPSCs), site-specific nuclease (SSN)-mediated genome editing tools, and the combined application of these two novel technologies in biomedical research and therapeutic trials. The sustainable pluripotent property of iPSCs in vitro not only provides unlimited cell sources for basic research but also benefits precision medicines for human diseases. In addition, rapidly evolving SSN tools efficiently tailor genetic manipulations for exploring gene functions and can be utilized to correct genetic defects of congenital diseases in the near future. Combining iPSC and SSN technologies will create new reliable human disease models with isogenic backgrounds in vitro and provide new solutions for cell replacement and precise therapies.
Appraising Quantitative Research in Health Education: Guidelines for Public Health Educators
Hayes, Sandra C.; Scharalda, Jeanfreau G.; Stetson, Barbara; Jones-Jack, Nkenge H.; Valliere, Matthew; Kirchain, William R.; Fagen, Michael; LeBlanc, Cris
2010-01-01
Many practicing health educators do not feel they possess the skills necessary to critically appraise quantitative research. This publication is designed to provide practicing health educators with basic tools to facilitate a better understanding of quantitative research. This article describes the major components of quantitative research reports: the title, introduction, methods, analyses, results, and discussion sections. Readers are introduced to the various types of study designs and to seven key questions health educators can use to facilitate the appraisal process. Upon reading, health educators will be in a better position to determine whether research studies are well designed and executed. PMID:20400654
ERIC Educational Resources Information Center
Grover, Anita; Lam, Tai Ning; Hunt, C. Anthony
2008-01-01
We present a simulation tool to aid the study of basic pharmacology principles. By taking advantage of the properties of agent-based modeling, the tool facilitates a mechanistic approach to learning basic concepts, in contrast to traditional empirical methods. Pharmacodynamics is a particular aspect of pharmacology that can benefit from…
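As a minimal illustration of the kind of dose-effect reasoning this abstract refers to, the classic Emax model can be sketched in a few lines. This is a standard textbook pharmacodynamics relationship, not the authors' agent-based tool, and the parameter values are invented for the example:

```python
# A classic pharmacodynamics relationship (the Emax model), shown as a
# simple illustration of a mechanistic dose-effect curve. Parameter
# values (e_max, ec50) are assumed example numbers, not from the study.

def emax_effect(conc, e_max=100.0, ec50=2.0):
    """Effect as a saturating function of drug concentration:
    E = Emax * C / (EC50 + C). At C = EC50 the effect is half-maximal."""
    return e_max * conc / (ec50 + conc)

for c in (0.5, 2.0, 8.0, 32.0):
    print(f"C = {c:5.1f}  ->  effect = {emax_effect(c):5.1f}")
```

Because the curve saturates, doubling the dose far above EC50 barely changes the effect, which is the kind of mechanistic insight a simulation tool can make concrete.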
ERIC Educational Resources Information Center
Mubaslat, Mania Moayad
2012-01-01
This study attempts to determine the role of educational games in learning a foreign language and to compare games with more traditional practices as effective learning tools for basic-stage students at governmental schools in Jordan. An experimental study was conducted using three of six randomly selected groups. To determine the…
ERIC Educational Resources Information Center
Appelt, Wolfgang; Mambrey, Peter
The GMD (German National Research Center for Information Technology) has developed the BSCW (Basic Support for Cooperative Work) Shared Workspace system within the last four years with the goal of transforming the Web from a primarily passive information repository to an active cooperation medium. The BSCW system is a Web-based groupware tool for…
ERIC Educational Resources Information Center
Benton, Morgan C.
2008-01-01
This dissertation sought to answer the question: Is it possible to build a software tool that will allow teachers to write better multiple-choice questions? The thesis proceeded from the finding that the quality of teaching strongly influences how much students learn. A basic premise of this research, then, is that improving teachers…
A Methodology for Teaching Rhetorical Fundamentals in a Course Centered Around Social Movements.
ERIC Educational Resources Information Center
Palleschi, Patricia
This paper attempts to incorporate current research on the rhetoric of social movements into a coherent syllabus for a basic rhetoric course. It deals with adapting the rhetorical tools at the beginning student's disposal into a procedure for analyzing social movements, and provides that such an analysis give the…
ERIC Educational Resources Information Center
Mueller, Charles M.; Jacobsen, Natalia D.
2016-01-01
Qualitative research focusing primarily on advanced-proficiency second language (L2) learners suggests that online corpora can function as useful reference tools for language learners, especially when addressing phraseological issues. However, the feasibility and effectiveness of online corpus consultation for learners at a basic level of L2…
NASA Astrophysics Data System (ADS)
Barbier, Geoffrey; Liu, Huan
The rise of online social media is providing a wealth of social network data. Data mining techniques provide researchers and practitioners the tools needed to analyze large, complex, and frequently changing social media data. This chapter introduces the basics of data mining, reviews social media, discusses how to mine social media data, and highlights some illustrative examples with an emphasis on social networking sites and blogs.
CEREBRA: a 3-D visualization tool for brain network extracted from fMRI data.
Nasir, Baris; Yarman Vural, Fatos T
2016-08-01
In this paper, we introduce a new tool, CEREBRA, to visualize the 3D network of the human brain extracted from fMRI data. The tool aims to analyze brain connectivity by representing the selected voxels as the nodes of the network. The edge weights among the voxels are estimated by considering the relationships among the voxel time series. The tool enables researchers to observe the active brain regions and the interactions among them by using graph-theoretic measures, such as the edge-weight and node-degree distributions. CEREBRA provides an interactive interface with basic display and editing options for researchers to study their hypotheses about the connectivity of the brain network. CEREBRA interactively simplifies the network by selecting the active voxels and the most correlated edge weights. Researchers may remove voxels and edges by using local and global thresholds selected in the window. Built-in graph reduction algorithms then eliminate the irrelevant regions, voxels, and edges, and display various properties of the network. The toolbox is capable of space-time representation of the voxel time series and estimated arc weights by using animated heat maps.
[Progress on neuropsychology and event-related potentials in patients with brain trauma].
Dong, Ri-xia; Cai, Wei-xiong; Tang, Tao; Huang, Fu-yin
2010-02-01
With the development of information technology, event-related potentials (ERPs), one of the research frontiers in neurophysiology, have drawn increasing attention from international scholars, as they provide a feasible and objective method for exploring cognitive function. This new assessment tool has driven many advances in neuropsychology in recent years. This article reviews the basic theories in the fields of ERP and neuropsychology. The review focuses on research and development in evaluating the cognitive function of patients with syndromes following brain trauma, and also explores perspectives for future ERP research.
Standardized data sharing in a paediatric oncology research network--a proof-of-concept study.
Hochedlinger, Nina; Nitzlnader, Michael; Falgenhauer, Markus; Welte, Stefan; Hayn, Dieter; Koumakis, Lefteris; Potamias, George; Tsiknakis, Manolis; Saraceno, Davide; Rinaldi, Eugenia; Ladenstein, Ruth; Schreier, Günter
2015-01-01
Data collected in the course of clinical trials are potentially valuable for additional scientific research questions in so-called secondary use scenarios. This is of particular importance in rare disease areas like paediatric oncology. If data from several research projects need to be connected, so-called Core Datasets can be used to define which information needs to be extracted from every involved source system. In this work, the utility of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM) as a format for Core Datasets was evaluated, and a web tool was developed which received source ODM XML files and, via Extensible Stylesheet Language Transformation (XSLT), generated standardized Core Dataset ODM XML files. Using this tool, data from different source systems were extracted and pooled for joint analysis in a proof-of-concept study, facilitating both basic syntactic and semantic interoperability.
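The Core Dataset idea described above, extracting only an agreed subset of items from each source system's export, can be sketched with the Python standard library. This is an illustrative toy, not the authors' XSLT-based tool: the element and attribute names below are a much-simplified ODM-like layout, not the real CDISC schema.

```python
# Minimal sketch (assumed, simplified ODM-like XML, not the CDISC schema):
# keep only the items named in a Core Dataset definition and emit a
# reduced document suitable for pooling across source systems.
import xml.etree.ElementTree as ET

SOURCE_ODM = """<ODM>
  <ClinicalData StudyOID="PAED-ONC-01">
    <ItemData ItemOID="AGE" Value="7"/>
    <ItemData ItemOID="DIAGNOSIS" Value="neuroblastoma"/>
    <ItemData ItemOID="LOCAL_NOTE" Value="free text, not shared"/>
  </ClinicalData>
</ODM>"""

CORE_ITEMS = {"AGE", "DIAGNOSIS"}  # the Core Dataset definition

def extract_core(odm_xml: str, core_items: set) -> str:
    """Copy only whitelisted ItemData elements into a new document."""
    root = ET.fromstring(odm_xml)
    out = ET.Element("ODM")
    for cd in root.findall("ClinicalData"):
        new_cd = ET.SubElement(out, "ClinicalData", StudyOID=cd.get("StudyOID"))
        for item in cd.findall("ItemData"):
            if item.get("ItemOID") in core_items:
                ET.SubElement(new_cd, "ItemData",
                              ItemOID=item.get("ItemOID"),
                              Value=item.get("Value"))
    return ET.tostring(out, encoding="unicode")

core_xml = extract_core(SOURCE_ODM, CORE_ITEMS)
print(core_xml)
```

The real tool performs this reduction declaratively with XSLT stylesheets over standard ODM files, which keeps the extraction rule itself in a shareable, tool-independent format.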
Concept cartoons for diagnosing student’s misconceptions in the topic of buffers
NASA Astrophysics Data System (ADS)
Kusumaningrum, I. A.; Ashadi; Indriyanti, N. Y.
2018-05-01
Student’s misconceptions have been concerned over twenty years in the chemistry education research. It influences students to learn new knowledge and gain a correct concept. The buffer solution is found as a difficult topic due to student’s misconception. However, the research related this subject are still rare. Concept cartoon has been used as one of the effective tools to diagnose misconceptions. This study aims to identify the effectiveness of concept cartoon to diagnose them. The concept cartoon consists of three concept questions. 98 students of grade 11 as respondents of this research and followed by interview for selected students. The data obtain of the study are analyzed by using a scoring key. The detected misconceptions are about what buffers do, what buffers are, and how buffers are able to do what they do. Concept cartoon is potential as a basic tool for remedial teaching.
The 'whole-animal approach' as a heuristic principle in neuroscience research.
Serani-Merlo, Alejandro; Paz, Rodrigo; Castillo, Andrés
2005-01-01
Neuroscience embraces a heterogeneous group of disciplines. A conceptual framework that allows a better articulation of these different theoretical and experimental perspectives is needed. A 'whole-animal approach is proposed as a theoretical and hermeneutic tool. To illustrate the potential of this point of view, an overview of the research that has been performed in the extinction of fear-conditioned responses from Pavlov to the present is discussed. This is an example of how a whole-animal-based approach may help to organize and integrate basic and clinical neuroscience research. Our proposal is in agreement with recent statements calling for more integrative approaches in biological and neuropsychiatric research.
Transputer parallel processing at NASA Lewis Research Center
NASA Technical Reports Server (NTRS)
Ellis, Graham K.
1989-01-01
The transputer parallel processing lab at NASA Lewis Research Center (LeRC) consists of 69 processors (transputers) that can be connected into various networks for use in general purpose concurrent processing applications. The main goal of the lab is to develop concurrent scientific and engineering application programs that will take advantage of the computational speed increases available on a parallel processor over the traditional sequential processor. Current research involves the development of basic programming tools. These tools will help standardize program interfaces to specific hardware by providing a set of common libraries for applications programmers. The thrust of the current effort is in developing a set of tools for graphics rendering/animation. The applications programmer currently has two options for on-screen plotting. One option can be used for static graphics displays and the other can be used for animated motion. The option for static display involves the use of 2-D graphics primitives that can be called from within an application program. These routines perform the standard 2-D geometric graphics operations in real-coordinate space as well as allowing multiple windows on a single screen.
2009-01-01
Virtually all research on basic mechanisms of aging has used species that are short lived and thus demonstrably unsuccessful at combating basic aging processes. A novel comparative approach would use a diversity of populations and species, focusing on those with particularly long, healthy lives, seeking the causative mechanisms that distinguish them from shorter lived relatives. Species of interest from this perspective include the naked mole rat, a mouse-size rodent that lives up to 30 years in the laboratory, and the little brown bat, which lives up to 34 years in the wild. Comparisons among dogs of different sizes, which differ by more than 50% in health span, might also prove rewarding, as might novel species chosen because of their similarity to humans in certain key traits. Primates, because of their sophisticated cognitive ability, are a group of special value, and small, short-lived primates like the common marmoset might prove especially beneficial. Cell repositories and tissue banks from key species, as well as genomic and analytic tools optimized for comparative studies, would make valuable contributions to a new comparative approach to basic aging research. PMID:19223603
Safety issues in high speed machining
NASA Astrophysics Data System (ADS)
1994-05-01
There are several risks related to high-speed milling, but so far they have not been systematically identified or studied. Increased loads from high centrifugal forces may produce dramatic hazards: tools or tool fragments flying off with high kinetic energy may injure surrounding people and damage machines and devices. In the project, mechanical risks were evaluated, theoretical values for the kinetic energies of rotating tools were calculated, possible damage from flying objects was determined, and measures to eliminate the risks were considered. The noise levels of the high-speed machining center owned by the Helsinki University of Technology (HUT) and the Technical Research Center of Finland (VTT) were measured in a practical machining situation, and the results were compared to those obtained after basic preventive measures were taken.
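The kinetic-energy hazard mentioned above follows from elementary mechanics: a fragment detached from a spinning tool leaves tangentially at the tip speed v = omega * r. A short sketch, with assumed example numbers rather than values from the study:

```python
# Illustrative calculation (not from the study): kinetic energy of a
# fragment detached from a rotating tool. Mass, radius, and spindle
# speed below are invented example values.
import math

def fragment_kinetic_energy(mass_kg: float, radius_m: float, rpm: float) -> float:
    """E = 1/2 m v^2, with tip speed v = omega * r and omega = 2*pi*rpm/60."""
    omega = 2.0 * math.pi * rpm / 60.0      # angular velocity, rad/s
    v = omega * radius_m                    # tangential (tip) speed, m/s
    return 0.5 * mass_kg * v ** 2           # kinetic energy, joules

# A 50 g insert at 40 mm radius on a 30,000 rpm spindle:
e = fragment_kinetic_energy(0.050, 0.040, 30000)
print(f"{e:.0f} J")
```

Even these modest assumed numbers give a tip speed above 125 m/s and an energy of a few hundred joules, which makes clear why enclosure design is a central safety issue in high-speed machining.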
AI Tools for Foreign Language Training
1989-07-01
certain of the four basic language skills (reading, writing, speaking, hearing) are supported in this environment. While this argument is valid, we... skills. While this paper will not review the psycholinguistic parameters pertaining to foreign language learning, we mention it as one of the essential...Institute Technologies for Skill Acquisition and Retention Technical Area Zita M. Simutis, Chief Training Research Laboratory Jack H. Hiller, Director U.S
ERIC Educational Resources Information Center
Vermont Inst. for Self-Reliance, Rutland.
This guide provides a description of Responsive Text (RT), a method for presenting job-relevant information within a computer-based support system. A summary of what RT is and why it is important is provided first. The first section of the guide provides a brief overview of what research tells about the reading process and how the general design…
Semiconductor Characterization: from Growth to Manufacturing
NASA Astrophysics Data System (ADS)
Colombo, Luigi
The successful growth and/or deposition of materials for any application requires a basic understanding of the materials physics for a given device. At the beginning, the first and most obvious characterization tool is visual observation; this is particularly true for single crystal growth. The characterization tools are usually prioritized in order of ease of measurement, and have become especially sophisticated as we have moved from the characterization of macroscopic crystals and films to atomically thin materials and nanostructures. While a lot of attention is devoted to characterization and understanding of materials physics at the nano level, the characterization of single crystals as substrates or active components is still critically important. In this presentation, I will review and discuss the basic materials characterization techniques used to get to the materials physics to bring crystals and thin films from research to manufacturing in the fields of infrared detection, non-volatile memories, and transistors. Finally, I will present and discuss metrology techniques used to understand the physics and chemistry of atomically thin two-dimensional materials for future device applications.
Photorejuvenation: still not a fully established clinical tool for cosmetic treatment
NASA Astrophysics Data System (ADS)
Gong, Wei; Xie, Shusen; Li, Hui
2006-01-01
Several methods have been used to improve the esthetic appearance of photodamaged skin, including dermabrasion, chemical peels, and laser resurfacing using CO2 and Er:YAG lasers. These procedures sacrifice the epidermis, resulting in a long recuperation period and potential complications including persistent scarring, infection, and hyperpigmentation. Compared to ablative CO2 or Er:YAG laser resurfacing, non-ablative photorejuvenation technologies are playing an increasing role in the treatment of photodamaged skin; their clinical objective is to maximize thermal damage to the upper dermis while minimizing injury to the overlying skin. A variety of laser and non-laser systems have been used in the initial stage of this treatment. In our review, different treatment modalities have produced varying degrees of clinical effect. The basic mechanisms underlying the improvement achieved with non-ablative technologies are also discussed. According to our review, photorejuvenation is still not a fully established clinical tool for cosmetic treatment, and more research on its basic mechanisms is therefore needed.
Gainotti, Sabina; Torreri, Paola; Wang, Chiuhui Mary; Reihs, Robert; Mueller, Heimo; Heslop, Emma; Roos, Marco; Badowska, Dorota Mazena; de Paulis, Federico; Kodra, Yllka; Carta, Claudio; Martìn, Estrella Lopez; Miller, Vanessa Rangel; Filocamo, Mirella; Mora, Marina; Thompson, Mark; Rubinstein, Yaffa; Posada de la Paz, Manuel; Monaco, Lucia; Lochmüller, Hanns; Taruscio, Domenica
2018-05-01
In rare disease (RD) research, there is a huge need to systematically collect biomaterials, phenotypic data, and genomic data in a standardized way and to make them findable, accessible, interoperable, and reusable (FAIR). RD-Connect is a six-year global infrastructure project, initiated in November 2012, that links genomic data with patient registries, biobanks, and clinical bioinformatics tools to create a central research resource for RDs. Here, we present the RD-Connect Registry & Biobank Finder, a tool that helps RD researchers find RD biobanks and registries and provides information on the availability and accessibility of the content of each database. The finder concentrates information that is currently sparse across different repositories (inventories, websites, scientific journals, technical reports, etc.), including aggregated data and metadata from participating databases. Aggregated data provided by the finder, if appropriately checked, can be used by researchers who are trying to estimate the prevalence of a RD, to organize a clinical trial on a RD, or to estimate the volume of patients seen by different clinical centers. The finder is also a portal to other RD-Connect tools, providing a link to the RD-Connect Sample Catalogue, a large inventory of RD biological samples available in participating biobanks for RD research. There are several kinds of users and potential uses for the RD-Connect Registry & Biobank Finder, including researchers collaborating with academia and industry and dealing with questions of basic, translational, and/or clinical research. As of November 2017, the finder is populated with aggregated data for 222 registries and 21 biobanks.
Toward mapping the biology of the genome.
Chanock, Stephen
2012-09-01
This issue of Genome Research presents new results, methods, and tools from The ENCODE Project (ENCyclopedia of DNA Elements), which collectively represents an important step in moving beyond a parts list of the genome and promises to shape the future of genomic research. This collection sheds light on basic biological questions and frames the current debate over the optimization of tools and methodological challenges necessary to compare and interpret large complex data sets focused on how the genome is organized and regulated. In a number of instances, the authors have highlighted the strengths and limitations of current computational and technical approaches, providing the community with useful standards, which should stimulate development of new tools. In many ways, these papers will ripple through the scientific community, as those in pursuit of understanding the "regulatory genome" will heavily traverse the maps and tools. Similarly, the work should have a substantive impact on how genetic variation contributes to specific diseases and traits by providing a compendium of functional elements for follow-up study. The success of these papers should not only be measured by the scope of the scientific insights and tools but also by their ability to attract new talent to mine existing and future data.
Rose, Peter W; Prlić, Andreas; Bi, Chunxiao; Bluhm, Wolfgang F; Christie, Cole H; Dutta, Shuchismita; Green, Rachel Kramer; Goodsell, David S; Westbrook, John D; Woo, Jesse; Young, Jasmine; Zardecki, Christine; Berman, Helen M; Bourne, Philip E; Burley, Stephen K
2015-01-01
The RCSB Protein Data Bank (RCSB PDB, http://www.rcsb.org) provides access to 3D structures of biological macromolecules and is one of the leading resources in biology and biomedicine worldwide. Our efforts over the past 2 years focused on enabling a deeper understanding of structural biology and providing new structural views of biology that support both basic and applied research and education. Herein, we describe recently introduced data annotations including integration with external biological resources, such as gene and drug databases, new visualization tools and improved support for the mobile web. We also describe access to data files, web services and open access software components to enable software developers to more effectively mine the PDB archive and related annotations. Our efforts are aimed at expanding the role of 3D structure in understanding biology and medicine. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
Electrophoresis experiments in microgravity
NASA Technical Reports Server (NTRS)
Snyder, Robert S.; Rhodes, Percy H.
1991-01-01
The use of the microgravity environment to separate and purify biological cells and proteins has been a major activity since the beginning of the NASA Microgravity Science and Applications program. Purified populations of cells are needed for research, transplantation and analysis of specific cell constituents. Protein purification is a necessary step in research areas such as genetic engineering where the new protein has to be separated from the variety of other proteins synthesized from the microorganism. Sufficient data are available from the results of past electrophoresis experiments in space to show that these experiments were designed with incomplete knowledge of the fluid dynamics of the process including electrohydrodynamics. However, electrophoresis is still an important separation tool in the laboratory and thermal convection does limit its performance. Thus, there is a justification for electrophoresis but the emphasis of future space experiments must be directed toward basic research with model experiments to understand the microgravity environment and fluid analysis to test the basic principles of the process.
Thermoluminescence as a Research Tool to Investigate Luminescence Mechanisms
2017-01-01
Thermally stimulated luminescence (TSL) is known as a technique used in radiation dosimetry and dating. However, since the luminescence is very sensitive to the defects in a solid, it can also be used in material research. In this review, it is shown how TSL can be used as a research tool to investigate luminescent characteristics and underlying luminescent mechanisms. First, some basic characteristics and a theoretical background of the phenomenon are given. Next, methods and difficulties in extracting trapping parameters are addressed. Then, the instrumentation needed to measure the luminescence, both as a function of temperature and wavelength, is described. Finally, a series of very diverse examples is given to illustrate how TSL has been used in the determination of energy levels of defects, in the research of persistent luminescence phosphors, and in phenomena like band gap engineering, tunnelling, photosynthesis, and thermal quenching. It is concluded that in the field of luminescence spectroscopy, thermally stimulated luminescence has proven to be an experimental technique with unique properties to study defects in solids. PMID:29186873
Kirlian Photography as a Teaching Tool of Physics
NASA Astrophysics Data System (ADS)
Terrel, Andy; Thacker, Beth Ann, , Dr.
2002-10-01
There are a number of groups across the country working on redesigning introductory physics courses by incorporating physics education research and modeling, and by making the courses appeal to students in broader fields. We spent the summer exploring Kirlian photography, a subject that can be understood by students with a basic comprehension of electrostatics but is still questioned by many people in other fields. Kirlian photography's applications have captivated alternative medicine, but research from both physics and biology is still required to understand whether it has potential as a medical tool. We used a simple setup to reproduce the physics that has been done, to see if it could be used in an educational setting. I will demonstrate how Kirlian photography can be explained by physics, but also how the topic still needs research to completely understand its possible biological applications. By incorporating such a topic into a curriculum, one is able to teach students to explore supposed supernatural phenomena scientifically and to promote research among undergraduate students.
The t-test: An Influential Inferential Tool in Chaplaincy and Other Healthcare Research.
Jankowski, Katherine R B; Flannelly, Kevin J; Flannelly, Laura T
2018-01-01
The t-test developed by William S. Gosset (also known as Student's t-test and the two-sample t-test) is commonly used to compare one sample mean on a measure with another sample mean on the same measure. The outcome of the t-test is used to draw inferences about how different the samples are from each other. It is probably one of the most frequently relied upon statistics in inferential research. It is easy to use: a researcher can calculate the statistic with three simple tools: paper, pen, and a calculator. A computer program can quickly calculate the t-test for large samples. The ease of use can result in the misuse of the t-test. This article discusses the development of the original t-test, basic principles of the t-test, two additional types of t-tests (the one-sample t-test and the paired t-test), and recommendations about what to consider when using the t-test to draw inferences in research.
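The "paper, pen, and a calculator" computation described above can be written out directly. This is a generic sketch of the pooled-variance two-sample t statistic, with invented sample data; a table of Student's t distribution (or software) is then needed to turn t and its degrees of freedom into a p-value:

```python
# Hand-style sketch of Student's two-sample t-test with pooled variance.
# The sample data are invented for illustration only.
import math

def two_sample_t(a, b):
    """Return (t statistic, degrees of freedom) for two independent samples."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # unbiased sample variances
    sa2 = sum((x - ma) ** 2 for x in a) / (na - 1)
    sb2 = sum((x - mb) ** 2 for x in b) / (nb - 1)
    # pooled variance: ((na-1)*sa2 + (nb-1)*sb2) / (na + nb - 2)
    sp2 = ((na - 1) * sa2 + (nb - 1) * sb2) / (na + nb - 2)
    se = math.sqrt(sp2 * (1 / na + 1 / nb))  # standard error of the difference
    return (ma - mb) / se, na + nb - 2

group_a = [5.1, 4.9, 5.6, 5.2, 5.0]
group_b = [4.4, 4.7, 4.1, 4.5, 4.3]
t, df = two_sample_t(group_a, group_b)
print(f"t = {t:.2f} with {df} degrees of freedom")
```

The ease of this calculation is exactly the double-edged sword the article notes: it is just as easy to apply the test to data that violate its assumptions (independence, roughly normal distributions, similar variances).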
A Research Agenda to Underpin Malaria Eradication
Alonso, Pedro L.; Brown, Graham; Arevalo-Herrera, Myriam; Binka, Fred; Chitnis, Chetan; Collins, Frank; Doumbo, Ogobara K.; Greenwood, Brian; Hall, B. Fenton; Levine, Myron M.; Mendis, Kamini; Newman, Robert D.; Plowe, Christopher V.; Rodríguez, Mario Henry; Sinden, Robert; Slutsker, Laurence; Tanner, Marcel
2011-01-01
The interruption of malaria transmission worldwide is one of the greatest challenges for international health and development communities. The current expert view suggests that, by aggressively scaling up control with currently available tools and strategies, much greater gains could be achieved against malaria, including elimination from a number of countries and regions; however, even with maximal effort we will fall short of global eradication. The Malaria Eradication Research Agenda (malERA) complements the current research agenda—primarily directed towards reducing morbidity and mortality—with one that aims to identify key knowledge gaps and define the strategies and tools that will result in reducing the basic reproduction rate to less than 1, with the ultimate aim of eradication of the parasite from the human population. Sustained commitment from local communities, civil society, policy leaders, and the scientific community, together with a massive effort to build a strong base of researchers from the endemic areas will be critical factors in the success of this new agenda. PMID:21311579
A survey of tools for the analysis of quantitative PCR (qPCR) data.
Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas
2014-09-01
Real-time quantitative polymerase-chain-reaction (qPCR) is a standard technique in most laboratories used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview about quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
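Among the quantification strategies the survey mentions, the most widely used for relative RNA quantification is the 2^(-ΔΔCt) (Livak) method. A minimal sketch, with invented Ct values for illustration:

```python
# Minimal sketch of the 2^(-ΔΔCt) relative quantification strategy
# (Livak method). Ct values below are invented example numbers.

def relative_expression(ct_target_sample, ct_ref_sample,
                        ct_target_control, ct_ref_control):
    """Fold change of the target gene in a sample vs. a control condition,
    normalized to a reference (housekeeping) gene in each condition."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control
    return 2 ** (-dd_ct)  # assumes ~100% amplification efficiency

fold = relative_expression(ct_target_sample=22.0, ct_ref_sample=18.0,
                           ct_target_control=25.0, ct_ref_control=18.0)
print(f"fold change: {fold:.1f}")  # 2^3 = 8-fold up-regulation
```

The efficiency assumption baked into the exponent base of 2 is one reason dedicated analysis tools exist: many of the surveyed packages estimate per-assay amplification efficiencies rather than assuming perfect doubling per cycle.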
Tambe, Joshua; Minkande, Jacqueline Ze; Moifo, Boniface; Mbu, Robinson; Ongolo-Zogo, Pierre; Gonsu, Joseph
2014-12-21
Research activities for medical students and residents (trainees) are expected to serve as a foundation for the acquisition of basic research skills, and some medical schools therefore recommend research work as a partial requirement for certification. However, medical trainees face many difficulties with research, so potential remedial strategies need to be constantly developed and tested. Here, the views of medical trainees are assessed, followed by their use and appraisal of a novel "self-help" tool designed for the purposes of this study, with potential for refinement and wider application. This study was a cross-sectional survey of volunteering final-year medical students and residents of a medical school in Cameroon. It surveyed the opinions of 120 volunteers, of whom 82 (68%) were medical students. Three of the 82 (4%) medical students reported having participated in research activities resulting in a publication, versus 10 of 38 residents (26%). The difficulties in research reported by these trainees included referencing of material (84%), writing a research proposal (79%), searching for literature (73%), and knowledge of applicable statistical tests (72%), amongst others. All participants declared that the "self-help" tool was simple to use and guided them to think about and better understand their research focus. Medical trainees require much assistance with research, and "self-help" tools such as the template used in this study might be a useful adjunct to didactic lectures.
Intelligent tutoring systems as tools for investigating individual differences in learning
NASA Technical Reports Server (NTRS)
Shute, Valerie J.
1987-01-01
The ultimate goal of this research is to build an improved model-based selection and classification system for the United States Air Force. Researchers are developing innovative approaches to ability testing. The Learning Abilities Measurement Program (LAMP) examines individual differences in learning abilities, seeking answers to the questions of why some people learn more and better than others and whether there are basic cognitive processes applicable across tasks and domains that are predictive of successful performance (or whether there are more complex problem solving behaviors involved).
Barriers to the Preclinical Development of Therapeutics that Target Aging Mechanisms
Burd, Christin E.; Gill, Matthew S.; Niedernhofer, Laura J.; Robbins, Paul D.; Austad, Steven N.; Barzilai, Nir
2016-01-01
Through the progress of basic science research, fundamental mechanisms that contribute to age-related decline are being described with increasing depth and detail. Although these efforts have identified new drug targets and compounds that extend life span in model organisms, clinical trials of therapeutics that target aging processes remain scarce. Progress in aging research is hindered by barriers associated with the translation of basic science discoveries into the clinic. This report summarizes discussions held at a 2014 Geroscience Network retreat focused on identifying hurdles that currently impede the preclinical development of drugs targeting fundamental aging processes. From these discussions, it was evident that aging researchers have varied perceptions of the ideal preclinical pipeline. To forge a clear and cohesive path forward, several areas of controversy must first be resolved and new tools developed. Here, we focus on five key issues in preclinical drug development (drug discovery, lead compound development, translational preclinical biomarkers, funding, and integration between researchers and clinicians), expanding upon discussions held at the Geroscience Retreat and suggesting areas for further research. By bringing these findings to the attention of the aging research community, we hope to lay the foundation for a concerted preclinical drug development pipeline. PMID:27535964
Cellular automata and its applications in protein bioinformatics.
Xiao, Xuan; Wang, Pu; Chou, Kuo-Chen
2011-09-01
With the explosion of protein sequences generated in the postgenomic era, it is highly desirable to develop high-throughput tools for rapidly and reliably identifying various attributes of uncharacterized proteins based on their sequence information alone. The knowledge thus obtained can help us make timely use of these newly found protein sequences for both basic research and drug discovery. Many bioinformatics tools have been developed by means of machine learning methods. This review is focused on the applications of a new kind of science (cellular automata) in protein bioinformatics. A cellular automaton (CA) is an open, flexible and discrete dynamic model that holds enormous potential for modeling complex systems, in spite of the simplicity of the model itself. Researchers, scientists and practitioners from different fields have utilized cellular automata for visualizing protein sequences, investigating their evolution processes, and predicting their various attributes. Owing to its impressive power, intuitiveness and relative simplicity, the CA approach has great potential for use as a tool for bioinformatics.
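To make the cellular-automaton idea concrete, here is a minimal sketch of an elementary one-dimensional CA of the general kind the review discusses. The rule number (110) and the binary toy state are arbitrary illustrative choices, not the authors' actual protein model.

```python
# Illustrative sketch only: an elementary 1-D cellular automaton. The rule
# number and initial state are arbitrary, not the reviewed protein models.

def step(cells, rule=110):
    """Advance a binary CA one generation under an elementary Wolfram rule."""
    n = len(cells)
    nxt = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        pattern = (left << 2) | (center << 1) | right  # neighborhood code 0..7
        nxt.append((rule >> pattern) & 1)              # look up the rule bit
    return nxt

def evolve(cells, generations, rule=110):
    """Return the full space-time history of the automaton."""
    history = [list(cells)]
    for _ in range(generations):
        cells = step(cells, rule)
        history.append(cells)
    return history

# A single seeded cell evolved for four generations on a periodic ring.
history = evolve([0, 0, 0, 1, 0, 0, 0, 0], 4)
```

The space-time history produced this way is what CA-based visualizations of protein sequences render as an image, with the sequence encoded into the initial row.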
Web-based analysis and publication of flow cytometry experiments.
Kotecha, Nikesh; Krutzik, Peter O; Irish, Jonathan M
2010-07-01
Cytobank is a Web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a Web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permission, from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at http://www.cytobank.org. (c) 2010 by John Wiley & Sons, Inc.
The RCSB Protein Data Bank: views of structural biology for basic and applied research and education
Rose, Peter W.; Prlić, Andreas; Bi, Chunxiao; Bluhm, Wolfgang F.; Christie, Cole H.; Dutta, Shuchismita; Green, Rachel Kramer; Goodsell, David S.; Westbrook, John D.; Woo, Jesse; Young, Jasmine; Zardecki, Christine; Berman, Helen M.; Bourne, Philip E.; Burley, Stephen K.
2015-01-01
The RCSB Protein Data Bank (RCSB PDB, http://www.rcsb.org) provides access to 3D structures of biological macromolecules and is one of the leading resources in biology and biomedicine worldwide. Our efforts over the past 2 years focused on enabling a deeper understanding of structural biology and providing new structural views of biology that support both basic and applied research and education. Herein, we describe recently introduced data annotations including integration with external biological resources, such as gene and drug databases, new visualization tools and improved support for the mobile web. We also describe access to data files, web services and open access software components to enable software developers to more effectively mine the PDB archive and related annotations. Our efforts are aimed at expanding the role of 3D structure in understanding biology and medicine. PMID:25428375
A Researcher's Guide to Mass Spectrometry-Based Proteomics
Savaryn, John P.; Toby, Timothy K.; Kelleher, Neil L.
2016-01-01
Mass spectrometry (MS) is widely recognized as a powerful analytical tool for molecular research. MS is used by researchers around the globe to identify, quantify, and characterize biomolecules like proteins from any number of biological conditions or sample types. As instrumentation has advanced, and with the coupling of liquid chromatography (LC) for high-throughput LC-MS/MS, a proteomics experiment measuring hundreds to thousands of proteins/protein groups is now commonplace. While expert practitioners who best understand the operation of LC-MS systems tend to have strong backgrounds in physics and engineering, consumers of proteomics data and technology are not exposed to the physio-chemical principles underlying the information they seek. Since articles and reviews tend not to focus on bridging this divide, our goal here is to span this gap and translate MS ion physics into language intuitive to the general reader active in basic or applied biomedical research. Here, we visually describe what happens to ions as they enter and move around inside a mass spectrometer. We describe basic MS principles, including electric current, ion optics, ion traps, quadrupole mass filters, and Orbitrap FT-analyzers. PMID:27553853
Indicators for the use of robotic labs in basic biomedical research: a literature analysis
2017-01-01
Robotic labs, in which experiments are carried out entirely by robots, have the potential to provide a reproducible and transparent foundation for performing basic biomedical laboratory experiments. In this article, we investigate whether these labs could be applicable in current experimental practice. We do this by text mining 1,628 papers for occurrences of methods that are supported by commercial robotic labs. Using two different concept recognition tools, we find that 86%–89% of the papers have at least one of these methods. This and our other results provide indications that robotic labs can serve as the foundation for performing many lab-based experiments. PMID:29134146
Heavy Equipment Mechanic. Instructor Edition.
ERIC Educational Resources Information Center
Hendrix, Laborn J.; And Others
This manual is intended to assist heavy equipment instructors in teaching the latest concepts and functions of heavy equipment. It includes 7 sections and 27 instructional units. Sections (and units) are: orientation (shop safety and first aid, hand tools and miscellaneous tools, measuring, basic rigging and hoisting), engines (basic engine…
ERIC Educational Resources Information Center
Bjorkquist, David C.
A job-oriented program emphasizing application to the specific occupation of tool design was compared with a field-oriented program intended to give a broad basic preparation for a variety of jobs in the field of mechanical technology. Both programs were conducted under the Manpower Development and Training Act (MDTA) for a period of 52 weeks.…
Integrated Modeling and Analysis of Physical Oceanographic and Acoustic Processes
2015-09-30
goal is to improve ocean physical state and acoustic state predictive capabilities. The goal fitting the scope of this project is the creation of... Project-scale objectives are to complete targeted studies of oceanographic processes in a few regimes, accompanied by studies of acoustic propagation... by the basic research efforts of this project. An additional objective is to develop improved computational tools for acoustics and for the
Applications of the pipeline environment for visual informatics and genomics computations
2011-01-01
Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. 
The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102
More emotional facial expressions during episodic than during semantic autobiographical retrieval.
El Haj, Mohamad; Antoine, Pascal; Nandrino, Jean Louis
2016-04-01
There is a substantial body of research on the relationship between emotion and autobiographical memory. Using facial analysis software, our study addressed this relationship by investigating basic emotional facial expressions that may be detected during autobiographical recall. Participants were asked to retrieve 3 autobiographical memories, each of which was triggered by one of the following cue words: happy, sad, and city. The autobiographical recall was analyzed by facial analysis software that detects and classifies basic emotional expressions. Analyses showed that emotional cues triggered the corresponding basic facial expressions (i.e., happy facial expression for memories cued by happy). Furthermore, we dissociated episodic and semantic retrieval, observing more emotional facial expressions during episodic than during semantic retrieval, regardless of the emotional valence of cues. Our study provides insight into facial expressions that are associated with emotional autobiographical memory. It also highlights an ecological tool to reveal physiological changes that are associated with emotion and memory.
Fractional vector calculus for fractional advection dispersion
NASA Astrophysics Data System (ADS)
Meerschaert, Mark M.; Mortensen, Jeff; Wheatcraft, Stephen W.
2006-07-01
We develop the basic tools of fractional vector calculus including a fractional derivative version of the gradient, divergence, and curl, and a fractional divergence theorem and Stokes theorem. These basic tools are then applied to provide a physical explanation for the fractional advection-dispersion equation for flow in heterogeneous porous media.
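The operators described above are best motivated by the equation they support; a commonly quoted one-dimensional special case of the fractional advection-dispersion equation is:

```latex
% One-dimensional special case of the fractional advection-dispersion
% equation (FADE); C is solute concentration, v the mean velocity,
% D the dispersion coefficient.
\frac{\partial C}{\partial t}
  = -v\,\frac{\partial C}{\partial x}
  + D\,\frac{\partial^{\alpha} C}{\partial x^{\alpha}},
\qquad 1 < \alpha \le 2 .
```

Setting α = 2 recovers the classical advection-dispersion equation, while fractional orders α < 2 model the heavy-tailed solute plumes observed in heterogeneous porous media.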
Basic Engineer Equipment Mechanic.
ERIC Educational Resources Information Center
Marine Corps Inst., Washington, DC.
This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the skills needed by basic engineer equipment mechanics. Addressed in the four individual units of the course are the following topics: mechanics and their tools (mechanics, hand tools, and power…
Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.
ERIC Educational Resources Information Center
Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.
2002-01-01
Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing of undergraduates at the University of Virginia that tested a module designed to improve understanding of the psychological concepts of…
Basic Technology Tools for Administrators: Preparing for the New Millennium.
ERIC Educational Resources Information Center
Aguilera, Raymond; Hendricks, Joen M.
This paper suggests activities for school administrators to learn basic technology tools. Step-by-step instructions are provided for browsing and using the Internet, organizing favorite World Wide Web sites, and organizing Internet bookmarks. Interesting job search, legal, and professional organization Web sites for administrators are listed. A…
NASA Automated Fiber Placement Capabilities: Similar Systems, Complementary Purposes
NASA Technical Reports Server (NTRS)
Wu, K. Chauncey; Jackson, Justin R.; Pelham, Larry I.; Stewart, Brian K.
2015-01-01
New automated fiber placement systems at the NASA Langley Research Center and NASA Marshall Space Flight Center provide state-of-the-art composites capabilities to these organizations. These systems support basic and applied research at Langley, complementing large-scale manufacturing and technology development at Marshall. These systems each consist of a multi-degree-of-freedom mobility platform including a commercial robot, a commercial tool changer mechanism, a bespoke automated fiber placement end effector, a linear track, and a rotational tool support structure. In addition, new end effectors with advanced capabilities may be bought or developed with partners in industry and academia to extend the functionality of these systems. These systems will be used to build large and small composite parts in support of the ongoing NASA Composites for Exploration Upper Stage Project later this year.
[Modeling of carbon cycling in terrestrial ecosystem: a review].
Mao, Liuxi; Sun, Yanling; Yan, Xiaodong
2006-11-01
Terrestrial carbon cycling is one of the important issues in global change research, and carbon cycling modeling has become a necessary method and tool for understanding this cycle. This paper reviewed research progress in terrestrial carbon cycling, focusing on the basic framework of simulation modeling, two essential models of carbon cycling, and the classes of terrestrial carbon cycling models, and analyzed the present state of terrestrial carbon cycling modeling. It was pointed out that future research could build on the biophysical modeling of dynamic vegetation, and that such modeling could become an important component of earth system modeling.
Bioinformatics by Example: From Sequence to Target
NASA Astrophysics Data System (ADS)
Kossida, Sophia; Tahri, Nadia; Daizadeh, Iraj
2002-12-01
With the completion of the human genome, and the imminent completion of other large-scale sequencing and structure-determination projects, computer-assisted bioscience is poised to become the new paradigm for conducting basic and applied research. The arrival of these bioinformatics tools can stir anxiety among experimental researchers (as well as pedagogues), who now face a wider and deeper body of knowledge spanning several disciplines (biology, chemistry, physics, mathematics, and computer science). This review targets individuals who are interested in using computational methods in their teaching or research. By working through a real-life, multicomponent, target-based pharmaceutical example, the reader will experience this fascinating new discipline.
Show and tell: disclosure and data sharing in experimental pathology.
Schofield, Paul N; Ward, Jerrold M; Sundberg, John P
2016-06-01
Reproducibility of data from experimental investigations using animal models is increasingly under scrutiny because of the potentially negative impact of poor reproducibility on the translation of basic research. Histopathology is a key tool in biomedical research, in particular for the phenotyping of animal models to provide insights into the pathobiology of diseases. Failure to disclose and share crucial histopathological experimental details compromises the validity of the review process and reliability of the conclusions. We discuss factors that affect the interpretation and validation of histopathology data in publications and the importance of making these data accessible to promote replicability in research. © 2016. Published by The Company of Biologists Ltd.
High-resolution PET [Positron Emission Tomography] for Medical Science Studies
DOE R&D Accomplishments Database
Budinger, T. F.; Derenzo, S. E.; Huesman, R. H.; Jagust, W. J.; Valk, P. E.
1989-09-01
One of the unexpected fruits of basic physics research and the computer revolution is the noninvasive imaging power available to today's physician. Technologies that were strictly the province of research scientists only a decade or two ago now serve as the foundations for such standard diagnostic tools as x-ray computer tomography (CT), magnetic resonance imaging (MRI), magnetic resonance spectroscopy (MRS), ultrasound, single photon emission computed tomography (SPECT), and positron emission tomography (PET). Furthermore, prompted by the needs of both the practicing physician and the clinical researcher, efforts to improve these technologies continue. This booklet endeavors to describe the advantages of achieving high resolution in PET imaging.
ARTiiFACT: a tool for heart rate artifact processing and heart rate variability analysis.
Kaufmann, Tobias; Sütterlin, Stefan; Schulz, Stefan M; Vögele, Claus
2011-12-01
The importance of appropriate handling of artifacts in interbeat interval (IBI) data must not be underestimated. Even a single artifact may cause unreliable heart rate variability (HRV) results. Thus, a robust artifact detection algorithm and the option for manual intervention by the researcher form key components for confident HRV analysis. Here, we present ARTiiFACT, a software tool for processing electrocardiogram and IBI data. Both automated and manual artifact detection and correction are available in a graphical user interface. In addition, ARTiiFACT includes time- and frequency-based HRV analyses and descriptive statistics, thus offering the basic tools for HRV analysis. Notably, all program steps can be executed separately and allow for data export, thus offering high flexibility and interoperability with a whole range of applications.
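The time-domain HRV descriptives mentioned above follow directly from the interbeat-interval series. The sketch below illustrates the textbook definitions of SDNN, RMSSD, and mean heart rate, not ARTiiFACT's internal code.

```python
import statistics

# Standard time-domain HRV descriptives computed from an artifact-free
# interbeat-interval (IBI) series given in milliseconds. Textbook
# definitions only; this is not ARTiiFACT's implementation.

def hrv_time_domain(ibi_ms):
    """Return basic time-domain HRV statistics for a list of IBIs (ms)."""
    diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
    return {
        "SDNN": statistics.stdev(ibi_ms),                          # overall variability
        "RMSSD": (sum(d * d for d in diffs) / len(diffs)) ** 0.5,  # beat-to-beat variability
        "meanHR": 60000.0 / statistics.mean(ibi_ms),               # beats per minute
    }

metrics = hrv_time_domain([800, 810, 790, 805, 795])
```

Because RMSSD squares successive differences, a single misdetected beat (say, a 1,600 ms interval from a missed R-peak) inflates it sharply, which is why robust artifact detection must precede these computations.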
Basic Emotions in the Nencki Affective Word List (NAWL BE): New Method of Classifying Emotional Stimuli
Wierzba, Małgorzata; Riegel, Monika; Wypych, Marek; Jednoróg, Katarzyna; Turnau, Paweł; Grabowska, Anna; Marchewka, Artur
2015-01-01
The Nencki Affective Word List (NAWL) has recently been introduced as a standardized database of Polish words suitable for studying various aspects of language and emotions. Though the NAWL was originally based on the most commonly used dimensional approach, it is not the only way of studying emotions. Another framework is based on discrete emotional categories. Since the two perspectives are recognized as complementary, the aim of the present study was to supplement the NAWL database by the addition of categories corresponding to basic emotions. Thus, 2902 Polish words from the NAWL were presented to 265 subjects, who were instructed to rate them according to the intensity of each of the five basic emotions: happiness, anger, sadness, fear and disgust. The general characteristics of the present word database, as well as the relationships between the studied variables are shown to be consistent with typical patterns found in previous studies using similar databases for different languages. Here we present the Basic Emotions in the Nencki Affective Word List (NAWL BE) as a database of verbal material suitable for highly controlled experimental research. To make the NAWL more convenient to use, we introduce a comprehensive method of classifying stimuli to basic emotion categories. We discuss the advantages of our method in comparison to other methods of classification. Additionally, we provide an interactive online tool (http://exp.lobi.nencki.gov.pl/nawl-analysis) to help researchers browse and interactively generate classes of stimuli to meet their specific requirements. PMID:26148193
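The core idea of assigning a word to its dominant basic-emotion category can be sketched as follows. This is a hypothetical simplification (the function name, margin rule, and rating values are invented here); the paper's actual classification method is more elaborate.

```python
# Hypothetical sketch of max-intensity classification of a word into one of
# the five basic emotion categories. Names, margin, and scale are invented;
# the NAWL BE paper's actual method is more elaborate.

BASIC_EMOTIONS = ("happiness", "anger", "sadness", "fear", "disgust")

def classify(ratings, margin=0.5):
    """Assign the category with the highest mean intensity rating; return
    'unclassified' when no single emotion clearly dominates."""
    ranked = sorted(BASIC_EMOTIONS, key=lambda e: ratings[e], reverse=True)
    best, runner_up = ranked[0], ranked[1]
    if ratings[best] - ratings[runner_up] < margin:
        return "unclassified"
    return best

# A word rated far higher on happiness than on the other four emotions.
label = classify({"happiness": 5.1, "anger": 1.2, "sadness": 1.0,
                  "fear": 1.3, "disgust": 1.1})
```

The margin check reflects the practical need, noted in such databases, to separate clearly categorized stimuli from mixed-emotion words.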
Nanotechnology research: applications in nutritional sciences.
Srinivas, Pothur R; Philbert, Martin; Vu, Tania Q; Huang, Qingrong; Kokini, Josef L; Saltos, Etta; Chen, Hongda; Peterson, Charles M; Friedl, Karl E; McDade-Ngutter, Crystal; Hubbard, Van; Starke-Reed, Pamela; Miller, Nancy; Betz, Joseph M; Dwyer, Johanna; Milner, John; Ross, Sharon A
2010-01-01
The tantalizing potential of nanotechnology is to fabricate and combine nanoscale approaches and building blocks to make useful tools and, ultimately, interventions for medical science, including nutritional science, at the scale of approximately 1-100 nm. In the past few years, tools and techniques that facilitate studies and interventions in the nanoscale range have become widely available and have drawn widespread attention. Recently, investigators in the food and nutrition sciences have been applying the tools of nanotechnology in their research. The Experimental Biology 2009 symposium entitled "Nanotechnology Research: Applications in Nutritional Sciences" was organized to highlight emerging applications of nanotechnology to the food and nutrition sciences, as well as to suggest ways for further integration of these emerging technologies into nutrition research. Speakers focused on topics that included the problems and possibilities of introducing nanoparticles in clinical or nutrition settings, nanotechnology applications for increasing bioavailability of bioactive food components in new food products, nanotechnology opportunities in food science, as well as emerging safety and regulatory issues in this area, and the basic research applications such as the use of quantum dots to visualize cellular processes and protein-protein interactions. The session highlighted several emerging areas of potential utility in nutrition research. Nutrition scientists are encouraged to leverage ongoing efforts in nanomedicine through collaborations. These efforts could facilitate exploration of previously inaccessible cellular compartments and intracellular pathways and thus uncover strategies for new prevention and therapeutic modalities.
MatMRI and MatHIFU: software toolboxes for real-time monitoring and control of MR-guided HIFU
2013-01-01
Background The availability of open and versatile software tools is key to facilitating pre-clinical research for magnetic resonance imaging (MRI) and magnetic resonance-guided high-intensity focused ultrasound (MR-HIFU) and to expediting clinical translation of diagnostic and therapeutic medical applications. In the present study, two customizable software tools that were developed at the Thunder Bay Regional Research Institute are presented for use with both MRI and MR-HIFU. Both tools operate in a MATLAB® environment. The first tool is named MatMRI and enables real-time, dynamic acquisition of MR images with a Philips MRI scanner. The second tool is named MatHIFU and enables the execution and dynamic modification of user-defined treatment protocols with the Philips Sonalleve MR-HIFU therapy system to perform ultrasound exposures in MR-HIFU therapy applications. Methods MatMRI requires four basic steps: initiate communication, subscribe to MRI data, query for new images, and unsubscribe. MatMRI can also pause/resume the imaging and perform real-time updates of the location and orientation of images. MatHIFU requires three basic steps: initiate communication, prepare the treatment protocol, and execute the treatment protocol. MatHIFU can monitor the state of execution and, if required, modify the protocol in real time. Results Four applications were developed to showcase the capabilities of MatMRI and MatHIFU to perform pre-clinical research. Firstly, MatMRI was integrated with an existing small animal MR-HIFU system (FUS Instruments, Toronto, Ontario, Canada) to provide real-time temperature measurements. Secondly, MatMRI was used to perform T2-based MR thermometry in the bone marrow. Thirdly, MatHIFU was used to automate acoustic hydrophone measurements on a per-element basis of the 256-element transducer of the Sonalleve system.
Finally, MatMRI and MatHIFU were combined to produce and image a heating pattern that recreates the word ‘HIFU’ in a tissue-mimicking heating phantom. Conclusions MatMRI and MatHIFU leverage existing MRI and MR-HIFU clinical platforms to facilitate pre-clinical research. MatMRI substantially simplifies the real-time acquisition and processing of MR data. MatHIFU facilitates the testing and characterization of new therapy applications using the Philips Sonalleve clinical MR-HIFU system. Under coordination with Philips Healthcare, both MatMRI and MatHIFU are intended to be freely available as open-source software packages to other research groups. PMID:25512856
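The four MatMRI steps describe a standard subscribe/poll client pattern. A language-neutral sketch is given below; every class and method name here is invented for illustration (the real toolbox is MATLAB code communicating with the Philips scanner interface).

```python
# Hypothetical sketch of the subscribe/poll pattern behind MatMRI's four
# steps (initiate, subscribe, query, unsubscribe). All names are invented
# for illustration; the real toolbox is MATLAB code.

class ScannerClient:
    def __init__(self):
        self.subscribed = False
        self._pending = []          # images pushed by the scanner

    def connect(self):              # step 1: initiate communication
        return self

    def subscribe(self):            # step 2: subscribe to MRI data
        self.subscribed = True

    def poll_new_images(self):      # step 3: query for new images
        if not self.subscribed:
            return []
        images, self._pending = self._pending, []
        return images

    def unsubscribe(self):          # step 4: unsubscribe
        self.subscribed = False

client = ScannerClient().connect()
client.subscribe()
frames = client.poll_new_images()   # empty until the scanner pushes data
client.unsubscribe()
```

Keeping steps 2-4 separate is what allows an analysis loop to pause/resume imaging or retarget image geometry between polls, as the abstract describes.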
Faster, Less Expensive Dies Using RSP Tooling
NASA Astrophysics Data System (ADS)
Knirsch, James R.
2007-08-01
RSP Tooling is an indirect spray-form additive process that can produce production tooling for virtually any forming process, from virtually any metal. In the past 24 months a significant amount of research and development has been performed. This has deepened basic metallurgical understanding of what transpires during the rapid solidification of the metal, significantly improved production machine uptime, yielded ceramic developments that have improved finish, shortened lead time for tool delivery through process changes, and enabled the testing of many new alloys. RSP stands for Rapid Solidification Process and is the key to the superior metallurgical properties that result from the technology. Most metals sprayed in the process leave the machine with the same physical properties that the metal normally achieves through heat treatment, and in some cases the properties are superior. Many new applications are being pursued, including INVAR tools for aerospace composite materials and bimetallic tools made from tool steel and beryllium copper for die casting and plastic injection molding. Recent feasibility studies have been performed with tremendous success.
Co-authorship network analysis in health research: method and potential use.
Fonseca, Bruna de Paula Fonseca E; Sampaio, Ricardo Barros; Fonseca, Marcus Vinicius de Araújo; Zicker, Fabio
2016-04-30
Scientific collaboration networks are a hallmark of contemporary academic research. Researchers are no longer independent players, but members of teams that bring together complementary skills and multidisciplinary approaches around common goals. Social network analysis and co-authorship networks are increasingly used as powerful tools to assess collaboration trends and to identify leading scientists and organizations. The analysis reveals the social structure of the networks by identifying actors and their connections. This article reviews the method and potential applications of co-authorship network analysis in health. The basic steps for conducting co-authorship studies in health research are described and common network metrics are presented. The application of the method is exemplified by an overview of the global research network for Chikungunya virus vaccines.
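The basic steps the article describes (build co-author edges from each paper's author list, then compute network metrics) can be sketched in plain Python. The paper lists and author names below are illustrative, not data from the study:

```python
from itertools import combinations
from collections import Counter

def coauthorship_network(papers):
    """Build a weighted co-authorship network from a list of papers.

    Each paper is a list of author names; every pair of co-authors on a
    paper adds 1 to the weight of the edge between them.
    """
    edges = Counter()
    for authors in papers:
        for a, b in combinations(sorted(set(authors)), 2):
            edges[(a, b)] += 1
    return edges

def degree_centrality(edges):
    """Number of distinct collaborators per author (unnormalized degree)."""
    neighbours = {}
    for (a, b) in edges:
        neighbours.setdefault(a, set()).add(b)
        neighbours.setdefault(b, set()).add(a)
    return {author: len(n) for author, n in neighbours.items()}

# Illustrative input: three hypothetical papers
papers = [
    ["Fonseca", "Sampaio", "Zicker"],
    ["Fonseca", "Zicker"],
    ["Sampaio", "Araujo"],
]
edges = coauthorship_network(papers)
degrees = degree_centrality(edges)
```

For real studies the same edge list is usually loaded into a dedicated library such as NetworkX, which provides the centrality, clustering, and community-detection metrics the article surveys.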
Basic Wiring. Module 2 of the Vocational Education Readiness Test (VERT).
ERIC Educational Resources Information Center
Thomas, Edward L., Comp.
Focusing on basic wiring, this module is one of eight included in the Vocational Education Readiness Test (VERT). The module begins by listing its objectives and describing the tools and equipment needed. The remainder of the module contains sections on manipulative skills, trade vocabulary, tool identification, trade computation…
NASA Astrophysics Data System (ADS)
Bennett, Kristin
2004-03-01
As one of the lead agencies for nanotechnology research and development, the Department of Energy (DOE) is revolutionizing the way we understand and manipulate materials at the nanoscale. As the Federal government's single largest supporter of basic research in the physical sciences in the United States, and overseeing the Nation's cross-cutting research programs in high-energy physics, nuclear physics, and fusion energy sciences, the DOE guides the grand challenges in nanomaterials research that will have an impact on everything from medicine, to energy production, to manufacturing. Within the DOE's Office of Science, the Office of Basic Energy Sciences (BES) leads research and development at the nanoscale, which supports the Department's missions of national security, energy, science, and the environment. The cornerstone of the program in nanoscience is the establishment and operation of five new Nanoscale Science Research Centers (NSRCs), which are under development at six DOE Laboratories. Throughout its history, DOE's Office of Science has designed, constructed and operated many of the nation's most advanced, large-scale research and development user facilities, of importance to all areas of science. These state-of-the-art facilities are shared with the science community worldwide and contain technologies and instruments that are available nowhere else. Like all DOE national user facilities, the new NSRCs are designed to make novel state-of-the-art research tools available to the world, and to accelerate a broad scale national effort in basic nanoscience and nanotechnology. The NSRCs will be sited adjacent to or near existing DOE/BES major user facilities, and are designed to enable national user access to world-class capabilities for the synthesis, processing, fabrication, and analysis of materials at the nanoscale, and to transform the nation's approach to nanomaterials.
Learning motion concepts using real-time microcomputer-based laboratory tools
NASA Astrophysics Data System (ADS)
Thornton, Ronald K.; Sokoloff, David R.
1990-09-01
Microcomputer-based laboratory (MBL) tools have been developed which interface to Apple II and Macintosh computers. Students use these tools to collect physical data that are graphed in real time and then can be manipulated and analyzed. The MBL tools have made possible discovery-based laboratory curricula that embody results from educational research. These curricula allow students to take an active role in their learning and encourage them to construct physical knowledge from observation of the physical world. The curricula encourage collaborative learning by taking advantage of the fact that MBL tools present data in an immediately understandable graphical form. This article describes one of the tools—the motion detector (hardware and software)—and the kinematics curriculum. The effectiveness of this curriculum compared to traditional college and university methods for helping students learn basic kinematics concepts has been evaluated by pre- and post-testing and by observation. There is strong evidence for significantly improved learning and retention by students who used the MBL materials, compared to those taught in lecture.
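The real-time processing such a motion detector performs (position samples in, velocity graph out) reduces to numerical differentiation of the sampled distance signal. A minimal sketch, with the sampling rate and the motion chosen purely for illustration:

```python
def central_difference(positions, dt):
    """Estimate velocity from equally spaced position samples using
    central differences, the kind of real-time processing an MBL
    motion detector applies to its sonar-ranger distance readings."""
    v = []
    for i in range(1, len(positions) - 1):
        v.append((positions[i + 1] - positions[i - 1]) / (2 * dt))
    return v

# Illustrative data: constant-velocity motion x = 0.5*t metres, sampled at 20 Hz
dt = 0.05
xs = [0.5 * k * dt for k in range(10)]
vs = central_difference(xs, dt)
```

A second pass of the same operation over `vs` would give the acceleration graph, completing the position/velocity/acceleration displays the kinematics curriculum is built around.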
The integration of FMEA with other problem solving tools: A review of enhancement opportunities
NASA Astrophysics Data System (ADS)
Ng, W. C.; Teh, S. Y.; Low, H. C.; Teoh, P. C.
2017-09-01
Failure Mode and Effects Analysis (FMEA) is one of the most effective and widely accepted problem solving (PS) tools in companies around the world. Since FMEA was first introduced in 1949, practitioners have implemented it in various industries for their quality improvement initiatives. However, studies have shown that there are drawbacks that hinder the effectiveness of FMEA for continuous quality improvement from product design to manufacturing. Therefore, FMEA is integrated with other PS tools such as the inventive problem solving methodology (TRIZ), Quality Function Deployment (QFD), Root Cause Analysis (RCA) and the seven basic tools of quality to address these drawbacks. This study begins by identifying the drawbacks of FMEA. A comprehensive literature review on the integration of FMEA with other tools is carried out to categorise the integrations based on the drawbacks identified. The three categories are inefficiency of failure analysis, psychological inertia and neglect of the customers' perspective. This study concludes by discussing the gaps and opportunities in the integration for future research.
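The conventional FMEA worksheet ranks failure modes by a Risk Priority Number, RPN = severity x occurrence x detection, each rated on a 1-10 scale. A minimal sketch of that arithmetic (the failure modes and ratings below are invented for illustration):

```python
def risk_priority_number(severity, occurrence, detection):
    """RPN = S x O x D, each conventionally rated 1-10 on an FMEA worksheet."""
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("FMEA ratings are conventionally 1-10")
    return severity * occurrence * detection

# Hypothetical failure modes for a valve assembly
failure_modes = [
    {"mode": "seal leak", "S": 8, "O": 4, "D": 5},
    {"mode": "thread wear", "S": 5, "O": 6, "D": 2},
]
ranked = sorted(failure_modes,
                key=lambda f: risk_priority_number(f["S"], f["O"], f["D"]),
                reverse=True)
```

The drawbacks the paper reviews largely concern this scoring step (equal RPNs from very different S/O/D combinations, subjective ratings), which is why integrations with TRIZ, QFD and RCA target the analysis around it rather than the multiplication itself.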
Consumer neuroscience: a new area of study for biomedical engineers.
Babiloni, Fabio
2012-01-01
In the scientific literature, the most accepted definition of consumer neuroscience, or neuromarketing, is that it is a field of study concerning the application of neuroscience methods to analyze and understand human behavior related to markets and marketing exchanges. At first, it might seem strange that marketers would be interested in using neuroscience to understand consumers' preferences. Yet in practice, the basic goal of marketers is to guide the design and presentation of products in such a way that they are highly compatible with consumer preferences. To understand consumers' preferences, marketers commonly use several standard research tools, such as personal interviews with consumers, scoring questionnaires gathered from consumers, and focus groups. The reason marketing researchers are interested in using brain imaging tools, instead of simply asking people for their preferences in front of marketing stimuli, arises from the assumption that people cannot (or do not want to) fully explain their preferences when explicitly asked. Researchers in the field hypothesize that neuroimaging tools can access information within the consumer's brain during the generation of a preference or the observation of a commercial advertisement. Whether this information will be useful in further promoting the product is still up for debate in the marketing literature. From the marketing researchers' point of view, there is a hope that this body of brain imaging techniques will provide an efficient tradeoff between the costs and benefits of the research. Currently, neuroscience methodology includes powerful brain imaging tools based on the gathering of hemodynamic or electromagnetic signals related to human brain activity during the performance of a task relevant to marketing objectives. These tools are briefly reviewed in this article.
1981-04-30
However, SREM was not designed to harmonize these kinds of problems. Rather, it is a tool to investigate the logic of the processing specified in the... design. Supporting programs were also conducted to perform basic research into such areas as software reliability, static and dynamic validation techniques... development. o Maintain requirements development independent of the target machine and the eventual software design. o Allow for easy response to
ProMateus—an open research approach to protein-binding sites analysis
Neuvirth, Hani; Heinemann, Uri; Birnbaum, David; Tishby, Naftali; Schreiber, Gideon
2007-01-01
The development of bioinformatic tools by individual labs results in an abundance of parallel programs for the same task. For example, identification of binding-site regions between interacting proteins is done using ProMate, WHISCY, PPI-Pred, PINUP and others. All servers first identify unique properties of binding sites and then incorporate them into a predictor. Obviously, the resulting prediction would improve if the most suitable parameters from each of those predictors were incorporated into one server. However, because of the variation in methods and databases, this is currently not feasible. Here, the protein-binding site prediction server is extended into a general protein-binding sites research tool, ProMateus. This web tool, based on ProMate's infrastructure, enables the easy exploration and incorporation of new features and databases by the user, providing an evaluation of the benefit of individual features and their combination within a set framework. This transforms the individual research into a community exercise, bringing out the best from all users for optimized predictions. The analysis is demonstrated on a database of protein-protein and protein-DNA interactions. This approach is basically different from that used in generating meta-servers. The implications of the open-research approach are discussed. ProMateus is available at http://bip.weizmann.ac.il/promate. PMID:17488838
A new approach to electrophoresis in space
NASA Technical Reports Server (NTRS)
Snyder, Robert S.; Rhodes, Percy H.
1990-01-01
Previous electrophoresis experiments performed in space are reviewed. There are sufficient data available from the results of these experiments to show that they were designed with incomplete knowledge of the fluid dynamics of the process, including electrohydrodynamics. Redesigning laboratory chambers and operating procedures developed on Earth for space without understanding both the advantages and disadvantages of the microgravity environment has yielded poor separations of both cells and proteins. However, electrophoresis is still an important separation tool in the laboratory, and thermal convection does limit its performance there. Thus, there is justification for electrophoresis in space, but the emphasis of future space experiments must be directed toward basic research, with model experiments to understand the microgravity environment and fluid analyses to test the basic principles of the process.
Juvenile Radio-Tag Study: Lower Granite Dam, 1985 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stuehrenberg, Lowell C.
The concept of using mass releases of juvenile radio tags represents a new and potentially powerful research tool that could be effectively applied to juvenile salmonid passage problems at dams on the Columbia and Snake Rivers. A system of detector antennas, strategically located, would automatically detect and record individually tagged juvenile salmonids as they pass through the spillway, powerhouse, bypass system, or tailrace areas below the dam. Accurate measurements of spill effectiveness, fish guiding efficiency (FGE), collection efficiency (CE), spillway survival, powerhouse survival, and bypass survival would be possible without handling large numbers of unmarked fish. A prototype juvenile radio-tag system was developed and tested by the National Marine Fisheries Service (NMFS) and Bonneville Power Administration (BPA) at John Day Dam and at Lower Granite Dam. This report summarizes research to: (1) evaluate the effectiveness of the prototype juvenile radio-tag system in a field situation and (2) test the basic assumptions inherent in using the juvenile radio tag as a research tool.
Khan, Imtiaz A; Fraser, Adam; Bray, Mark-Anthony; Smith, Paul J; White, Nick S; Carpenter, Anne E; Errington, Rachel J
2014-12-01
Experimental reproducibility is fundamental to the progress of science. Irreproducible research decreases the efficiency of basic biological research and drug discovery and impedes experimental data reuse. A major contributing factor to irreproducibility is difficulty in interpreting complex experimental methodologies and designs from written text and in assessing variations among different experiments. Current bioinformatics initiatives either are focused on computational research reproducibility (i.e. data analysis) or laboratory information management systems. Here, we present a software tool, ProtocolNavigator, which addresses the largely overlooked challenges of interpretation and assessment. It provides a biologist-friendly open-source emulation-based tool for designing, documenting and reproducing biological experiments. ProtocolNavigator was implemented in Python 2.7, using the wx module to build the graphical user interface. It is a platform-independent software and freely available from http://protocolnavigator.org/index.html under the GPL v2 license. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Astrophysical Computation in Research, the Classroom and Beyond
NASA Astrophysics Data System (ADS)
Frank, Adam
2009-03-01
In this talk I review progress in the use of simulations as a tool for astronomical research, education and public outreach. The talk will include the basic elements of numerical simulations as well as advances in algorithms, such as Adaptive Mesh Refinement methods, that have led to recent dramatic progress. The scientific focus of the talk will be star-formation jets and outflows, while the educational emphasis will be on the use of advanced platforms for simulation-based learning in lectures and integrated homework. Learning modules for science outreach websites such as DISCOVER magazine will also be highlighted.
Chun, Yong Soon; Chaudhari, Pooja; Jang, Yoon-Young
2010-12-14
Recent advances in induced pluripotent stem cell (iPSC) research have significantly changed our perspectives on regenerative medicine by providing researchers with a unique tool to derive disease-specific stem cells for study. In this review, we describe human iPSC generation from developmentally diverse origins (i.e., endoderm-, mesoderm-, and ectoderm-tissue-derived human iPSCs) and multistage hepatic differentiation protocols, and discuss both basic and clinical applications of these cells, including disease modeling, drug toxicity screening/drug discovery, gene therapy and cell replacement therapy.
Computational neuropharmacology: dynamical approaches in drug discovery.
Aradi, Ildiko; Erdi, Péter
2006-05-01
Computational approaches that adopt dynamical models are widely accepted in basic and clinical neuroscience research as indispensable tools with which to understand normal and pathological neuronal mechanisms. Although computer-aided techniques have been used in pharmaceutical research (e.g. in structure- and ligand-based drug design), the power of dynamical models has not yet been exploited in drug discovery. We suggest that dynamical system theory and computational neuroscience--integrated with well-established, conventional molecular and electrophysiological methods--offer a broad perspective in drug discovery and in the search for novel targets and strategies for the treatment of neurological and psychiatric diseases.
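The abstract does not name a specific model, but the FitzHugh-Nagumo equations are a standard minimal dynamical model of neuronal excitability of the kind such computational approaches build on. A sketch integrating it with forward Euler (the parameters are textbook defaults, not values from the paper):

```python
def fitzhugh_nagumo(i_ext, t_end=100.0, dt=0.01, a=0.7, b=0.8, tau=12.5):
    """Integrate the FitzHugh-Nagumo model with forward Euler.

    v is the fast, membrane-potential-like variable; w is the slow
    recovery variable. Returns the trajectory of v.
    """
    v, w = -1.0, -0.5
    trace = []
    for _ in range(int(t_end / dt)):
        dv = v - v**3 / 3 - w + i_ext
        dw = (v + a - b * w) / tau
        v += dt * dv
        w += dt * dw
        trace.append(v)
    return trace

quiet = fitzhugh_nagumo(i_ext=0.0)    # relaxes to rest
spiking = fitzhugh_nagumo(i_ext=0.5)  # sustained oscillations (limit cycle)
```

In a drug-discovery setting, the dynamical-systems question is how a candidate compound's effect on a parameter (here, loosely, `i_ext` or the recovery kinetics) moves the system between the resting and oscillatory regimes, which is precisely the kind of analysis the authors argue has not yet been exploited.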
An overview of expert systems. [artificial intelligence
NASA Technical Reports Server (NTRS)
Gevarter, W. B.
1982-01-01
An expert system is defined and its basic structure is discussed. The knowledge base, the inference engine, and uses of expert systems are discussed. Architecture is considered, including choice of solution direction, reasoning in the presence of uncertainty, searching small and large search spaces, handling large search spaces by transforming them and by developing alternative or additional spaces, and dealing with time. Existing expert systems are reviewed. Tools for building such systems, construction, and knowledge acquisition and learning are discussed. Centers of research and funding sources are listed. The state-of-the-art, current problems, required research, and future trends are summarized.
Translational Environmental Research: Improving the Usefulness and Usability of Research Results
NASA Astrophysics Data System (ADS)
Garfin, G.
2008-12-01
In recent years, requests for proposals more frequently emphasize outreach to stakeholder communities, decision support, and science that serves societal needs. Reports from the National Academy of Sciences and Western States Water Council emphasize the need for science translation and outreach, in order to address societal concerns with climate extremes, such as drought, the use of climate predictions, and the growing challenges of climate change. In the 1990s, the NOAA Climate Program Office developed its Regional Integrated Sciences and Assessments program to help bridge the gap between climate science (notably, seasonal predictions) and society, to improve the flow of information to stakeholders, and to increase the relevance of climate science to inform decisions. During the same time period, the National Science Foundation initiated multi-year Science and Technology Centers and Decision Making Under Uncertainty Centers, with similar goals, but different metrics of success. Moreover, the combination of population growth, climate change, and environmental degradation has prompted numerous research initiatives on linking knowledge and action for sustainable development. This presentation reviews various models and methodologies for translating science results from field, lab, or modeling work to use by society. Lessons and approaches from cooperative extension, boundary organizations, co-production of science and policy, and medical translational research are examined. In particular, multi-step translation as practiced within the health care community is examined. For example, so-called "T1" (translation 1) research moves insights from basic science to clinical research; T2 research evaluates the effectiveness of clinical practice, who benefits from promising care regimens, and develops tools for clinicians, patients, and policy makers.
T3 activities test the implementation, delivery, and spread of research results and clinical practices in order to foster policy changes and improve general health. Parallels in the environmental sciences might be: TER1 (translational environmental research 1), basic insights regarding environmental processes and the relationships between environmental changes and their causes; TER2, applied environmental research, development of best practices, and development of decision support tools; and TER3, usability and impact evaluation, effective outreach and implementation of best practices, and application of research insights to public policy and institutional change. According to the medical literature, and anecdotal evidence from end-to-end environmental science, decision-maker and public involvement in these various forms of engaged research decreases the lag between scientific discovery and the implementation of discoveries in operational practices, information tools, and organizational and public policies.
Djioua, Moussa; Plamondon, Réjean
2009-11-01
In this paper, we present a new analytical method for estimating the parameters of Delta-Lognormal functions and characterizing handwriting strokes. According to the Kinematic Theory of rapid human movements, these parameters contain information on both the motor commands and the timing properties of a neuromuscular system. The new algorithm, called XZERO, exploits relationships between the zero crossings of the first and second time derivatives of a lognormal function and its four basic parameters. The methodology is described and then evaluated under various testing conditions. The new tool allows a greater variety of stroke patterns to be processed automatically. Furthermore, for the first time, the extraction accuracy is quantified empirically, taking advantage of the exponential relationships that link the dispersion of the extraction errors with its signal-to-noise ratio. A new extraction system which combines this algorithm with two other previously published methods is also described and evaluated. This system provides researchers involved in various domains of pattern analysis and artificial intelligence with new tools for the basic study of single strokes as primitives for understanding rapid human movements.
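XZERO's actual extraction relations are not reproduced in the abstract, but the idea of tying lognormal parameters to derivative zero crossings can be illustrated with the simplest such relation: the zero crossing of the first time derivative (the velocity peak) of a single lognormal occurs analytically at t0 + exp(mu - sigma^2). A sketch with illustrative parameter values, checking the numeric peak against that closed form:

```python
import math

def lognormal_speed(t, t0, mu, sigma, D=1.0):
    """Lognormal speed profile of the kind used in the Kinematic Theory
    (one component of a Delta-Lognormal stroke model)."""
    if t <= t0:
        return 0.0
    x = t - t0
    return D / (sigma * math.sqrt(2 * math.pi) * x) * math.exp(
        -(math.log(x) - mu) ** 2 / (2 * sigma ** 2))

def numeric_peak_time(t0, mu, sigma, dt=1e-4, t_end=2.0):
    """Locate the velocity peak (first-derivative zero crossing) by sampling."""
    best_t, best_v = t0, 0.0
    t = t0 + dt
    while t < t_end:
        v = lognormal_speed(t, t0, mu, sigma)
        if v > best_v:
            best_t, best_v = t, v
        t += dt
    return best_t

t0, mu, sigma = 0.0, -1.5, 0.3     # illustrative stroke parameters
analytic_mode = t0 + math.exp(mu - sigma ** 2)  # known mode of a lognormal
numeric_mode = numeric_peak_time(t0, mu, sigma)
```

The paper's contribution is the reverse direction: given noisy observed zero crossings of the first and second derivatives, recover t0, mu, sigma and the command amplitude analytically.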
Hypnosis and surgery: past, present, and future.
Wobst, Albrecht H K
2007-05-01
Hypnosis has been defined as the induction of a subjective state in which alterations of perception or memory can be elicited by suggestion. Ever since the first public demonstrations of "animal magnetism" by Mesmer in the 18th century, the use of this psychological tool has fascinated the medical community and public alike. The application of hypnosis to alter pain perception and memory dates back centuries. Yet little progress has been made to fully comprehend or appreciate its potential compared to the pharmacologic advances in anesthesiology. Recently, hypnosis has aroused interest, as it seems to complement and possibly enhance conscious sedation. Contemporary clinical investigators claim that the combination of analgesia and hypnosis is superior to conventional pharmacologic anesthesia for minor surgical cases, with patients and surgeons responding favorably. Simultaneously, basic research of pain pathways involving the nociceptive flexion reflex and positron emission tomography has yielded objective data regarding the physiologic correlates of hypnosis. In this article I review the history, basic scientific and clinical studies, and modern practical considerations of one of the oldest therapeutic tools: the power of suggestion.
Final Technical Report: "New Tools for Physics with Low-energy Antimatter"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Surko, Clifford M.
2013-10-02
The objective of this research is to develop new tools to manipulate antimatter plasmas and to tailor them for specific scientific and technical uses. The work has two specific objectives. One is establishing the limits for positron accumulation and confinement in the form of single-component plasmas in Penning-Malmberg traps. This technique underpins a wealth of antimatter applications. A second objective is to develop an understanding of the limits for formation of cold, bright positron beams. The research done in this grant focused on particular facets of these goals. One focus was extracting tailored beams from a high-field Penning-Malmberg trap from the magnetic field to form new kinds of high-quality electrostatic beams. A second goal was to develop the technology for colder trap-based beams using a cryogenically cooled buffer gas. A third objective was to conduct the basic plasma research to develop a new high-capacity multicell trap (MCT) for research with antimatter. Progress is reported here in all three areas. While the goal of this research is to develop new tools for manipulating positrons (i.e., the antiparticles of electrons), much of the work was done with test electron plasmas for increased data rate. Some of the techniques developed in the course of this work are also relevant to the manipulation and use of antiprotons.
Barriers to the Preclinical Development of Therapeutics that Target Aging Mechanisms.
Burd, Christin E; Gill, Matthew S; Niedernhofer, Laura J; Robbins, Paul D; Austad, Steven N; Barzilai, Nir; Kirkland, James L
2016-11-01
Through the progress of basic science research, fundamental mechanisms that contribute to age-related decline are being described with increasing depth and detail. Although these efforts have identified new drug targets and compounds that extend life span in model organisms, clinical trials of therapeutics that target aging processes remain scarce. Progress in aging research is hindered by barriers associated with the translation of basic science discoveries into the clinic. This report summarizes discussions held at a 2014 Geroscience Network retreat focused on identifying hurdles that currently impede the preclinical development of drugs targeting fundamental aging processes. From these discussions, it was evident that aging researchers have varied perceptions of the ideal preclinical pipeline. To forge a clear and cohesive path forward, several areas of controversy must first be resolved and new tools developed. Here, we focus on five key issues in preclinical drug development (drug discovery, lead compound development, translational preclinical biomarkers, funding, and integration between researchers and clinicians), expanding upon discussions held at the Geroscience Retreat and suggesting areas for further research. By bringing these findings to the attention of the aging research community, we hope to lay the foundation for a concerted preclinical drug development pipeline. © The Author 2016. Published by Oxford University Press on behalf of The Gerontological Society of America.
Gleissner, Christian A
2016-05-01
Atherosclerosis is the leading cause of death worldwide. Research on the pathophysiological mechanisms of atherogenesis has made tremendous progress over the past two decades. However, despite great advances there is still a lack of therapies that reduce adverse cardiovascular events to an acceptable degree. This review addresses successes, but also questions, challenges, and chances regarding the translation of basic science results into clinical practice, i.e. the capability to apply the results of basic and/or clinical research in order to design therapies suitable to improve patient outcome. Specifically, it discusses problems in translating findings from the most broadly used murine models of atherosclerosis into clinically feasible therapies and strategies potentially improving the results of clinical trials. Most likely, the key to success will be a multimodal approach employing novel imaging methods as well as large-scale screening tools, summarized as the "omics" approach. Using individually tailored therapies, plaque stabilization and regression could prevent adverse cardiovascular events, thereby improving the outcome of a large number of patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Koizumi, Amane; Nagata, Osamu; Togawa, Morio; Sazi, Toshiyuki
2014-01-01
Neuroscience is an expanding field of science that investigates enigmas of brain and human body function. However, the majority of the public have never had the chance to learn the basics of neuroscience, or new knowledge from advanced neuroscience research, through hands-on experience. Here, we report that we produced the Muscle Sensor, a simplified electromyograph, to promote educational understanding in neuroscience. The Muscle Sensor can detect myoelectric potentials, which are filtered and processed as 3-V pulse signals to shine a light bulb and emit beep sounds. With this educational tool, we delivered "On-Site Neuroscience Lectures" in Japanese junior-high schools to facilitate hands-on experience of neuroscientific electrophysiology and to connect the students' textbook knowledge to advanced neuroscience research. On-site neuroscience lectures with the Muscle Sensor pave the way for a better understanding of the basics of neuroscience and the latest topics, such as how brain-machine-interface technology could help patients with disabilities such as spinal cord injuries. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping
2015-01-01
Background: Biospecimens are essential resources for advancing basic and translational research. However, there are little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. Methods: To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. Results: A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. Conclusion: A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, and allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting. PMID:26697911
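The BEMT's internal formulas are not described in the abstract; the following is a hypothetical sketch of the kind of cost-recovery arithmetic such a tool performs (all figures and the indirect-rate convention are invented for illustration):

```python
def cost_recovery_fee(direct_costs, indirect_rate, units, recovery_fraction):
    """Per-unit fee needed to recover a chosen fraction of fully loaded costs.

    direct_costs:      total direct operating costs for the period
    indirect_rate:     indirect costs expressed as a fraction of direct costs
    units:             number of biospecimen units expected to be distributed
    recovery_fraction: share of total cost to recover through fees (0-1)
    """
    total_cost = direct_costs * (1 + indirect_rate)
    return total_cost * recovery_fraction / units

# Hypothetical biobank: $200k direct costs, 25% indirect rate,
# 5,000 specimens distributed, 60% cost recovery through fees
fee = cost_recovery_fee(direct_costs=200_000, indirect_rate=0.25,
                        units=5_000, recovery_fraction=0.6)
```

A tool like the BEMT layers onto this core the seven-step cost profile, per-service pricing, and multi-year forecasting described above, plus comparison against the anonymized survey data.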
The use of simulation in teaching the basic sciences.
Eason, Martin P
2013-12-01
To assess the current use of simulation in medical education, specifically the teaching of the basic sciences, in pursuit of the goal of improved integration. Simulation is increasingly being used by institutions to teach the basic sciences. Preliminary data suggest that it is an effective tool with increased retention and learner satisfaction. Medical education is undergoing tremendous change. One direction of that change is increasing integration of the basic and clinical sciences to improve the efficiency and quality of medical education, and ultimately to improve patient care. Integration is thought to improve understanding of basic science conceptual knowledge and to better prepare learners for clinical practice. Because of its unique effects on learning, simulation is currently being used successfully by many institutions to produce that integration through the teaching of the basic sciences. Preliminary data indicate that simulation is an effective tool for basic science education and garners high learner satisfaction.
Laurent, Pélozuelo; Frérot, Brigitte
2007-12-01
Since the identification of the female European corn borer, Ostrinia nubilalis (Hübner), pheromone, pheromone-baited traps have been regarded as a promising tool to monitor populations of this pest. This article reviews the literature produced on this topic since the 1970s. Its aim is to provide extension entomologists and other researchers with all the necessary information to establish an efficient trapping procedure for this moth. The different pheromone races of the European corn borer are described, and research results relating to the optimization of pheromone blend, pheromone bait, trap design, and trap placement are summarized, followed by a state-of-the-art summary of data comparing blacklight trap and pheromone-baited trap techniques to monitor European corn borer flight. Finally, we identify the information required to definitively validate or invalidate pheromone-baited traps as an efficient decision-support tool in European corn borer control.
Software systems for modeling articulated figures
NASA Technical Reports Server (NTRS)
Phillips, Cary B.
1989-01-01
Research in computer animation and simulation of human task performance requires sophisticated geometric modeling and user interface tools. The software for a research environment should present the programmer with a powerful but flexible substrate of facilities for displaying and manipulating geometric objects, yet ensure that future tools have a consistent and friendly user interface. Jack is a system which provides a flexible and extensible programmer and user interface for displaying and manipulating complex geometric figures, particularly human figures in a 3D working environment. It is a basic software framework for high-performance Silicon Graphics IRIS workstations for modeling and manipulating geometric objects in a general but powerful way. It provides a consistent and user-friendly interface across various applications in computer animation and simulation of human task performance. Currently, Jack provides input and control for applications including lighting specification and image rendering, anthropometric modeling, figure positioning, inverse kinematics, dynamic simulation, and keyframe animation.
Translational Research and Plasma Proteomic in Cancer.
Santini, Annamaria Chiara; Giovane, Giancarlo; Auletta, Adelaide; Di Carlo, Angelina; Fiorelli, Alfonso; Cito, Letizia; Astarita, Carlo; Giordano, Antonio; Alfano, Roberto; Feola, Antonia; Di Domenico, Marina
2016-04-01
Proteomics is a recent field of research in molecular biology that can help in the fight against cancer through the search for biomarkers that can detect this disease in the early stages of its development. Proteomics is also a rapidly growing technology, thanks in part to the development of ever more sensitive and faster mass spectrometry analysis. Although this technique is the most widespread for the discovery of new cancer biomarkers, it still suffers from poor sensitivity and insufficient reproducibility, essentially due to tumor heterogeneity. Common technical shortcomings include limitations in the sensitivity of detecting low-abundance biomarkers and possible systematic biases in the observed data. Current research attempts to develop high-resolution proteomic instrumentation for high-throughput monitoring of the protein changes that occur in cancer. In this review, we describe the basic features of the proteomic tools that have proven useful in cancer research, showing their advantages and disadvantages. The application of these proteomic tools could provide early biomarker detection in various cancer types and could improve understanding of the mechanisms of tumor growth and dissemination. © 2015 Wiley Periodicals, Inc.
Smolinski, Tomasz G
2010-01-01
Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of computers in their lives, seem largely unfamiliar with how computers are used to pursue and answer such questions. This article describes an innovative undergraduate-level course, titled Computer Literacy for Life Sciences, that aims to teach students the basics of a computerized scientific research pursuit. The purpose of the course is for students to develop hands-on working experience in using standard computer software tools as well as computer techniques and methodologies used in life sciences research. This paper provides a detailed description of the didactic tools and assessment methods used in and outside of the classroom, as well as a discussion of the lessons learned during the first installment of the course, taught at Emory University in the fall semester of 2009.
A comprehensive crop genome research project: the Superhybrid Rice Genome Project in China.
Yu, Jun; Wong, Gane Ka-Shu; Liu, Siqi; Wang, Jian; Yang, Huanming
2007-06-29
In May 2000, the Beijing Institute of Genomics formally announced the launch of a comprehensive crop genome research project on rice genomics, the Chinese Superhybrid Rice Genome Project (SRGP). SRGP is not simply a sequencing project targeted at a single rice (Oryza sativa L.) genome, but a full-swing research effort with the ultimate goal of providing inclusive basic genomic information and molecular tools, not only to understand the biology of rice, both as an important crop species and as a model organism for cereals, but also to focus on a popular superhybrid rice landrace, LYP9. We have completed the first phase of SRGP and provide the rice research community with a finished genome sequence of an indica variety, 93-11 (the paternal cultivar of LYP9), together with ample data on subspecific (between-subspecies) polymorphisms, transcriptomes and proteomes, useful for within-species comparative studies. In the second phase, we have acquired the genome sequence of the maternal cultivar, PA64S, together with detailed catalogues of genes uniquely expressed in the parental cultivars and the hybrid, as well as allele-specific markers that distinguish parental alleles. Although SRGP in China is not an open-ended research programme, it has been designed to pave the way for future plant genomics research and application: to interrogate fundamentals of plant biology, including genome duplication, polyploidy and hybrid vigour; to provide genetic tools for crop breeding; and to carry a social burden, leading the fight against the world's hunger. It began with genomics, a newly developed, industry-scale research field, in the world's most populous country. In this review, we summarize our scientific goals and noteworthy discoveries that open new territories of systematic investigation into the basic and applied biology of rice and other major cereal crops.
NASA Astrophysics Data System (ADS)
Aditya, B. R.; Permadi, A.
2018-03-01
This paper describes the implementation of the Unified Theory of Acceptance and Use of Technology (UTAUT) model to assess the use of a virtual classroom in support of teaching and learning in higher education. The purpose of this research is to determine whether a virtual classroom that fulfils basic design principles is accepted and used positively by students. The methodology uses a quantitative, descriptive approach, with a questionnaire as the tool for measuring the degree of acceptance of the virtual classroom. The research uses a sample of 105 students in the D3 Informatics Management programme at Telkom University. The result is that students' acceptance and use of the virtual classroom are positive and relevant to students in higher education.
McIlvane, William J; Kledaras, Joanne B; Gerard, Christophe J; Wilde, Lorin; Smelson, David
2018-07-01
A few noteworthy exceptions notwithstanding, quantitative analyses of relational learning are most often simple descriptive measures of study outcomes. For example, studies of stimulus equivalence have made much progress using measures such as percentage consistent with equivalence relations, discrimination ratio, and response latency. Although procedures may have ad hoc variations, they remain fairly similar across studies. Comparison studies of training variables that lead to different outcomes are few. Yet to be developed are tools designed specifically for dynamic and/or parametric analyses of relational learning processes. This paper focuses on recent studies to develop (1) quality computer-based programmed instruction for supporting relational learning in children with autism spectrum disorders and intellectual disabilities and (2) formal algorithms that permit ongoing, dynamic assessment of learner performance and procedure changes to optimize instructional efficacy and efficiency. Because these algorithms have a strong basis in evidence and in theories of stimulus control, they may also have utility for basic and translational research. We present an overview of the research program, details of algorithm features, and summary results that illustrate their possible benefits. We also argue that such algorithm development may encourage parametric research, help in integrating new research findings, and support in-depth quantitative analyses of stimulus control processes in relational learning. Such algorithms may also serve to model control of basic behavioral processes, which is important to the design of effective programmed instruction for human learners with and without functional disabilities. Copyright © 2018 Elsevier B.V. All rights reserved.
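As a hedged illustration of what an ongoing, dynamic assessment algorithm can look like, the sketch below adjusts instructional difficulty from a moving window of learner responses. The window size, the accuracy thresholds, and the function name are invented for illustration; they are not the authors' algorithms.

```python
def adjust_level(responses, level, window=10, step_up=0.9, step_down=0.6):
    """Move the learner up a level after sustained high accuracy in the
    most recent window of trials, down a level after sustained low
    accuracy (illustrative mastery rule, not the published algorithm)."""
    recent = responses[-window:]          # most recent trials (1 = correct)
    accuracy = sum(recent) / len(recent)
    if accuracy >= step_up:
        return level + 1
    if accuracy < step_down:
        return max(level - 1, 0)          # never drop below the entry level
    return level

# ten consecutive correct responses would advance the learner one level
next_level = adjust_level([1, 1, 1, 1, 1, 1, 1, 1, 1, 1], level=2)
```

A real implementation would also change procedure parameters (prompting, stimulus set size) alongside the level, but the windowed-accuracy decision rule captures the basic idea of adapting instruction to performance online.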
Bayesian models: A statistical primer for ecologists
Hobbs, N. Thompson; Hooten, Mevin B.
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. The book presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians; covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more; deemphasizes computer coding in favor of basic principles; and explains how to write out properly factored statistical expressions representing Bayesian models.
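As a toy illustration of the Markov chain Monte Carlo machinery such a primer builds up to, the sketch below draws posterior samples for the mean of normally distributed observations (known standard deviation, flat prior) with a minimal Metropolis sampler. This is a didactic sketch under stated assumptions, not code from the book; the data and tuning parameters are invented.

```python
import math
import random

def metropolis_mean(data, sigma=1.0, steps=20000, proposal_sd=0.5, seed=42):
    """Minimal Metropolis sampler for the mean of a normal likelihood with
    known sigma and a flat prior (illustrative only)."""
    rng = random.Random(seed)

    def log_like(mu):
        # log-likelihood up to an additive constant
        return -sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2)

    mu = 0.0
    samples = []
    for _ in range(steps):
        cand = mu + rng.gauss(0.0, proposal_sd)   # symmetric random walk
        delta = log_like(cand) - log_like(mu)
        # accept with probability min(1, likelihood ratio)
        if rng.random() < math.exp(min(0.0, delta)):
            mu = cand
        samples.append(mu)
    burn = steps // 4                              # discard burn-in
    return sum(samples[burn:]) / len(samples[burn:])

data = [2.1, 1.9, 2.3, 2.2, 1.8]   # sample mean 2.06
posterior_mean = metropolis_mean(data)
```

With a flat prior the posterior mean equals the sample mean, so the sampler's estimate should land near 2.06; longer chains and convergence diagnostics are what separate this sketch from practice.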
CRISPR-Cas9: a promising tool for gene editing on induced pluripotent stem cells
Kim, Eun Ji; Kang, Ki Ho; Ju, Ji Hyeon
2017-01-01
Recent advances in genome editing with programmable nucleases have opened up new avenues for multiple applications, from basic research to clinical therapy. The ease of use of the technology—and particularly clustered regularly interspaced short palindromic repeats (CRISPR)—will allow us to improve our understanding of genomic variation in disease processes via cellular and animal models. Here, we highlight the progress made in correcting gene mutations in monogenic hereditary disorders and discuss various CRISPR-associated applications, such as cancer research, synthetic biology, and gene therapy using induced pluripotent stem cells. The challenges, ethical issues, and future prospects of CRISPR-based systems for human research are also discussed. PMID:28049282
Highway Maintenance Equipment Operator: Basic Core. Training Materials.
ERIC Educational Resources Information Center
Perky, Sandra Dutreau; And Others
This basic core curriculum is part of a three-part series of instructional guides designed for use in teaching a course in highway maintenance equipment operation. Addressed in the individual units of the curriculum, after an orientation unit, are safety; basic math; basic hand tools; procedures for loading, lashing, and unloading equipment;…
Sestoft, Peter
2011-01-01
Research relies on ever larger amounts of data from experiments, automated production equipment, questionnaires, time series such as weather records, and so on. A major task in science is to combine, process and analyse such data to obtain evidence of patterns and correlations. Most research data are in digital form, which in principle ensures easy processing and analysis, easy long-term preservation, and easy reuse in future research, perhaps in entirely unanticipated ways. However, in practice, obstacles such as incompatible or undocumented data formats, poor data quality and lack of familiarity with current technology prevent researchers from making full use of available data. This paper argues that relational databases are excellent tools for veterinary research and animal production; provides a small example to introduce basic database concepts; and points out some concerns that must be addressed when organizing data for research purposes.
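In the spirit of the paper's small introductory example, the sketch below uses Python's built-in sqlite3 module to show the basic relational idea: one table per kind of entity, with rows joined through keys at query time. The table names, columns, and data are invented for illustration and are not taken from the paper.

```python
import sqlite3

# in-memory database; a file path instead of ":memory:" would persist the data
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE animal (id INTEGER PRIMARY KEY, herd TEXT)")
con.execute("CREATE TABLE weighing (animal_id INTEGER REFERENCES animal(id),"
            " day INTEGER, kg REAL)")
con.executemany("INSERT INTO animal VALUES (?, ?)",
                [(1, "north"), (2, "north"), (3, "south")])
con.executemany("INSERT INTO weighing VALUES (?, ?, ?)",
                [(1, 0, 41.0), (1, 30, 55.5), (2, 0, 39.0),
                 (2, 30, 52.0), (3, 0, 44.0), (3, 30, 50.5)])

# join the two tables and aggregate: mean day-30 weight per herd
rows = con.execute(
    "SELECT herd, AVG(kg) FROM weighing JOIN animal ON animal.id = animal_id "
    "WHERE day = 30 GROUP BY herd ORDER BY herd").fetchall()
```

Because measurements live in their own table keyed to the animal, new weighings, new herds, or entirely new measurement types can be added without restructuring existing data — the reuse property the paper argues for.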
Time management: a review for physicians.
Brunicardi, F. C.; Hobson, F. L.
1996-01-01
This article reviews the basic concepts and techniques of time management as they relate to a medical lifestyle. Essential tools are described to help the physician reassess and sharpen skills for handling intensifying demands and constraints of juggling patient care, research, teaching, and family responsibilities. The historical background and principles of time management for three popular "best selling" techniques are critiqued. In addition, a fourth technique, or model, of time management is introduced for physician use. PMID:8855650
Artificial intelligence in public health prevention of legionelosis in drinking water systems.
Sinčak, Peter; Ondo, Jaroslav; Kaposztasova, Daniela; Virčikova, Maria; Vranayova, Zuzana; Sabol, Jakub
2014-08-21
Good quality water supplies and safe sanitation in urban areas are a big challenge for governments throughout the world. Providing adequate water quality is a basic requirement for our lives. The colony forming units of the bacterium Legionella pneumophila in potable water represent a big problem which cannot be overlooked for health protection reasons. We analysed several methods to program a virtual hot water tank with AI (artificial intelligence) tools, including neuro-fuzzy systems, as a precaution against legionelosis. The main goal of this paper is to present research which simulates the temperature profile in the water tank. This research presents a tool for a water management system to simulate conditions that can prevent legionelosis outbreaks in a water system. The challenge is to create a virtual water tank simulator, including the water environment, which can simulate situations common in building water distribution systems. The key feature of the presented system is its adaptation to any hot water tank. While respecting the basic parameters of hot water, a water supplier and building maintainer are required to ensure the predefined quality and water temperature at each sampling site and avoid the growth of Legionella. The presented system is one small contribution toward overcoming situations in which legionelosis could find good conditions to spread and jeopardize human lives.
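The temperature-profile simulation the paper describes can be caricatured with a first-order mixing model: the tank cools toward ambient and is driven toward a setpoint when the heater runs, and the fraction of time spent in the bacterial growth band is a crude risk indicator. The coefficients, the 20–45 °C band, and the function names below are illustrative assumptions, not the authors' neuro-fuzzy model.

```python
def simulate_tank(t_start, t_ambient, heater_setpoint, heater_on,
                  k_loss=0.02, k_heat=0.10, dt=1.0, steps=120):
    """First-order hot-water-tank model (illustrative): water cools toward
    ambient; when the heater is on it is driven toward the setpoint.
    Returns the temperature trace in degrees Celsius, one value per step."""
    temps = [t_start]
    t = t_start
    for _ in range(steps):
        t += k_loss * (t_ambient - t) * dt            # heat loss to the room
        if heater_on:
            t += k_heat * (heater_setpoint - t) * dt  # heater input
        temps.append(t)
    return temps

def growth_band_fraction(temps, low=20.0, high=45.0):
    """Fraction of the trace inside a commonly cited Legionella growth band
    (band limits are an assumption for this sketch)."""
    return sum(low <= t <= high for t in temps) / len(temps)

# heater switched off: the tank drifts down into the growth band
trace = simulate_tank(t_start=60.0, t_ambient=20.0,
                      heater_setpoint=60.0, heater_on=False)
risk = growth_band_fraction(trace)
```

Even this toy model shows the operational point of the paper: with the heater off, most of the simulated trace sits inside the growth band, which is exactly the condition a water management system should detect and prevent.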
Historical Contributions to Vertical Flight at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Hodges, William T.; Gorton, Susan A.; Jackson, Karen E.
2016-01-01
The NASA Langley Research Center has had a long and distinguished history in powered lift technology development. This research has formed the foundation of knowledge for the powered lift community worldwide. From aerodynamics to structures, aeromechanics, powered lift, acoustics, materials, stability & control, structural dynamics, and human factors, Langley has made significant contributions to the advancement of vertical lift technologies. This research has encompassed basic phenomenological studies, subscale laboratory testing, analytical tool development, applied demonstrations, and full-scale flight testing. Since the dedication of Langley in 1920, the Center has contributed to the understanding, design, analysis, and flight-test development of experimental and production V/STOL configurations. This paper chronicles significant areas of research through the decades from 1920 to 2015, with historical photographs and references.
Initiating Young Children into Basic Astronomical Concepts and Phenomena
NASA Astrophysics Data System (ADS)
Kallery, M.
2010-07-01
In the present study we developed and implemented three units of activities aimed at acquainting very young children with basic astronomical concepts and phenomena, such as the sphericity of the earth, the earth's movements, and the day/night cycle. The activities were developed by a group composed of a researcher/facilitator and six early-years teachers. In the activities, children were presented with age-appropriate scientific information along with conceptual tools such as a globe and an instructional video. Action research processes were used to optimize classroom practices and to gather useful information for the final shaping of the activities and the instruction materials. The approach to learning adopted in these activities can be characterized as socially constructed. The results indicated awareness of the concepts and phenomena addressed by the activities in high percentages of children, storage of the new knowledge in long-term memory with easy retrieval, and children's enthusiasm for the subject.
Tools. Unit 9: A Core Curriculum of Related Instruction for Apprentices.
ERIC Educational Resources Information Center
New York State Education Dept., Albany. Bureau of Occupational and Career Curriculum Development.
The tool handling unit is presented to assist apprentices in acquiring general knowledge of the use of various basic tools. The unit consists of seven modules: (1) introduction to hand tools and small power tools; (2) measuring tools: layout and measuring tools for woodworking; (3) measuring tools: feeler gauge, micrometer, and torque wrench; (4)…
Microplasmas, a platform technology for a plethora of plasma applications
NASA Astrophysics Data System (ADS)
Becker, Kurt
2017-08-01
Publications describing microplasmas, which are commonly defined as plasmas with at least one dimension in the submillimeter range, began to appear in the scientific literature about 20 years ago. As discussed in a recent review by Schoenbach and Becker [1], interest and activity in basic microplasma research, as well as in the use of microplasmas for a variety of applications, has increased significantly over the past 20 years. The number of papers devoted to basic microplasma science increased by an order of magnitude between 1995 and 2015, a count that excludes publications dealing exclusively with technological applications of microplasmas, where the microplasma is used solely as a tool. In reference [1], the authors limited the topical coverage largely to the status of microplasma science and our understanding of the physics principles that enable microplasma operation, and further stated that the rapid proliferation of microplasma applications made it impossible to cover both basic microplasma science and their applications in a single review article.
The Empirical Foundations of Telepathology: Evidence of Feasibility and Intermediate Effects
Krupinski, Elizabeth A.; Weinstein, Ronald S.; Dunn, Matthew R.; Bashshur, Noura
2017-01-01
Introduction: Telepathology evolved from video microscopy (i.e., “television microscopy”) research in the early 1950s to video microscopy used in basic research in the biological sciences to a basic diagnostic tool in telemedicine clinical applications. Its genesis can be traced to pioneering feasibility studies regarding the importance of color and other image-based parameters for rendering diagnoses and a series of studies assessing concordance of virtual slide and light microscopy diagnoses. This article documents the empirical foundations of telepathology. Methods: A selective review of the research literature during the past decade (2005–2016) was conducted using robust research design and adequate sample size as criteria for inclusion. Conclusions: The evidence regarding feasibility/acceptance of telepathology and related information technology applications has been well documented for several decades. The majority of evidentiary studies focused on intermediate outcomes, as indicated by comparability between telepathology and conventional light microscopy. A consistent trend of concordance between the two modalities was observed in terms of diagnostic accuracy and reliability. Additional benefits include use of telepathology and whole slide imaging for teaching, research, and outreach to resource-limited countries. Challenges still exist, however, in terms of use of telepathology as an effective diagnostic modality in clinical practice. PMID:28170313
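Concordance between telepathology and light-microscopy diagnoses in studies like these is commonly summarized with percent agreement and Cohen's kappa, which corrects agreement for chance. The sketch below computes both for a pair of diagnosis lists; the data are invented, not drawn from any reviewed study.

```python
from collections import Counter

def agreement_and_kappa(ratings_a, ratings_b):
    """Percent agreement and Cohen's kappa for two raters over the same
    cases (e.g., telepathology vs. conventional light microscopy)."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # chance agreement from each rater's marginal label frequencies
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

# hypothetical diagnoses for 10 slides: benign (B) or malignant (M)
tele = ["B", "B", "M", "M", "B", "M", "B", "B", "M", "B"]
light = ["B", "B", "M", "M", "B", "B", "B", "B", "M", "B"]
observed, kappa = agreement_and_kappa(tele, light)
```

Here the raters agree on 9 of 10 slides, but kappa is lower than the raw agreement because some of that agreement is expected by chance given the benign/malignant base rates.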
Collaborating and sharing data in epilepsy research.
Wagenaar, Joost B; Worrell, Gregory A; Ives, Zachary; Dümpelmann, Matthias; Litt, Brian; Schulze-Bonhage, Andreas
2015-06-01
Technological advances are dramatically accelerating translational research in epilepsy. Neurophysiology, imaging, and metadata are now recorded digitally in most centers, enabling quantitative analysis. Basic and translational research opportunities to use these data are exploding, but academic and funding cultures prevent this potential from being realized. Research on epileptogenic networks, antiepileptic devices, and biomarkers could progress rapidly if collaborative efforts to digest this "big neuro data" could be organized. Higher temporal and spatial resolution data are driving the need for novel multidimensional visualization and analysis tools. Crowd-sourced science, of the kind that drives innovation in computer science, could easily be mobilized for these tasks, were it not for competition for funding and attribution, and the lack of standard data formats and platforms. As these efforts mature, there is a great opportunity to advance epilepsy research through data sharing and increased collaboration within the international research community.
32 CFR 272.3 - Definition of basic research.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 2 2014-07-01 2014-07-01 false Definition of basic research. 272.3 Section 272...) MISCELLANEOUS ADMINISTRATION AND SUPPORT OF BASIC RESEARCH BY THE DEPARTMENT OF DEFENSE § 272.3 Definition of basic research. Basic research is systematic study directed toward greater knowledge or understanding of...
32 CFR 272.3 - Definition of basic research.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 2 2012-07-01 2012-07-01 false Definition of basic research. 272.3 Section 272...) MISCELLANEOUS ADMINISTRATION AND SUPPORT OF BASIC RESEARCH BY THE DEPARTMENT OF DEFENSE § 272.3 Definition of basic research. Basic research is systematic study directed toward greater knowledge or understanding of...
NASA Technical Reports Server (NTRS)
Presser, L.
1978-01-01
An integrated set of FORTRAN tools that are commercially available is described. The basic purpose of various tools is summarized and their economic impact highlighted. The areas addressed by these tools include: code auditing, error detection, program portability, program instrumentation, documentation, clerical aids, and quality assurance.
Washington State water quality temperature standards as related to reactor operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ballowe, J.W.
1968-08-14
The purpose of this report is to provide a basic working tool for determining the relationship between the allowable temperature increase within the Columbia River reach at the Hanford Site and the actual temperature increase associated with various reactor operating modes. This basic tool can be utilized for day-to-day operating purposes or for compiling historical information.
ERIC Educational Resources Information Center
Anoka-Hennepin Technical Coll., Minneapolis, MN.
This set of two training outlines and one basic skills set list are designed for a machine tool technology program developed during a project to retrain defense industry workers at risk of job loss or dislocation because of conversion of the defense industry. The first troubleshooting training outline lists the categories of problems that develop…
Prony Ringdown GUI (CERTS Prony Ringdown, part of the DSI Tool Box)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuffner, Francis; Marinovici, PNNL Laurentiu; Hauer, PNNL John
2014-02-21
The PNNL Prony Ringdown graphical user interface is one analysis tool included in the Dynamic System Identification toolbox (DSI Toolbox). The DSI Toolbox is a MATLAB-based collection of tools for parsing and analyzing phasor measurement unit data, especially with regard to small signal stability. It includes tools to read the data, preprocess it, and perform small signal analysis. The DSI Toolbox is designed to provide a research environment for examining phasor measurement unit data and performing small signal stability analysis. The software uses a series of text-driven menus to help guide users and organize the toolbox features. Methods for reading in phasor measurement unit data are provided, with appropriate preprocessing options for small-signal-stability analysis. The toolbox includes the Prony Ringdown GUI and basic algorithms to estimate information on oscillatory modes of the system, such as modal frequency and damping ratio.
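Although the toolbox itself implements full Prony analysis, the idea of extracting a modal frequency and damping ratio from a ringdown can be illustrated with a much simpler single-mode peak-fitting sketch. This is pure Python on an invented synthetic signal, not the PNNL code, and it assumes a single dominant mode with no noise.

```python
import math

def ringdown_estimate(samples, dt):
    """Estimate modal frequency (Hz) and damping ratio of a single damped
    sinusoid from its positive peaks (illustrative, not a Prony fit)."""
    peaks = [(i * dt, samples[i]) for i in range(1, len(samples) - 1)
             if samples[i - 1] < samples[i] > samples[i + 1] and samples[i] > 0]
    # spacing between successive positive peaks equals the modal period
    spacings = [t2 - t1 for (t1, _), (t2, _) in zip(peaks, peaks[1:])]
    freq = 1.0 / (sum(spacings) / len(spacings))
    # decay rate sigma from a least-squares line through log(peak amplitude)
    ts = [t for t, _ in peaks]
    logs = [math.log(a) for _, a in peaks]
    tbar, lbar = sum(ts) / len(ts), sum(logs) / len(logs)
    sigma = -sum((t - tbar) * (l - lbar) for t, l in zip(ts, logs)) \
            / sum((t - tbar) ** 2 for t in ts)
    omega = 2 * math.pi * freq
    # damping ratio zeta = sigma / sqrt(sigma^2 + omega^2)
    return freq, sigma / math.sqrt(sigma ** 2 + omega ** 2)

# synthetic 0.8 Hz ringdown with 0.1 1/s decay rate, sampled at 50 Hz
dt = 0.02
signal = [math.exp(-0.1 * k * dt) * math.cos(2 * math.pi * 0.8 * k * dt)
          for k in range(500)]
freq, zeta = ringdown_estimate(signal, dt)
```

Prony analysis generalizes this to multiple simultaneous modes and noisy data by fitting a sum of damped exponentials, which is why the real toolbox uses it instead of peak picking.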
DATA-MEAns: an open source tool for the classification and management of neural ensemble recordings.
Bonomini, María P; Ferrandez, José M; Bolea, Jose Angel; Fernandez, Eduardo
2005-10-30
The number of laboratories using techniques that allow the simultaneous recording of as many units as possible is increasing considerably. However, the development of tools used to analyse this multi-neuronal activity generally lags behind the development of the tools used to acquire the data. Moreover, data exchange between research groups using different multielectrode acquisition systems is hindered by commercial constraints such as exclusive file structures, high-priced licenses and hard policies on intellectual rights. This paper presents free open-source software for the classification and management of neural ensemble data. The main goal is to provide a graphical user interface that links the experimental data to a basic set of routines for analysis, visualization and classification in a consistent framework. To facilitate adaptation and extension, as well as the addition of new routines, tools and algorithms for data analysis, the source code and documentation are freely available.
Debecker, Damien P; Gaigneaux, Eric M; Busca, Guido
2009-01-01
Hydrotalcites offer unique basic properties that make them very attractive for catalytic applications. It is of primary interest to make use of accurate tools for probing the basicity of hydrotalcite-based catalysts for the purpose of 1) fundamental understanding of base-catalysed processes with hydrotalcites and 2) optimisation of the catalytic performance achieved in reactions of industrial interest. Techniques based on probe molecules, titration techniques and test reactions along with physicochemical characterisation are overviewed in the first part of this review. The aim is to provide the tools for understanding how series of parameters involved in the preparation of hydrotalcite-based catalytic materials can be employed to control and adapt the basic properties of the catalyst towards the basicity demanded by each target chemical reaction. An overview of recent and significant achievements in that perspective is presented in the second part of the paper.
Reducing Information Overload in Large Seismic Data Sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.
2000-08-02
Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g., single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important for bridging gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts focus on tools to help the researcher winnow the clusters defined with the dendrogram tool down to a minimal optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site.
As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom-built content analysis tools help identify dataset characteristics that can potentially provide a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics are identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. These content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical information on events in the research datasets. The tools can also give researchers a quick way to find interesting and useful events within the research datasets. The tools could also be used as a means to review reference event datasets as part of a dataset delivery verification process. There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia. All of these software tools would give the researcher unprecedented power without having to learn the intricacies and complexities of relational database systems.
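The clustering approach described above can be illustrated with a short sketch. This is not the MatSeis dendrogram tool itself, just a minimal stand-in: pairwise waveform correlations are converted to distances, fed to single-linkage hierarchical clustering, and cut into flat event clusters. All data here are synthetic.

```python
# Illustrative sketch (not the MatSeis dendrogram tool): hierarchical clustering
# of events from pairwise waveform correlations, using zero-lag correlation only.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Synthetic stand-in for recorded waveforms: two repeating event "families".
template_a, template_b = rng.standard_normal(200), rng.standard_normal(200)
waveforms = [t + 0.2 * rng.standard_normal(200)
             for t in [template_a] * 3 + [template_b] * 3]

# Distance = 1 - normalized cross-correlation at zero lag (simplified).
n = len(waveforms)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        a, b = waveforms[i], waveforms[j]
        cc = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        dist[i, j] = dist[j, i] = 1.0 - cc

# Condensed distance vector -> single-linkage tree -> flat clusters at a cutoff.
condensed = dist[np.triu_indices(n, k=1)]
labels = fcluster(linkage(condensed, method="single"),
                  t=0.5, criterion="distance")
print(labels)  # events 0-2 and 3-5 fall into two separate clusters
```

A real implementation would additionally search over lags, apply the filtering and windowing controls mentioned in the abstract, and combine distances across stations.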
[Tissue repositories for research at Sheba Medical Center (SMC)].
Cohen, Yehudit; Barshack, Iris; Onn, Amir
2013-06-01
Cancer is the number one cause of death in both genders. Breakthroughs in the understanding of cancer biology, the identification of prognostic factors, and the development of new treatments are increasingly dependent on access to human cancer tissues with linked clinicopathological data. Access to human tumor samples and a large investment in translational research are needed to advance this research. The SMC tissue repositories provide researchers with biological materials that are essential tools for cancer research. The SMC tissue repositories for research aim to collect, document and preserve human biospecimens from patients with cancerous diseases, in order to provide the highest-quality, well-annotated biological biospecimens needed to meet the growing demands of scientific research. Such repositories are partners in accelerating biomedical research and medical product development through clinical resources, in order to offer the best options to patients. Following Institutional Review Board approval and the signing of an informed consent form, tumor and tumor-free specimens are collected by a designated pathologist in the operating room, only when there is a sufficient amount of the tumor in excess of routine diagnostic needs. Blood samples are collected prior to the procedure. Other types of specimens collected include ascites fluid, pleural effusion, tissues for optimal cutting temperature (OCT) embedding, primary culture, etc. Demographic, clinical, pathological, and follow-up data are collected in a designated database. SMC has already established several organ- or disease-specific tissue repositories within different departments. The foundation of tissue repositories requires the concentrated effort of a multidisciplinary team composed of paramedical, medical and scientific professionals.
Research projects using these specimens facilitate the development of 'targeted therapy', accelerate basic research aimed at clarifying molecular mechanisms involved in cancer, and support the development of novel diagnostic tools.
Principle considerations for the use of transcriptomics in doping research.
Neuberger, Elmo W I; Moser, Dirk A; Simon, Perikles
2011-10-01
Over the course of the past decade, technical progress has enabled scientists to investigate genome-wide RNA expression using microarray platforms. This transcriptomic approach represents a promising tool for the discovery of basic gene expression patterns and for identification of cellular signalling pathways under various conditions. Since doping substances have been shown to influence mRNA expression, it has been suggested that these changes can be detected by screening the blood transcriptome. In this review, we critically discuss the potential but also the pitfalls of this application as a tool in doping research. Transcriptomic approaches were considered to potentially provide researchers with a unique gene expression signature or with a specific biomarker for various physiological and pathophysiological conditions. Since transcriptomic approaches are considerably prone to biological and technical confounding factors that act on study subjects or samples, very strict guidelines for the use of transcriptomics in human study subjects have been developed. Typical field conditions associated with doping controls limit the feasibility of following these strict guidelines as there are too many variables counteracting a standardized procedure. After almost a decade of research using transcriptomic tools, it still remains a matter of future technological progress to identify the ultimate biomarker using technologies and/or methodologies that are sufficiently robust against typical biological and technical bias and that are valid in a court of law. Copyright © 2011 John Wiley & Sons, Ltd.
Gammon, Deede; Strand, Monica; Eng, Lillian Sofie
2014-01-09
The involvement of persons with lived experiences of mental illness and service use is increasingly viewed as key to improving the relevance and utility of mental health research and service innovation. Guided by the principles of Community-Based Participatory Research we developed an online tool for assisted self-help in mental health. The resulting tool, PsyConnect, is ready for testing in two communities starting 2014. This case study reports from the design phase which entailed clarifying very basic questions: Who is the primary target group? What are the aims? What functions are priorities? Roles and responsibilities? What types of evidence can legitimize tool design decisions? Here we highlight the views of service users as a basis for discussing implications of user involvement for service design and research. PsyConnect has become a tool for those who expect to need assistance over long periods of time regardless of their specific condition(s). The aim is to support service users in gaining greater overview and control, legitimacy, and sense of continuity in relationships. It has a personalized "my control panel" which depicts status → process → goals. Functionality includes support for: mapping life domains; medication overview; crisis management; coping exercises; secure messaging; and social support. While the types of evidence that can legitimize design decisions are scattered and indirectly relevant, recent trends in recovery research will be used to guide further refinements. PsyConnect has undoubtedly become something other than it would have been without careful attention to the views of service users. The tool invites a proactive approach that is likely to challenge treatment cultures that are reactive, disorder-focused and consultation-based. Service user representatives will need to play central roles in training peers and clinicians in order to increase the likelihood of tool usage in line with intentions. 
Similarly, their influence on tool design has implications for the choice of evaluation methods. Starting down the path of service user involvement in intervention design fosters commitment to follow through in the remaining implementation and research phases. While this can be time-consuming and less rewarding for researchers in terms of academic merit, it is probably vital to increasing the likelihood of success of person-centered service innovations.
Electrotransformation and clonal isolation of Rickettsia species
Riley, Sean P; Macaluso, Kevin R; Martinez, Juan J
2015-01-01
Genetic manipulation of obligate intracellular bacteria of the genus Rickettsia is currently undergoing a rapid period of change. The development of viable genetic tools, including replicative plasmids, transposons, homologous recombination, fluorescent protein-encoding genes, and antibiotic selectable markers has provided the impetus for future research development. This unit is designed to coalesce the basic methods pertaining to creation of genetically modified Rickettsia. The unit describes a series of methods, from inserting exogenous DNA into Rickettsia to the final isolation of genetically modified bacterial clones. Researchers working towards genetic manipulation of Rickettsia or similar obligate intracellular bacteria will find these protocols to be a valuable reference. PMID:26528784
Advances in natural language processing.
Hirschberg, Julia; Manning, Christopher D
2015-07-17
Natural language processing employs computational techniques for the purpose of learning, understanding, and producing human language content. Early computational approaches to language research focused on automating the analysis of the linguistic structure of language and developing basic technologies such as machine translation, speech recognition, and speech synthesis. Today's researchers refine and make use of such tools in real-world applications, creating spoken dialogue systems and speech-to-speech translation engines, mining social media for information about health or finance, and identifying sentiment and emotion toward products and services. We describe successes and challenges in this rapidly advancing area. Copyright © 2015, American Association for the Advancement of Science.
Research pressure instrumentation for NASA Space Shuttle main engine, modification no. 5
NASA Technical Reports Server (NTRS)
Anderson, P. J.; Nussbaum, P.; Gustafson, G.
1984-01-01
The objective of the research project described is to define and demonstrate methods to advance the state of the art of pressure sensors for the space shuttle main engine (SSME). Silicon piezoresistive technology was utilized in completing tasks: generation and testing of three transducer design concepts for solid state applications; silicon resistor characterization at cryogenic temperatures; experimental chip mounting characterization; frequency response optimization and prototype design and fabrication. Excellent silicon sensor performance was demonstrated at liquid nitrogen temperature. A silicon resistor ion implant dose was customized for SSME temperature requirements. A basic acoustic modeling software program was developed as a design tool to evaluate frequency response characteristics.
Computer assisted screening, correction, and analysis of historical weather measurements
NASA Astrophysics Data System (ADS)
Burnette, Dorian J.; Stahle, David W.
2013-04-01
A computer program, Historical Observation Tools (HOB Tools), has been developed to facilitate many of the calculations used by historical climatologists to develop instrumental and documentary temperature and precipitation datasets and makes them readily accessible to other researchers. The primitive methodology used by the early weather observers makes the application of standard techniques difficult. HOB Tools provides a step-by-step framework to visually and statistically assess, adjust, and reconstruct historical temperature and precipitation datasets. These routines include the ability to check for undocumented discontinuities, adjust temperature data for poor thermometer exposures and diurnal averaging, and assess and adjust daily precipitation data for undercount. This paper provides an overview of the Visual Basic.NET program and a demonstration of how it can assist in the development of extended temperature and precipitation datasets using modern and early instrumental measurements from the United States.
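One of the checks listed above, screening for undocumented discontinuities, can be sketched in a few lines. This is a hypothetical illustration, not HOB Tools' actual routine: a candidate station is differenced against a reference series, and the breakpoint maximizing the shift in the mean of the difference series is flagged for inspection.

```python
# Illustrative sketch only (not the HOB Tools algorithm): flag a candidate
# undocumented discontinuity as the breakpoint that maximizes the mean shift
# of a candidate-minus-reference difference series.
import numpy as np

def flag_discontinuity(candidate, reference, min_seg=12):
    """Return (index, shift) of the largest before/after mean shift."""
    diff = np.asarray(candidate, float) - np.asarray(reference, float)
    best_idx, best_shift = None, 0.0
    for k in range(min_seg, len(diff) - min_seg):
        shift = abs(diff[k:].mean() - diff[:k].mean())
        if shift > best_shift:
            best_idx, best_shift = k, shift
    return best_idx, best_shift

# Synthetic monthly series with an artificial 1.5-degree jump at month 60,
# e.g. mimicking an unrecorded station move or exposure change.
rng = np.random.default_rng(1)
ref = rng.standard_normal(120)
cand = ref + 0.1 * rng.standard_normal(120)
cand[60:] += 1.5
idx, shift = flag_discontinuity(cand, ref)
print(idx, round(shift, 2))  # breakpoint near month 60, shift near 1.5
```

Production homogenization methods (e.g., standard normal homogeneity tests) are more elaborate; the point here is only the shape of the computation a tool like this automates.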
[Dietopro.com: a new tool for dietotherapeutical management based on cloud computing technology].
García, Candido Gabriel; Sebastià, Natividad; Blasco, Esther; Soriano, José Miguel
2014-09-01
Dietotherapeutical software applications are now a basic tool in the dietary management of patients, from both a physiological and a pathological point of view. New technologies and research in this area have favored the emergence of new applications for dietary and nutritional management that facilitate the running of a dietotherapeutical practice. The objective was to comparatively study the main dietotherapeutical applications on the market in order to give diet and nutrition professionals criteria for selecting one of these tools. Dietopro.com is, from our point of view, one of the most comprehensive applications for the dietotherapeutical management of patients. Depending on the needs of the user, different dietary software options are available. We conclude that no application is better or worse than another; rather, applications are more or less adapted to the needs of professionals. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
NASA Technical Reports Server (NTRS)
Lickly, Ben
2005-01-01
Data from all current JPL missions are stored in files called SPICE kernels. At present, animators who want to use data from these kernels have to either read through the kernels looking for the desired data, or write programs themselves to retrieve information about all the needed objects for their animations. In this project, methods of automating the process of importing the data from the SPICE kernels were researched. In particular, tools were developed for creating basic scenes in Maya, a 3D computer graphics software package, from SPICE kernels.
NASA Astrophysics Data System (ADS)
Gordova, Yulia; Okladnikov, Igor; Titov, Alexander; Gordov, Evgeny
2016-04-01
While there is a strong demand for innovation in digital learning, available training programs in the environmental sciences cannot keep pace with rapid changes in the domain content. A joint group of scientists and university teachers develops and implements an educational environment for new learning experiences in the basics of climate science and its applications. This so-called virtual learning laboratory "Climate" contains educational materials and interactive training courses developed to provide undergraduate and graduate students with a profound understanding of changes in regional climate and environment. The main feature of this laboratory is that students perform their computational tasks on climate modeling and on the evaluation and assessment of climate change using the typical tools of the "Climate" information-computational system, the same tools used by practitioners performing this kind of research. Students have the opportunity to carry out computational laboratory work using the information-computational tools of the system, improving their skills in using the tools while mastering the subject. We did not create an artificial learning environment for the training. On the contrary, the main purpose of combining the educational block with the computational information system was to familiarize students with real, existing technologies for monitoring and analyzing data on the state of the climate. The training is based on technologies and procedures that are typical of the Earth system sciences. The educational courses are designed to let students conduct their own investigations of ongoing and future climate changes in a manner that is essentially identical to the techniques used by national and international climate research organizations.
All training is supported by lectures devoted to the basic aspects of modern climatology, including analysis of current climate change and its possible impacts, ensuring effective links between theory and practice. Along with its use in graduate and postgraduate education, "Climate" serves as a framework for a basic information course on climate change for the general public. This course describes the basic concepts and problems of modern climate change and its possible consequences for non-specialists. The course will also include links to relevant information resources on topical issues of the Earth sciences and a number of case studies carried out for a selected region to consolidate the knowledge gained.
Molecular epidemiology: new rules for new tools?
Merlo, Domenico Franco; Sormani, Maria Pia; Bruzzi, Paolo
2006-08-30
Molecular epidemiology combines biological markers and epidemiological observations in the study of the environmental and genetic determinants of cancer and other diseases. The potential advantages associated with biomarkers are manifold and include: (a) increased sensitivity and specificity to carcinogenic exposures; (b) more precise evaluation of the interplay between genetic and environmental determinants of cancer; (c) earlier detection of carcinogenic effects of exposure; (d) characterization of disease subtype-etiology patterns; (e) evaluation of primary prevention measures. These, in turn, may translate into better tools for etiologic research, individual risk assessment, and, ultimately, primary and secondary prevention. An area that has not received sufficient attention concerns the validation of these biomarkers as surrogate endpoints for cancer risk. Validation of a candidate biomarker's surrogacy is the demonstration that it possesses the properties required for its use as a substitute for a true endpoint. The principles underlying the validation process have undergone remarkable development and discussion in therapeutic research. However, the challenges posed by the application of these principles to epidemiological research, where the basic tool for this validation (i.e., the randomized study) is seldom possible, have not been thoroughly explored. The validation process of surrogacy must be applied rigorously to intermediate biomarkers of cancer risk before using them as risk predictors at the individual as well as at the population level.
Nirmalanandhan, Victor Sanjit; Sittampalam, G Sitta
2009-08-01
Stem cells, irrespective of their origin, have emerged as valuable reagents or tools in human health over the past two decades. Initially a research tool for studying fundamental aspects of developmental biology, stem cells are now the central focus of efforts in generating transgenic animals, in drug discovery, and in regenerative medicine to address degenerative diseases of multiple organ systems. This is because stem cells are pluripotent or multipotent cells that can recapitulate developmental paths to repair damaged tissues. However, it is becoming clear that stem cell therapy alone may not be adequate to reverse tissue and organ damage in degenerative diseases. Existing small-molecule drugs and biologicals may be needed as "molecular adjuvants" or enhancers of stem cells administered in therapy or of adult stem cells in the diseased tissues. Hence, a combination of stem cell-based high-throughput screening and 3D tissue engineering approaches is necessary to advance the next wave of tools in preclinical drug discovery. In this review, the authors have attempted to provide a basic account of various stem cell types, as well as their biology and signaling, in the context of research in regenerative medicine. An attempt is made to link stem cells as reagents, pharmacology, and tissue engineering as converging fields of research for the next decade.
Sjögren, Jonathan; Andersson, Linda; Mejàre, Malin; Olsson, Fredrik
2017-01-01
Fab fragments are valuable research tools in various areas of science including applications in imaging, binding studies, removal of Fc-mediated effector functions, mass spectrometry, infection biology, and many others. The enzymatic tools for the generation of Fab fragments have been discovered through basic research within the field of molecular bacterial pathogenesis. Today, these enzymes are widely applied as research tools and in this chapter, we describe methodologies based on bacterial enzymes to generate Fab fragments from both human and mouse IgG. For all human IgG subclasses, the IdeS enzyme from Streptococcus pyogenes has been applied to generate F(ab')2 fragments that subsequently can be reduced under mild conditions to generate a homogenous pool of Fab' fragments. The enzyme Kgp from Porphyromonas gingivalis has been applied to generate intact Fab fragments from human IgG1 and the Fab fragments can be purified using a CH1-specific affinity resin. The SpeB protease, also from S. pyogenes, is able to digest mouse IgGs and has been applied to digest antibodies and Fab fragments can be purified on light chain affinity resins. In this chapter, we describe methodologies that can be used to obtain Fab fragments from human and mouse IgG using bacterial proteases.
Enhance Your Science With Social Media: No ... Really
NASA Astrophysics Data System (ADS)
Goss, H.; Aiken, A. C.; Sams, A.
2016-12-01
The ability to communicate the societal value of basic research to nonacademic audiences is morphing from an optional soft skill to a crucial tool for scientists who are competing over finite or shrinking resources for research. Former National Academy of Sciences President Ralph Cicerone argued as early as 2006 that "scientists themselves must do a better job of communicating directly to the public," taking advantage of "new, non-traditional outlets" on the Internet. Findings suggest that scientists have begun to embrace social media as a viable tool for communicating research and keeping abreast of advancements in their fields. Social media is changing the way that scientists are interacting with each other and with the global community. Scientists are taking to popular social media (Twitter, Facebook, etc.) to challenge weak research, share replication attempts in real time, and counteract hype. Incorporating social media into the different stages of a scientific publication:
- Accelerates the pace of scientific communication and collaboration
- Facilitates interdisciplinary collaboration
- Makes it possible to communicate results to a large and diverse audience
- Encourages post-publication conversations about findings
- Accelerates research evaluation
- Makes science more transparent
- Amplifies the positive effects of scientists' interactions with more traditional media
Our presentation will demonstrate how scientists can use social media as a tool to support their work, collaborate with peers around the world, and advance the cause of science. Information will be presented by communications experts and research librarians in collaboration with scientists who are already active on social media. Content will focus on pragmatic best practices for engaging peers and other stakeholders, promoting science and scientific research, and measuring success.
Latifi, Rifat; Ziemba, Michelle; Leppäniemi, Ari; Dasho, Erion; Dogjani, Agron; Shatri, Zhaneta; Kociraj, Agim; Oldashi, Fatos; Shosha, Lida
2014-08-01
Trauma continues to be a major health problem worldwide, particularly in the developing world, with high mortality and morbidity. Yet most developing countries lack an organized trauma system. Furthermore, developing countries do not have in place any accreditation process for trauma centers; thus, no accepted standard assessment tools exist to evaluate their trauma services. The aims of this study were to evaluate the trauma system in Albania, using the basic trauma criteria of the American College of Surgeons/Committee on Trauma (ACS/COT) as assessment tools, and to provide the Government with a situational analysis relative to these criteria. We used the ACS/COT basic criteria as assessment tools to evaluate the trauma system in Albania. We conducted a series of semi-structured interviews, unstructured interviews, and focus groups with all stakeholders at the Ministry of Health, at the University Trauma Hospital (UTH) based in Tirana (the capital city), and at ten regional hospitals across the country. Albania has a dedicated national trauma center that serves as the only tertiary center, plus ten regional hospitals that provide some trauma care. However, overall, its trauma system is in need of major reforms involving all essential elements in order to meet the basic requirements of a structured trauma system. The ACS/COT basic criteria can be used as assessment tools to evaluate trauma care in developing countries. Further studies are needed in other developing countries to validate the applicability of these criteria.
CRISPR/Cas9 Immune System as a Tool for Genome Engineering.
Hryhorowicz, Magdalena; Lipiński, Daniel; Zeyland, Joanna; Słomski, Ryszard
2017-06-01
CRISPR/Cas (clustered regularly interspaced short palindromic repeats/CRISPR-associated) adaptive immune systems constitute a bacterial defence against invading nucleic acids derived from bacteriophages or plasmids. This prokaryotic system was adapted in molecular biology and became one of the most powerful and versatile platforms for genome engineering. CRISPR/Cas9 is a simple and rapid tool which enables the efficient modification of endogenous genes in various species and cell types. Moreover, a modified version of the CRISPR/Cas9 system with transcriptional repressors or activators allows robust transcription repression or activation of target genes. The simplicity of CRISPR/Cas9 has resulted in the widespread use of this technology in many fields, including basic research, biotechnology and biomedicine.
Lake Pontchartrain Basin: bottom sediments and related environmental resources
Manheim, Frank T.; Hayes, Laura
2002-01-01
Lake Pontchartrain is the largest estuary in southern Louisiana. It is an important recreational, commercial, and environmental resource for New Orleans and southeastern Louisiana. This publication is part of a 5-year cooperative program led by the USGS on the geological framework and sedimentary processes of the Lake Pontchartrain Basin. The presentation is divided into two main parts: (1) Scientific Research and Assessments and (2) Multimedia Tools and Regional Resources. The scientific sections include historical information on the area; shipboard, field, and remote sensing studies; and a comprehensive sediment database with geological and chemical discussions of the region. The multimedia and resources sections include Geographic Information System (GIS) tools and data, a video demonstrating vibracore sampling techniques in Lake Pontchartrain, and abstracts from four Basics of the Basin symposia.
Integration of Multidisciplinary Sensory Data:
Miller, Perry L.; Nadkarni, Prakash; Singer, Michael; Marenco, Luis; Hines, Michael; Shepherd, Gordon
2001-01-01
The paper provides an overview of neuroinformatics research at Yale University being performed as part of the national Human Brain Project. This research is exploring the integration of multidisciplinary sensory data, using the olfactory system as a model domain. The neuroinformatics activities fall into three main areas: 1) building databases and related tools that support experimental olfactory research at Yale and can also serve as resources for the field as a whole, 2) using computer models (molecular models and neuronal models) to help understand data being collected experimentally and to help guide further laboratory experiments, 3) performing basic neuroinformatics research to develop new informatics technologies, including a flexible data model (EAV/CR, entity-attribute-value with classes and relationships) designed to facilitate the integration of diverse heterogeneous data within a single unifying framework. PMID:11141511
The pits and falls of graphical presentation.
Sperandei, Sandro
2014-01-01
Graphics are powerful tools to communicate research results and to gain information from data. However, researchers should be careful when deciding which data to plot and the type of graphic to use, as well as other details. The consequence of bad decisions in these features varies from making research results unclear to distortions of these results, through the creation of "chartjunk" with useless information. This paper is not another tutorial about "good graphics" and "bad graphics". Instead, it presents guidelines for graphic presentation of research results and some uncommon, but useful examples to communicate basic and complex data types, especially multivariate model results, which are commonly presented only by tables. By the end, there are no answers here, just ideas meant to inspire others on how to create their own graphics.
Lausberg, Hedda; Sloetjes, Han
2016-09-01
As visual media spread to all domains of public and scientific life, nonverbal behavior is taking its place as an important form of communication alongside the written and spoken word. An objective and reliable method of analysis for hand movement behavior and gesture is therefore currently required in various scientific disciplines, including psychology, medicine, linguistics, anthropology, sociology, and computer science. However, no adequate common methodological standards have been developed thus far. Many behavioral gesture-coding systems lack objectivity and reliability, and automated methods that register specific movement parameters often fail to show validity with regard to psychological and social functions. To address these deficits, we have combined two methods, an elaborated behavioral coding system and an annotation tool for video and audio data. The NEUROGES-ELAN system is an effective and user-friendly research tool for the analysis of hand movement behavior, including gesture, self-touch, shifts, and actions. Since its first publication in 2009 in Behavior Research Methods, the tool has been used in interdisciplinary research projects to analyze a total of 467 individuals from different cultures, including subjects with mental disease and brain damage. Partly on the basis of new insights from these studies, the system has been revised methodologically and conceptually. The article presents the revised version of the system, including a detailed study of reliability. The improved reproducibility of the revised version makes NEUROGES-ELAN a suitable system for basic empirical research into the relation between hand movement behavior and gesture and cognitive, emotional, and interactive processes and for the development of automated movement behavior recognition methods.
Trinkaus, Hans L; Gaisser, Andrea E
2010-09-01
Nearly 30,000 individual inquiries are answered annually by the telephone cancer information service (CIS, KID) of the German Cancer Research Center (DKFZ). The aim was to develop a tool for evaluating these calls, and to support the complete counseling process interactively. A novel software tool is introduced, based on a structure similar to a music score. Treating the interaction as a "duet", guided by the CIS counselor, the essential contents of the dialogue are extracted automatically. For this, "trained speech recognition" is applied to the (known) counselor's part, and "keyword spotting" is used on the (unknown) client's part to pick out specific items from the "word streams". The outcomes fill an abstract score representing the dialogue. Pilot tests performed on a prototype of SACA (Software Assisted Call Analysis) resulted in a basic proof of concept: Demographic data as well as information regarding the situation of the caller could be identified. The study encourages following up on the vision of an integrated SACA tool for supporting calls online and performing statistics on its knowledge database offline. Further research perspectives are to check SACA's potential in comparison with established interaction analysis systems like RIAS. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
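As an illustration of the "keyword spotting" step described above, the sketch below scans a caller's word stream for items of interest. It is a toy example, not the SACA implementation; the keyword list and function name are invented for illustration.

```python
# Toy keyword spotting over a transcribed word stream; the keyword set
# below is invented for illustration, not taken from SACA.
KEYWORDS = {"chemotherapy", "diagnosis", "insurance", "pain"}

def spot_keywords(word_stream):
    """Return each keyword found, mapped to the index of its first occurrence."""
    hits = {}
    for i, word in enumerate(word_stream):
        w = word.lower().strip(".,?!")
        if w in KEYWORDS and w not in hits:
            hits[w] = i
    return hits

print(spot_keywords("my mother just got her diagnosis and starts chemotherapy".split()))
```

In a real system the word stream would come from the speech recognizer rather than a string split, but the extraction step is the same idea.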
ERIC Educational Resources Information Center
Fortney, Clarence; Gregory, Mike
This curriculum guide provides six units of instruction on basic welding. Addressed in the individual units of instruction are the following topics: employment opportunities for welders, welding safety and first aid, welding tools and equipment, basic metals and metallurgy, basic math and measuring, and procedures for applying for a welding job.…
Operational Research during the Ebola Emergency.
Fitzpatrick, Gabriel; Decroo, Tom; Draguez, Bertrand; Crestani, Rosa; Ronsse, Axelle; Van den Bergh, Rafael; Van Herp, Michel
2017-07-01
Operational research aims to identify interventions, strategies, or tools that can enhance the quality, effectiveness, or coverage of programs where the research is taking place. Médecins Sans Frontières admitted ≈5,200 patients with confirmed Ebola virus disease during the Ebola outbreak in West Africa and from the beginning nested operational research within its emergency response. This research covered critical areas, such as understanding how the virus spreads, clinical trials, community perceptions, challenges within Ebola treatment centers, and negative effects on non-Ebola healthcare. Importantly, operational research questions were decided to a large extent by returning volunteers who had first-hand knowledge of the immediate issues facing teams in the field. Such a method is appropriate for an emergency medical organization. Many challenges were also identified while carrying out operational research across 3 different countries, including the basic need for collecting data in standardized format to enable comparison of findings among treatment centers.
NASA Astrophysics Data System (ADS)
Naumova, A. V.; Khodanovich, M. Y.; Yarnykh, V. L.
2016-02-01
The Second International Conference and Young Scientist School ''Magnetic resonance imaging in biomedical research'' was held on the campus of the National Research Tomsk State University (Tomsk, Russia) on September 7-9, 2015. The conference was focused on magnetic resonance imaging (MRI) applications for biomedical research. The main goal was to bring together basic scientists, clinical researchers and developers of new MRI techniques to bridge the gap between clinical/research needs and advanced technological solutions. The conference fostered research and development in basic and clinical MR science and its application to health care. It also had an educational purpose: to promote understanding of cutting-edge MR developments. The conference provided an opportunity for researchers and clinicians to present their recent theoretical developments and practical applications, and to discuss unsolved problems. The program of the conference was divided into three main topics. The first day of the conference was devoted to educational lectures on the fundamentals of MRI physics and image acquisition/reconstruction techniques, including recent developments in quantitative MRI. The second day was focused on developments and applications of new contrast agents. Multinuclear and spectroscopic acquisitions as well as functional MRI were presented during the third day of the conference. We would like to highlight the main developments presented at the conference and introduce the prominent speakers. The keynote speaker of the conference, Dr. Vasily Yarnykh (University of Washington, Seattle, USA), presented a recently developed MRI method, macromolecular proton fraction (MPF) mapping, as a unique tool both for modifying image contrast and for quantifying the myelin content in neural tissues. Professor Yury Pirogov (Lomonosov Moscow State University) described the development of new fluorocarbon compounds and their applications in biomedicine. Drs. 
Julia Velikina and Alexey Samsonov (University of Wisconsin-Madison, USA) demonstrated new image reconstruction methods for accelerated quantitative parameter mapping and magnetic resonance angiography. Finally, we would like to thank the scientific committee, the local organizing committee and the National Research Tomsk State University for giving an opportunity to share scientific ideas and new developments at the conference and the Russian Science Foundation (project № 14-45-00040) for financial support.
Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0
NASA Technical Reports Server (NTRS)
Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan
2010-01-01
The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included basic shapes such as a flat plate, cylinder, dished head, and sphere. The data came from several KSC tests already available in the public literature, as well as data from NIST and other highly reputable sources. More recently, the tool has been updated with more test data from the Cryogenics Test Laboratory, and a tank shape was added. Additionally, the tool was converted to FORTRAN 95 to allow for easier distribution of the material and tool. This document reviews the user instructions for the operation of this system.
Current advances in research and clinical applications of PLGA-based nanotechnology
Lü, Jian-Ming; Wang, Xinwen; Marin-Muller, Christian; Wang, Hao; Lin, Peter H; Yao, Qizhi; Chen, Changyi
2009-01-01
Co-polymer poly(lactic-co-glycolic acid) (PLGA) nanotechnology has been developed for many years and has been approved by the US FDA for use in drug delivery, diagnostics and other applications of clinical and basic science research, including cardiovascular disease, cancer, vaccines and tissue engineering. This article presents the more recent successes in applying PLGA-based nanotechnologies and tools in these medicine-related applications. It focuses on the possible mechanisms, diagnostic uses and treatment effects of PLGA preparations and devices. This updated information will benefit both new and established research scientists and clinical physicians who are interested in the development and application of PLGA nanotechnology as a new therapeutic and diagnostic strategy for many diseases. PMID:19435455
Methodological challenges collecting parent phone-call healthcare utilization data.
Moreau, Paula; Crawford, Sybil; Sullivan-Bolyai, Susan
2016-02-01
Recommendations by the National Institute of Nursing Research and other groups have strongly encouraged nurses to pay greater attention to cost-effectiveness analysis when conducting research. Given the increasing prominence of translational science and comparative effectiveness research, cost-effectiveness analysis has become a basic tool for determining intervention value in research. Tracking phone-call communication (number of calls and context) with cross-checks between parents and healthcare providers is an example of this type of healthcare utilization data collection. This article identifies some methodological challenges that have emerged in the process of collecting this type of data in a randomized controlled trial: Parent education Through Simulation-Diabetes (PETS-D). We also describe ways in which those challenges have been addressed with comparison data results, and make recommendations for future research. Copyright © 2015 Elsevier Inc. All rights reserved.
TRI Fotonovela Slideshow - English
Presentation designed to introduce the basic concepts of the Toxics Release Inventory, including why TRI is an important resource for communities and which tool provides the easiest access to basic TRI data.
TRI Fotonovela Slideshow - Spanish
Presentation designed to introduce the basic concepts of the Toxics Release Inventory, including why TRI is an important resource for communities and which tool provides the easiest access to basic TRI data.
compomics-utilities: an open-source Java library for computational proteomics.
Barsnes, Harald; Vaudel, Marc; Colaert, Niklaas; Helsens, Kenny; Sickmann, Albert; Berven, Frode S; Martens, Lennart
2011-03-08
The growing interest in the field of proteomics has increased the demand for software tools and applications that process and analyze the resulting data. Even though the purposes of these tools can vary significantly, they usually share a basic set of features, including the handling of protein and peptide sequences, the visualization of (and interaction with) spectra and chromatograms, and the parsing of results from various proteomics search engines. Developers typically spend considerable time and effort implementing these support structures, which detracts from working on the novel aspects of their tool. In order to simplify the development of proteomics tools, we have implemented an open-source support library for computational proteomics, called compomics-utilities. The library contains a broad set of features required for reading, parsing, and analyzing proteomics data. compomics-utilities is already used by a long list of existing software, ensuring library stability and continued support and development. As a user-friendly, well-documented and open-source library, compomics-utilities greatly simplifies the implementation of the basic features needed in most proteomics tools. Implemented in 100% Java, compomics-utilities is fully portable across platforms and architectures. Our library thus allows developers to focus on the novel aspects of their tools, rather than on the basic functions, which contributes substantially to faster development and better tools for proteomics.
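To make the "basic set of features" concrete, the sketch below computes a peptide's monoisotopic mass from its sequence, typical of the low-level bookkeeping such a support library handles. It is a generic Python illustration, not the compomics-utilities (Java) API.

```python
# Monoisotopic amino acid residue masses in daltons (standard values);
# a generic illustration of peptide-sequence handling, not compomics-utilities.
RESIDUE_MASS = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "T": 101.04768, "C": 103.00919, "L": 113.08406,
    "I": 113.08406, "N": 114.04293, "D": 115.02694, "Q": 128.05858,
    "K": 128.09496, "E": 129.04259, "M": 131.04049, "H": 137.05891,
    "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931,
}
WATER = 18.01056  # mass of H2O added for the peptide's terminal groups

def peptide_mass(sequence):
    """Monoisotopic mass of an unmodified peptide, in daltons."""
    return sum(RESIDUE_MASS[aa] for aa in sequence.upper()) + WATER

print(round(peptide_mass("PEPTIDE"), 4))  # 799.3599
```

A real support library would also handle modifications, isotopes, and charge states; this shows only the core bookkeeping.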
Basic Education and Policy Support Activity: Tools and Publications.
ERIC Educational Resources Information Center
Creative Associates International, Inc., Washington, DC.
The Basic Education and Policy Support (BEPS) Activity is a United States Agency for International Development (USAID)-sponsored, multi-year initiative designed to further improve the quality of, effectiveness of, and access to formal and nonformal basic education. This catalog is one element of the BEPS information dissemination process. The…
Industrial Electronics. Performance Objectives. Basic Course.
ERIC Educational Resources Information Center
Tiffany, Earl
Several intermediate performance objectives and corresponding criterion measures are listed for each of 30 terminal objectives for a two-semester (2 hours daily) high school course in basic industrial electronics. The objectives cover instruction in basic electricity including AC-DC theory, magnetism, electrical safety, care and use of hand tools,…
Basic Skills Support in Business and Industry.
ERIC Educational Resources Information Center
Byatt, Janet; Davies, Karen
This guide is designed as a tool for English and Welsh businesses wanting to provide basic skills training for their employees. It provides practical solutions to the problems of identifying employees' basic skills needs and selecting the best model of training delivery to address identified training needs. The introductory section discusses basic…
Easy Ergonomics: A Guide to Selecting Non-Powered Hand Tools
... identifying the presence or absence of basic ergonomic design features (Dababneh et al.*). The right tool will ... Cal/OSHA). Both agencies recognize the importance of design and selection of hand tools in strategies to ...
Introduction to TRI for Communities
Presentation designed to introduce the basic concepts of the Toxics Release Inventory, including why TRI is an important resource for communities and which tool provides the easiest access to basic TRI data.
[The significance of Karl Landsteiner's works for syphilis research].
Luger, A
1991-01-01
On January 7th, 1905, more than five months before the detection of T. pallidum, Karl Landsteiner began his work on syphilis research together with notable members of the Viennese School of Medicine, namely Ernest Finger, Rudolf Müller, Viktor Mucha, Otto Pötzl and others. Extensive animal experiments led to the formulation of the Finger-Landsteiner law and provided the basic facts for the Jadassohn-Lewandowsky law. Attempts at active or passive immunization were unsuccessful and, indeed, were still a failure in 1990, even after the implementation of the latest tools of modern research, including gene technology. Dark-field microscopy was introduced for the detection of T. pallidum by Landsteiner and Mucha. These authors noted that serum from syphilitic patients inhibited the movements of T. pallidum and, thus, observed the basic principle underlying the T. pallidum immobilization test (TPI, the Nelson-Mayer test). Finally, Landsteiner, Müller and Pötzl discovered that it was not an antibody specific to T. pallidum that reacted in the Wassermann reaction, but "autotoxic" substances, which they called reagins. During the 1970s and 1980s it was discovered that these reagins are autoantibodies directed against parts of the inner envelope of the mitochondria.
Govindaraj, Mahalingam
2015-01-01
The number of sequenced crop genomes and associated genomic resources is growing rapidly with the advent of inexpensive next-generation sequencing methods. Databases have become an integral part of all aspects of science research, including basic and applied plant and animal sciences. The importance of databases keeps increasing as the volume of datasets from direct and indirect genomics, as well as other omics approaches, keeps expanding. The databases and associated web portals provide, at a minimum, a uniform set of tools and automated analysis across a wide range of crop plant genomes. This paper reviews some basic terms and considerations in dealing with crop plant databases in the advancing genomics era. The utilization of databases for variation analysis with other comparative genomics tools, and data interpretation platforms, is well described. The major focus of this review is to provide knowledge on platforms and databases for genome-based investigations of agriculturally important crop plants. The utilization of these databases in applied crop improvement programs is yet to be widely achieved; otherwise, the end for sequencing is not far away. PMID:25874133
Prat, Maria; Oltolina, Francesca; Basilico, Cristina
2014-01-01
Monoclonal antibodies can be seen as valuable tools for many aspects of basic as well as applied sciences. In the case of MET/HGFR, they allowed the identification of truncated isoforms of the receptor, as well as the dissection of different epitopes, establishing structure–function relationships. Antibodies directed against MET extracellular domain were found to be full or partial receptor agonists or antagonists. The agonists can mimic the effects of the different isoforms of the natural ligand, but with the advantage of being more stable than the latter. Thus, some agonist antibodies promote all the biological responses triggered by MET activation, including motility, proliferation, morphogenesis, and protection from apoptosis, while others can induce only a migratory response. On the other hand, antagonists can inhibit MET-driven biological functions either by competing with the ligand or by removing the receptor from the cell surface. Since MET/HGFR is often over-expressed and/or aberrantly activated in tumors, monoclonal antibodies can be used as probes for MET detection or as “bullets” to target MET-expressing tumor cells, thus pointing to their use in diagnosis and therapy. PMID:28548076
NASA Astrophysics Data System (ADS)
Kogan, Lori R.; Dowers, Kristy L.; Cerda, Jacey R.; Schoenfeld-Tacher, Regina M.; Stewart, Sherry M.
2014-12-01
Veterinary schools, like many professional health programs, face a myriad of evolving challenges in delivering their professional curricula, including expansion of class size, the cost of maintaining expensive laboratories, and increased demands on veterinary educators to use curricular time efficiently and creatively. Additionally, exponential expansion of the knowledge base through ongoing biomedical research, educational goals to increase student engagement and clinical reasoning earlier in the curriculum, and students' desire to access course materials and enhance their educational experience through the use of technology all support the need to reassess traditional microscope laboratories within Professional Veterinary Medical (PVM) educational programs. While there is clear justification for teaching veterinary students how to use a microscope for clinical evaluation of cytological preparations (i.e., complete blood count, urinalysis, fecal analysis, fine-needle aspirates, etc.), virtual microscopy may be a viable alternative to light microscopy for teaching and learning fundamental histological concepts. This article discusses the results of a survey assessing PVM students' perceptions of using a virtual microscope to learn basic histology/microscopic anatomy, and the implications of these results for using virtual microscopy as a pedagogical tool in teaching basic histology to first-year PVM students.
Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia
2016-01-01
Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis. PMID:27792763
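As an example of the kind of calculation PopSc performs directly on intermediate metadata such as allele frequencies, the sketch below computes expected heterozygosity (gene diversity) at a locus. The function is illustrative only and does not reflect PopSc's actual interface.

```python
# Expected heterozygosity He = 1 - sum(p_i^2), a basic genetic-diversity
# statistic computed from allele frequencies alone. Illustrative sketch,
# not the PopSc API.

def expected_heterozygosity(freqs):
    """He for one locus, given allele frequencies that sum to 1."""
    if abs(sum(freqs) - 1.0) > 1e-6:
        raise ValueError("allele frequencies must sum to 1")
    return 1.0 - sum(p * p for p in freqs)

print(expected_heterozygosity([0.5, 0.5]))  # 0.5 for a balanced biallelic locus
```

Note that the input is already-summarized metadata, mirroring PopSc's design choice of skipping raw sequence or genotype processing.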
NASA Astrophysics Data System (ADS)
Gama Goicochea, A.; Balderas Altamirano, M. A.; Lopez-Esparza, R.; Waldo-Mendoza, Miguel A.; Perez, E.
2015-09-01
The connection between fundamental interactions acting in molecules in a fluid and macroscopically measured properties, such as the viscosity between colloidal particles coated with polymers, is studied here. The role that hydrodynamic and Brownian forces play in colloidal dispersions is also discussed. It is argued that many-body systems in which all these interactions take place can be accurately solved using computational simulation tools. One of those modern tools is the technique known as dissipative particle dynamics, which incorporates Brownian and hydrodynamic forces, as well as basic conservative interactions. A case study is reported, as an example of the applications of this technique, which consists of the prediction of the viscosity and friction between two opposing parallel surfaces covered with polymer chains, under the influence of a steady flow. This work is intended to serve as an introduction to the subject of colloidal dispersions and computer simulations, for final-year undergraduate students and beginning graduate students who are interested in beginning research in soft matter systems. To that end, a computational code is included that students can use right away to study complex fluids in equilibrium.
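The three force contributions mentioned above (conservative, dissipative, and random/Brownian) can be sketched in the standard Groot-Warren form of dissipative particle dynamics. The parameter values below (a = 25, γ = 4.5, k_BT = 1 in reduced units) are conventional defaults in the DPD literature, not values taken from this article.

```python
import numpy as np

# Minimal sketch of the pairwise DPD force (Groot-Warren form):
# conservative soft repulsion, dissipative friction, and random kicks
# linked by the fluctuation-dissipation relation sigma^2 = 2*gamma*kT.

def dpd_pair_force(ri, rj, vi, vj, a=25.0, gamma=4.5, kT=1.0,
                   rc=1.0, dt=0.01, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    rij = ri - rj
    r = np.linalg.norm(rij)
    if r >= rc or r == 0.0:          # forces vanish beyond the cutoff
        return np.zeros(3)
    e = rij / r                       # unit vector between the pair
    w = 1.0 - r / rc                  # weight function w(r)
    sigma = np.sqrt(2.0 * gamma * kT)
    f_c = a * w * e                                   # conservative
    f_d = -gamma * w**2 * np.dot(e, vi - vj) * e      # dissipative
    f_r = sigma * w * rng.standard_normal() * e / np.sqrt(dt)  # random
    return f_c + f_d + f_r
```

With kT set to zero and equal velocities, only the conservative term survives, which is a convenient sanity check when building such a simulation.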
Autism research funding allocation: can economics tell us if we have got it right?
Zwicker, Jennifer D; Emery, J C Herbert
2014-12-01
There is a concern that the allocation of autism spectrum disorder (ASD) research funding may be misallocating resources, overemphasizing basic science at the expense of translational and clinical research. Anthony Bailey has proposed that an economic evaluation of autism research funding allocations could be beneficial for funding agencies by identifying under- or overfunded areas of research. In response to Bailey, we illustrate why economics cannot provide an objective, technical solution for identifying the "best" allocation of research resources. Economic evaluation has its greatest power as a late-stage research tool for interventions with identified objectives, outcomes, and data. This is not the case for evaluating whether research areas are over- or underfunded. Without an understanding of how research funding influences the likelihood and value of a discovery, or without a statement of the societal objectives for ASD research and level of risk aversion, economic analysis cannot provide a useful normative evaluation of ASD research. © 2014 International Society for Autism Research, Wiley Periodicals, Inc.
32 CFR Appendix A to Part 272 - Principles for the Conduct and Support of Basic Research
Code of Federal Regulations, 2010 CFR
2010-07-01
... Research A Appendix A to Part 272 National Defense Department of Defense (Continued) OFFICE OF THE SECRETARY OF DEFENSE (CONTINUED) MISCELLANEOUS ADMINISTRATION AND SUPPORT OF BASIC RESEARCH BY THE... Research 1. Basic research is an investment. The DoD Components are to view and manage basic research...
NASA Astrophysics Data System (ADS)
Gruska, Jozef
2012-06-01
One of the most basic tasks in quantum information processing, communication and security (QIPCC) research, theoretically deep and practically important, is to find bounds on how important inherently quantum resources really are for speeding up computations. This area of research is producing a variety of results that imply, often in very unexpected and counter-intuitive ways, that: (a) surprisingly large classes of quantum circuits and algorithms can be efficiently simulated on classical computers; (b) the border line between quantum processes that can and cannot be efficiently simulated on classical computers is often surprisingly thin; and (c) the addition of a seemingly very simple resource or tool often enormously increases the power of available quantum tools. These discoveries have also shed new light on our understanding of quantum phenomena and quantum physics, and on the potential of its inherently quantum and often mysterious-looking phenomena. The paper motivates and surveys research and its outcomes in the area of de-quantisation; in particular, it presents various approaches, and their outcomes, concerning efficient classical simulations of various families of quantum circuits and algorithms. To motivate this area of research, some outcomes in the area of de-randomization of classical randomized computations are also presented.
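For context on why efficient classical simulation of special circuit classes matters, the sketch below simulates a quantum circuit by brute force: the state vector has 2**n amplitudes, so memory and time grow exponentially with the number of qubits n. This is a generic illustration, not tied to any specific de-quantisation result surveyed in the paper.

```python
import numpy as np

# Brute-force statevector simulation: gates become 2**n x 2**n matrices.
# De-quantisation research identifies circuit classes that avoid this blowup.

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Hadamard gate

def apply_gate(state, gate, target, n):
    """Apply a single-qubit gate to qubit `target` of an n-qubit state."""
    op = np.array([[1.0]])
    for q in range(n):  # build the full operator as a Kronecker product
        op = np.kron(op, gate if q == target else np.eye(2))
    return op @ state

n = 2
state = np.zeros(2**n)
state[0] = 1.0                        # start in |00>
state = apply_gate(state, H, 0, n)    # superpose qubit 0
probs = np.abs(state)**2              # 0.5 each on |00> and |10>
```

Even this tiny example makes the scaling obvious: each added qubit doubles the state vector and quadruples every gate matrix.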
NASA Technical Reports Server (NTRS)
Voellmer, George M.
1992-01-01
Mechanism enables robot to change tools on end of arm. Actuated by motion of robot: requires no additional electrical or pneumatic energy to make or break connection between tool and wrist at end of arm. Includes three basic subassemblies: wrist interface plate attached to robot arm at wrist, tool interface plate attached to tool, and holster. Separate tool interface plate and holster provided for each tool robot uses.
Amin, Waqas; Parwani, Anil V; Schmandt, Linda; Mohanty, Sambit K; Farhat, Ghada; Pople, Andrew K; Winters, Sharon B; Whelan, Nancy B; Schneider, Althea M; Milnes, John T; Valdivieso, Federico A; Feldman, Michael; Pass, Harvey I; Dhir, Rajiv; Melamed, Jonathan; Becich, Michael J
2008-08-13
Advances in translational research have led to the need for well-characterized biospecimens for research. The National Mesothelioma Virtual Bank is an initiative which collects annotated datasets relevant to human mesothelioma in order to develop an enterprising biospecimen resource that fulfills researchers' needs. The National Mesothelioma Virtual Bank architecture is based on three major components: (a) common data elements (based on the College of American Pathologists protocol and North American Association of Central Cancer Registries standards), (b) clinical and epidemiologic data annotation, and (c) data query tools. These tools work interoperably to standardize the entire process of annotation. The National Mesothelioma Virtual Bank tool is based upon the caTISSUE Clinical Annotation Engine, developed by the University of Pittsburgh in cooperation with the Cancer Biomedical Informatics Grid (caBIG, see http://cabig.nci.nih.gov). This application provides a web-based system for annotating, importing and searching mesothelioma cases. The underlying information model is constructed utilizing Unified Modeling Language class diagrams, hierarchical relationships and Enterprise Architect software. The database provides researchers real-time access to richly annotated specimens and integral information related to mesothelioma. Data disclosure is tightly regulated according to each user's authorization and to the reviews of the participating institution's local Institutional Review Board and regulatory committees. The National Mesothelioma Virtual Bank currently has over 600 annotated cases available for researchers, including paraffin-embedded tissues, tissue microarrays, serum and genomic DNA. The National Mesothelioma Virtual Bank is a virtual biospecimen registry with robust translational biomedical informatics support to facilitate basic science, clinical, and translational research. 
Furthermore, it protects patient privacy by disclosing only de-identified datasets to assure that biospecimens can be made accessible to researchers.
Sentiment Analysis of Health Care Tweets: Review of the Methods Used.
Gohil, Sunir; Vuik, Sabine; Darzi, Ara
2018-04-23
Twitter is a microblogging service where users can send and read short 140-character messages called "tweets." Many unstructured, free-text tweets relating to health care are shared on Twitter, which is becoming a popular platform for health care research. Sentiment is a metric commonly used to investigate the positive or negative opinion within these messages. Exploring the methods used for sentiment analysis in Twitter health care research may allow us to better understand the options available for future research in this growing field. The first objective of this study was to understand which tools are available for sentiment analysis in Twitter health care research, by reviewing existing studies in this area and the methods they used. The second objective was to determine which method would work best in health care settings, by analyzing how the methods were used to answer specific health care questions, how the tools were produced, and how their accuracy was analyzed. A review was conducted of the literature pertaining to Twitter and health care research that used a quantitative method of sentiment analysis for the free-text messages (tweets). The study compared the types of tools used in each case and examined methods for tool production, tool training, and analysis of accuracy. A total of 12 papers studying the quantitative measurement of sentiment in the health care setting were found. More than half of these studies produced tools specifically for their research, 4 used open-source tools available freely, and 2 used commercially available software. Moreover, 4 of the 12 tools were trained using a smaller sample of the study's final data; on average, the sentiment method was trained against 0.45% (2816/627,024) of the total sample data. Only 1 of the 12 papers commented on the accuracy analysis of the tool used. Multiple methods are used for sentiment analysis of tweets in the health care setting. These range from self-produced basic categorizations to more complex and expensive commercial software. The open-source and commercial methods were developed on product reviews and generic social media messages, and none of them has been extensively tested against a corpus of health care messages to check accuracy. This study suggests that an accurate, tested tool for sentiment analysis of tweets is needed, trained first on a health care-specific corpus of manually annotated tweets. ©Sunir Gohil, Sabine Vuik, Ara Darzi. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 23.04.2018.
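A minimal sketch of the "self-produced basic categorization" end of the spectrum: a lexicon-based polarity score over tweet tokens. The word lists below are invented for illustration and are not a validated health care lexicon.

```python
# Toy lexicon-based sentiment categorization, of the kind many of the
# reviewed studies built themselves. Word lists are illustrative only.
POSITIVE = {"great", "relief", "recovered", "thankful", "better"}
NEGATIVE = {"pain", "worse", "scared", "waiting", "tired"}

def tweet_sentiment(text):
    """Classify a tweet as positive, negative, or neutral by word counts."""
    words = [w.strip(".,!?#@") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(tweet_sentiment("Feeling so much better after surgery"))  # positive
```

This is exactly the kind of method whose accuracy would need testing against a manually annotated, health care-specific corpus, as the review concludes.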
Application of hazard and effects management tools and links to the HSE case
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gower-Jones, A.D.; Graaf, G.C. van der; Milne, D.J.
1996-12-31
Many tools and techniques are promoted for the analysis and management of hazards and their effects. The proliferation of these tools in the last 5-6 years has resulted in an overload on designers, engineers and operators of E&P activities and assets, to the extent that they are unsure what to do, when, and how it all fits together. This paper starts from the basic E&P business (a business model) and the basic structure of any accidental event (the "bow tie"), and maps the tools and techniques for analyzing hazards and effects onto both asset and activity HSE management. The links to developing an HSE case within the HSE-MS for assets and activities are given.
Modular magnetic tweezers for single-molecule characterizations of helicases.
Kemmerich, Felix E; Kasaciunaite, Kristina; Seidel, Ralf
2016-10-01
Magnetic tweezers provide a versatile toolkit supporting the mechanistic investigation of helicases. In the present article, we show that custom magnetic tweezers setups are straightforward to construct and can easily be extended to provide adaptable platforms, capable of addressing a multitude of enquiries regarding the functions of these fascinating molecular machines. We first address the fundamental components of a basic magnetic tweezers scheme and review some previous results to demonstrate the versatility of this instrument. We then elaborate on several extensions to the basic magnetic tweezers scheme, and demonstrate their applications with data from ongoing research. As our methodological overview illustrates, magnetic tweezers are an extremely useful tool for the characterization of helicases and a custom built instrument can be specifically tailored to suit the experimenter's needs. Copyright © 2016 Elsevier Inc. All rights reserved.
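In a basic magnetic tweezers scheme, the stretching force on a tethered bead is commonly estimated from the equipartition theorem, F = kB·T·L / ⟨δx²⟩, where L is the tether extension and ⟨δx²⟩ the variance of the bead's transverse fluctuations. A minimal sketch of that calculation; the position trace and extension are illustrative numbers, not data from the paper:

```python
import statistics

KB_T = 4.11e-21  # thermal energy at ~25 degrees C, in joules (kB * 298 K)

def tweezer_force(extension_m: float, x_positions_m: list[float]) -> float:
    """Estimate the stretching force (N) on a tethered bead from its
    transverse fluctuations via F = kB*T*L / <dx^2> (equipartition)."""
    var = statistics.pvariance(x_positions_m)  # <dx^2> about the mean
    return KB_T * extension_m / var

# Illustrative trace: 1 um tether, ~20 nm transverse fluctuations
positions = [0.0, 20e-9, -20e-9, 10e-9, -10e-9]
force = tweezer_force(1e-6, positions)
print(f"{force * 1e12:.2f} pN")  # 20.55 pN
```

Real instruments apply corrections for camera blur and finite acquisition bandwidth; this sketch shows only the underlying relation.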
Batzir, Nurit Assia; Tovin, Adi; Hendel, Ayal
2017-06-01
Genome editing with engineered nucleases is a rapidly growing field thanks to transformative technologies that allow researchers to precisely alter genomes for numerous applications including basic research, biotechnology, and human gene therapy. The genome editing process relies on creating a site-specific DNA double-strand break (DSB) by engineered nucleases and then allowing the cell's repair machinery to repair the break such that precise changes are made to the DNA sequence. The recent development of CRISPR-Cas systems as easily accessible and programmable tools for genome editing accelerates the progress towards using genome editing as a new approach to human therapeutics. Here we review how genome editing using engineered nucleases works and how using different genome editing outcomes can be used as a tool set for treating human diseases. We then review the major challenges of therapeutic genome editing and we discuss how its potential enhancement through CRISPR guide RNA and Cas9 protein modifications could resolve some of these challenges. Copyright© of YS Medical Media ltd.
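For the CRISPR-Cas systems mentioned above, the programmability comes from choosing a guide RNA whose target protospacer sits next to a protospacer adjacent motif (PAM; NGG for SpCas9). A toy sketch of scanning a sequence for candidate sites; it considers only the forward strand and ignores off-target scoring, and the input sequence is invented:

```python
def find_spcas9_sites(seq: str, protospacer_len: int = 20) -> list[tuple[int, str]]:
    """Return (position, protospacer) pairs where a 20-nt protospacer
    is immediately followed by an NGG PAM on the forward strand."""
    seq = seq.upper()
    hits = []
    for i in range(len(seq) - protospacer_len - 2):
        pam = seq[i + protospacer_len : i + protospacer_len + 3]
        if pam[1:] == "GG":  # NGG: any base, then GG
            hits.append((i, seq[i : i + protospacer_len]))
    return hits

dna = "ATGCATGCATGCATGCATGCTGGAAATTT"
for pos, proto in find_spcas9_sites(dna):
    print(pos, proto)  # 0 ATGCATGCATGCATGCATGC
```

Practical guide design tools additionally scan the reverse complement and rank candidates by predicted specificity and efficiency.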
Nicotine uses and abuses: from brain probe to public health menace.
Pomerleau, O F; Pomerleau, C S
1989-01-01
There has been a notable lack of dialogue between neuroscientists, who use nicotine in their work as they would any other pharmacological tool, and public policy and health researchers, who view nicotine dependence with increasing dismay and see the continued use of tobacco products as a modern day scourge. This special journal issue attempts to foster communication among nicotine researchers working along the continuum from basic to applied science. An additional objective is to convey a sense for the special problems and opportunities in the study of nicotine and tobacco use that may be of general interest to those concerned with substance abuse. The articles that follow explore two themes, (1) nicotine as a tool to probe neural activity, and (2) tobacco use as a health hazard and societal problem, by examining nicotine from pharmacochemical, biobehavioral, and econo-social perspectives. The rationale for the integration is that there may be benefits from viewing nicotine in a context broader than those dictated by custom and technological specialization.
Fang, Bin; Hoffman, Melissa A.; Mirza, Abu-Sayeef; Mishall, Katie M.; Li, Jiannong; Peterman, Scott M.; Smalley, Keiran S. M.; Shain, Kenneth H.; Weinberger, Paul M.; Wu, Jie; Rix, Uwe; Haura, Eric B.; Koomen, John M.
2015-01-01
Cancer biologists and other healthcare researchers face an increasing challenge in addressing the molecular complexity of disease. Biomarker measurement tools and techniques now contribute to both basic science and translational research. In particular, liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM) for multiplexed measurements of protein biomarkers has emerged as a versatile tool for systems biology. Assays can be developed for specific peptides that report on protein expression, mutation, or post-translational modification, allowing discovery proteomics data to be rapidly translated into multiplexed quantitative approaches. Complementary advances in affinity purification enrich classes of enzymes or peptides representing post-translationally modified or chemically labeled substrates. Here, we illustrate the process for the relative quantification of hundreds of peptides in a single LC-MRM experiment. Desthiobiotinylated peptides produced by activity-based protein profiling (ABPP) using ATP probes and tyrosine-phosphorylated peptides are used as examples. These targeted quantification panels can be applied to further understand the biology of human disease. PMID:25782629
Comparison of software packages for detecting differential expression in RNA-seq studies
Seyednasrollah, Fatemeh; Laiho, Asta; Elo, Laura L
2015-01-01
RNA-sequencing (RNA-seq) has rapidly become a popular tool to characterize transcriptomes. A fundamental research problem in many RNA-seq studies is the identification of reliable molecular markers that show differential expression between distinct sample groups. Together with the growing popularity of RNA-seq, a number of data analysis methods and pipelines have already been developed for this task. Currently, however, there is no clear consensus about the best practices yet, which makes the choice of an appropriate method a daunting task especially for a basic user without a strong statistical or computational background. To assist the choice, we perform here a systematic comparison of eight widely used software packages and pipelines for detecting differential expression between sample groups in a practical research setting and provide general guidelines for choosing a robust pipeline. In general, our results demonstrate how the data analysis tool utilized can markedly affect the outcome of the data analysis, highlighting the importance of this choice. PMID:24300110
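At their core, the pipelines compared in this study test, gene by gene, whether normalized expression differs between sample groups. A stripped-down illustration using a per-gene Welch t-statistic on log-scale values; real packages such as edgeR or DESeq2 instead model raw counts with negative binomial distributions, and the expression values below are invented:

```python
import math

def welch_t(xs: list[float], ys: list[float]) -> float:
    """Welch's t statistic for two independent samples
    (unequal variances allowed)."""
    nx, ny = len(xs), len(ys)
    mx, my = sum(xs) / nx, sum(ys) / ny
    vx = sum((x - mx) ** 2 for x in xs) / (nx - 1)
    vy = sum((y - my) ** 2 for y in ys) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

# Illustrative log2-transformed expression for one gene in two groups
control = [5.1, 4.9, 5.0, 5.2]
treated = [7.8, 8.1, 7.9, 8.0]
t = welch_t(treated, control)
print(f"t = {t:.1f}")  # large |t| suggests differential expression
```

A full analysis would repeat this per gene and correct the resulting p-values for multiple testing; the choice of model is exactly where the compared packages diverge.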
Time and the rivers flowing: Fluvial geomorphology since 1960
NASA Astrophysics Data System (ADS)
Wohl, Ellen
2014-07-01
Fluvial geomorphology has been the largest single subdiscipline within geomorphology for many decades. Fluvial geomorphic expertise is integral to understanding and managing rivers and to developing strategies for sustainable development. This paper provides an overview of some of the significant advances in fluvial geomorphology between 1960 and 2010 with respect to: conceptual models; fluvial features and environments being studied; tools used by fluvial geomorphologists; geomorphic specialty groups within professional societies; journals in which fluvial geomorphic research is published; and textbooks of fluvial geomorphology. During this half century, fluvial geomorphology broadened considerably in scope, from a focus primarily on physical principles underlying process and form in lower gradient channels with limited grain size range, to a more integrative view of rivers as ecosystems with nonlinear behavior and great diversity of gradient, substrate composition, and grain size. The array of tools for making basic observations, analyzing data, and disseminating research results also expanded considerably during this period, as did the diversity of the fluvial geomorphic community.
Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel GM
2006-01-01
Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. 
Conclusion The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge via http://www.mix-for-meta-analysis.info or http://sourceforge.net/projects/meta-analysis. PMID:17038197
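The core computation behind MIX and the reference packages it was validated against is an inverse-variance weighted pooling of study estimates. A minimal fixed-effect sketch; the study estimates and variances below are hypothetical, not taken from the eight validation meta-analyses:

```python
import math

def fixed_effect(estimates: list[float], variances: list[float]) -> tuple[float, float]:
    """Fixed-effect meta-analysis: inverse-variance weighted pooled
    estimate and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Three hypothetical studies: log odds ratios with their variances
log_ors = [0.20, 0.35, 0.10]
variances = [0.04, 0.09, 0.02]
pooled, se = fixed_effect(log_ors, variances)
print(f"pooled = {pooled:.3f}, 95% CI half-width = {1.96 * se:.3f}")
```

Random-effects models add a between-study variance component to each weight; the arithmetic otherwise follows the same pattern, which is why a spreadsheet platform like Excel suffices for programs such as MIX.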
Visual Basic Applications to Physics Teaching
ERIC Educational Resources Information Center
Chitu, Catalin; Inpuscatu, Razvan Constantin; Viziru, Marilena
2011-01-01
Derived from the BASIC language, VB (Visual Basic) is a programming language focused on the visual interface component. With graphics and functional components implemented, the programmer is able to combine and use these components to achieve the desired application in a relatively short time. VB is a useful tool in physics teaching by creating…
Learning Genetics with Paper Pets
ERIC Educational Resources Information Center
Finnerty, Valerie Raunig
2006-01-01
By the end of the eighth grade, students are expected to have a basic understanding of the mechanism of genetic inheritance. However, these concepts can be difficult to teach. In this article, the author introduces a new learning tool that helps facilitate student learning of, and enthusiasm for, the basic concepts of genetic inheritance. This…
Clinical Correlations as a Tool in Basic Science Medical Education
Klement, Brenda J.; Paulsen, Douglas F.; Wineski, Lawrence E.
2016-01-01
Clinical correlations are tools to assist students in associating basic science concepts with a medical application or disease. There are many forms of clinical correlations and many ways to use them in the classroom. Five types of clinical correlations that may be embedded within basic science courses have been identified and described. (1) Correlated examples consist of superficial clinical information or stories accompanying basic science concepts to make the information more interesting and relevant. (2) Interactive learning and demonstrations provide hands-on experiences or the demonstration of a clinical topic. (3) Specialized workshops have an application-based focus, are more specialized than typical laboratory sessions, and range in complexity from basic to advanced. (4) Small-group activities require groups of students, guided by faculty, to solve simple problems that relate basic science information to clinical topics. (5) Course-centered problem solving is a more advanced correlation activity than the others and focuses on recognition and treatment of clinical problems to promote clinical reasoning skills. Diverse teaching activities are used in basic science medical education, and those that include clinical relevance promote interest, communication, and collaboration, enhance knowledge retention, and help develop clinical reasoning skills. PMID:29349328
Teaching Basic Business: An Entrepreneurial Perspective.
ERIC Educational Resources Information Center
Smith, Marsha O.
2003-01-01
Suggests that by incorporating entrepreneurship into the basic business curriculum now, business educators will better prepare students for a changing environment. Offers the business plan as a tool for integrating entrepreneurship into the curriculum. (SK)
TRI Fotonovela (Latino/Hispanic novella-style introduction to TRI)
Presentation designed to introduce the basic concepts of the Toxics Release Inventory, including why TRI is an important resource for communities and which tool provides the easiest access to basic TRI data.
Is the use of sentient animals in basic research justifiable?
Greek, Ray; Greek, Jean
2010-01-01
Animals can be used in many ways in science and scientific research. Given that society values sentient animals and that basic research is not goal oriented, the question is raised: "Is the use of sentient animals in basic research justifiable?" We explore this in the context of funding issues, outcomes from basic research, and the position of society as a whole on using sentient animals in research that is not goal oriented. We conclude that the use of sentient animals in basic research cannot be justified in light of society's priorities. PMID:20825676
Participant comprehension of research for which they volunteer: a systematic review.
Montalvo, Wanda; Larson, Elaine
2014-11-01
Evidence indicates that research participants often do not fully understand the studies for which they have volunteered. The aim of this systematic review was to examine the relationship between the process of obtaining informed consent for research and participant comprehension and satisfaction with the research. Systematic review of published research on informed consent and participant comprehension of research for which they volunteer using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement as a guide. PubMed, Cumulative Index for Nursing and Allied Health Literature, Cochrane Central Register of Controlled Trials, and Cochrane Database of Systematic Reviews were used to search the literature for studies meeting the following inclusion criteria: (a) published between January 1, 2006, and December 31, 2013, (b) interventional or descriptive quantitative design, (c) published in a peer-reviewed journal, (d) written in English, and (e) assessed participant comprehension or satisfaction with the research process. Studies were assessed for quality using seven indicators: sampling method, use of controls or comparison groups, response rate, description of intervention, description of outcome, statistical method, and health literacy assessment. Of 176 studies identified, 27 met inclusion criteria: 13 (48%) were randomized interventional designs and 14 (52%) were descriptive. Three categories of studies included projects assessing (a) enhanced consent process or form, (b) multimedia methods, and (c) education to improve participant understanding. Most (78%) used investigator-developed tools to assess participant comprehension, did not assess participant health literacy (74%), or did not assess the readability level of the consent form (89%). Researchers found participants lacked basic understanding of research elements: randomization, placebo, risks, and therapeutic misconception.
Findings indicate (a) inconsistent assessment of participant reading or health literacy level, (b) measurement variation associated with use of nonstandardized tools, and (c) continued therapeutic misconception and lack of understanding among research participants of randomization, placebo, benefit, and risk. While the Agency for Healthcare and Quality and National Quality Forum have published informed consent and authorization toolkits, previously published validated tools are underutilized. Informed consent requires the assessment of health literacy, reading level, and comprehension of research participants using validated assessment tools and methods. © 2014 Sigma Theta Tau International.
Sustainability in care through an ethical practice model.
Nyholm, Linda; Salmela, Susanne; Nyström, Lisbet; Koskinen, Camilla
2018-03-01
While sustainability is a key concept in many different domains today, it has not yet been sufficiently emphasized in the healthcare sector. Earlier research shows that ethical values and evidence-based care models create sustainability in care practice. The aim of this study was to gain further understanding of the ethical values central to the realization of sustainability in care and to create an ethical practice model whereby these basic values can be made perceptible and active in care practice. Part of the ongoing "Ethical Sustainable Caring Cultures" research project, a hermeneutical application research design was employed in this study. Dialogues were used, where scientific researchers and co-researchers were given the opportunity to reflect on ethical values in relation to sustainability in care. An ethical practice model with ethos as its core was created from the results of the dialogues. In the model, ethos is encircled by the ethical values central to sustainability: dignity, responsibility, respect, invitation, and vows. The model can be used as a starting point for ethical conversations that support carers' reflections on the ethical issues seen in day-to-day care work and the work community, allowing ethical values to become visible throughout the entire care culture. It is intended as a tool whereby carers can more deeply understand an organization's common basic values and what they entail in regard to sustainability in care.
Energy Frontier Research Centers: Science for Our Nation's Energy Future, September 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
As world demand for energy rapidly expands, transforming the way energy is collected, stored, and used has become a defining challenge of the 21st century. At its heart, this challenge is a scientific one, inspiring the U.S. Department of Energy’s (DOE) Office of Basic Energy Sciences (BES) to establish the Energy Frontier Research Center (EFRC) program in 2009. The EFRCs represent a unique approach, bringing together creative, multidisciplinary scientific teams to perform energy-relevant basic research with a complexity beyond the scope of single-investigator projects. These centers take full advantage of powerful new tools for characterizing, understanding, modeling, and manipulating matter from atomic to macroscopic length scales. They also train the next-generation scientific workforce by attracting talented students and postdoctoral researchers interested in energy science. The EFRCs have collectively demonstrated the potential to substantially advance the scientific understanding underpinning transformational energy technologies. Both a BES Committee of Visitors and a Secretary of Energy Advisory Board Task Force have found the EFRC program to be highly successful in meeting its goals. The scientific output from the EFRCs is impressive, and many centers have reported that their results are already impacting both technology research and industry. This report on the EFRC program includes selected highlights from the initial 46 EFRCs and the current 36 EFRCs.
Chimpanzees create and modify probe tools functionally: A study with zoo-housed chimpanzees.
Hopper, Lydia M; Tennie, Claudio; Ross, Stephen R; Lonsdorf, Elizabeth V
2015-02-01
Chimpanzees (Pan troglodytes) use tools to probe for out-of-reach food, both in the wild and in captivity. Beyond gathering appropriately-sized materials to create tools, chimpanzees also perform secondary modifications in order to create an optimized tool. In this study, we recorded the behavior of a group of zoo-housed chimpanzees when presented with opportunities to use tools to probe for liquid foods in an artificial termite mound within their enclosure. Previous research with this group of chimpanzees has shown that they are proficient at gathering materials from within their environment in order to create tools to probe for the liquid food within the artificial mound. Extending beyond this basic question, we first asked whether they only made and modified probe tools when it was appropriate to do so (i.e. when the mound was baited with food). Second, by collecting continuous data on their behavior, we also asked whether the chimpanzees first (intentionally) modified their tools prior to probing for food or whether such modifications occurred after tool use, possibly as a by-product of chewing and eating the food from the tools. Following our predictions, we found that tool modification predicted tool use; the chimpanzees began using their tools within a short delay of creating and modifying them, and the chimpanzees performed more tool modifying behaviors when food was available than when they could not gain food through the use of probe tools. We also discuss our results in terms of the chimpanzees' acquisition of the skills, and their flexibility of tool use and learning. © 2014 Wiley Periodicals, Inc.
Genetics and Genomics in Oncology Nursing: What Does Every Nurse Need to Know?
Eggert, Julie
2017-03-01
In addition to the need for basic education about genetics/genomics, other approaches are suggested to include awareness campaigns, continuing education courses, policy review, and onsite clinical development. These alternative learning strategies encourage oncology nurses across the continuum of care, from the bedside/seatside to oncology nurse research, to integrate genomics into all levels of practice and research in the specialty of oncology nursing. All nurses are warriors in the fight against cancer. The goal of this article is to identify genomic information that oncology nurses, at all levels of care, need to know and use as tools in the war against cancer. Copyright © 2016 Elsevier Inc. All rights reserved.
Evaluating community and campus environmental public health programs.
Pettibone, Kristianna G; Parras, Juan; Croisant, Sharon Petronella; Drew, Christina H
2014-01-01
The National Institute of Environmental Health Sciences' (NIEHS) Partnerships for Environmental Public Health (PEPH) program created the Evaluation Metrics Manual as a tool to help grantees understand how to map out their programs using a logic model, and to identify measures for documenting their achievements in environmental public health research. This article provides an overview of the manual, describing how grantees and community partners contributed to the manual, and how the basic components of a logic model can be used to identify metrics. We illustrate how the approach can be implemented, using a real-world case study from the University of Texas Medical Branch, where researchers worked with community partners to develop a network to address environmental justice issues.
Synthetic biology expands chemical control of microorganisms.
Ford, Tyler J; Silver, Pamela A
2015-10-01
The tools of synthetic biology allow researchers to change the ways engineered organisms respond to chemical stimuli. Decades of basic biology research and new efforts in computational protein and RNA design have led to the development of small molecule sensors that can be used to alter organism function. These new functions leap beyond the natural propensities of the engineered organisms. They can range from simple fluorescence or growth reporting to pathogen killing, and can involve metabolic coordination among multiple cells or organisms. Herein, we discuss how synthetic biology alters microorganisms' responses to chemical stimuli resulting in the development of microbes as toxicity sensors, disease treatments, and chemical factories. Copyright © 2015 Elsevier Ltd. All rights reserved.
The plastid genomes of flowering plants.
Ruhlman, Tracey A; Jansen, Robert K
2014-01-01
The plastid genome (plastome) has proved a valuable source of data for evaluating evolutionary relationships among angiosperms. Through basic and applied approaches, plastid transformation technology offers the potential to understand and improve plant productivity, providing food, fiber, energy and medicines to meet the needs of a burgeoning global population. The growing genomic resources available to both phylogenetic and biotechnological investigations are allowing novel insights and expanding the scope of plastome research to encompass new species. In this chapter we present an overview of some of the seminal and contemporary research that has contributed to our current understanding of plastome evolution and attempt to highlight the relationship between evolutionary mechanisms and tools of plastid genetic engineering.
From Research to Flight: Thinking About Implementation While Performing Fundamental Research
NASA Technical Reports Server (NTRS)
Johnson, Les
2010-01-01
This slide presentation calls for a strategy to implement new technologies. Such a strategy would allow advanced space transportation technologies to mature for exploration beyond Earth orbit. It discusses the difference between technology push and technology pull, and reviews the three basic technology readiness levels (TRL). The presentation traces examples of technology development to flight application: the Space Shuttle Main Engine Advanced Health Management System and Friction Stir Welding technology (the auto-adjustable pin tool). Two technologies currently not in flight but being reviewed for potential use are cryogenic fluid management (CFM) and solar sail propulsion. There is also an attempt to explain why new technologies are so difficult to field.
Overview of recent DNA vaccine development for fish
Kurath, G.; ,
2005-01-01
Since the first description of DNA vaccines for fish in 1996, numerous studies of genetic immunisation against the rhabdovirus pathogens infectious haematopoietic necrosis virus (IHNV) and viral haemorrhagic septicaemia virus (VHSV) have established their potential as both highly efficacious biologicals and useful basic research tools. Single small doses of rhabdovirus DNA constructs provide extremely strong protection against severe viral challenge under a variety of conditions. DNA vaccines for several other important fish viruses, bacteria, and parasites are under investigation, but they have not yet shown high efficacy. Therefore, current research is focussed on mechanistic studies to understand the basis of protection, and on improvement of the nucleic acid vaccine applications against a wider range of fish pathogens.
psiTurk: An open-source framework for conducting replicable behavioral experiments online.
Gureckis, Todd M; Martin, Jay; McDonnell, John; Rich, Alexander S; Markant, Doug; Coenen, Anna; Halpern, David; Hamrick, Jessica B; Chan, Patricia
2016-09-01
Online data collection has begun to revolutionize the behavioral sciences. However, conducting carefully controlled behavioral experiments online introduces a number of new technical and scientific challenges. The project described in this paper, psiTurk, is an open-source platform which helps researchers develop experiment designs that can be conducted over the Internet. The tool primarily interfaces with Amazon's Mechanical Turk, a popular crowd-sourcing labor market. This paper describes the basic architecture of the system and introduces new users to the overall goals. psiTurk aims to reduce the technical hurdles for researchers developing online experiments while improving the transparency and collaborative nature of the behavioral sciences.
Medical Physics Panel Discussion
NASA Astrophysics Data System (ADS)
Guèye, Paul; Avery, Steven; Baird, Richard; Soares, Christopher; Amols, Howard; Tripuraneni, Prabhakar; Majewski, Stan; Weisenberger, Drew
2006-03-01
The panel discussion will explore opportunities and vistas in medical physics research and practice, medical imaging, teaching medical physics to undergraduates, and medical physics curricula as a recruiting tool for physics departments. Panel members consist of representatives from NSBP (Paul Guèye and Steven Avery), NIH/NIBIB (Richard Baird), NIST (Christopher Soares), AAPM (Howard Amols), ASTRO (Prabhakar Tripuraneni), and Jefferson Lab (Stan Majewski and Drew Weisenberger). Medical physicists are part of departments of radiation oncology at hospitals and medical centers. The field of medical physics includes radiation therapy physics, medical diagnostic and imaging physics, nuclear medicine physics, and medical radiation safety. It also ranges from basic research (at academic institutions, industry, and laboratories) to applications in clinical environments.
Expanding the Scope of Site-Specific Recombinases for Genetic and Metabolic Engineering
Gaj, Thomas; Sirk, Shannon J.; Barbas, Carlos F.
2014-01-01
Site-specific recombinases are tremendously valuable tools for basic research and genetic engineering. By promoting high-fidelity DNA modifications, site-specific recombination systems have empowered researchers with unprecedented control over diverse biological functions, enabling countless insights into cellular structure and function. The rigid target specificities of many site-specific recombinases, however, have limited their adoption in fields that require highly flexible recognition abilities. As a result, intense effort has been directed toward altering the properties of site-specific recombination systems by protein engineering. Here, we review key developments in the rational design and directed molecular evolution of site-specific recombinases, highlighting the numerous applications of these enzymes across diverse fields of study. PMID:23982993
NASA Technical Reports Server (NTRS)
Chimiak, Reine; Harris, Bernard; Williams, Phillip
2013-01-01
Basic Common Data Format (CDF) tools such as cdfedit and skeletoncdf provide no specific support for creating International Solar-Terrestrial Physics/Space Physics Data Facility (ISTP/SPDF) standard files. While it is possible for someone who is familiar with the ISTP/SPDF metadata guidelines to create compliant files using just the basic tools, the process is error-prone and unreasonable for someone without ISTP/SPDF expertise. The key problem is the lack of a tool with specific support for creating files that comply with the ISTP/SPDF guidelines. The SPDF ISTP CDF skeleton editor is a cross-platform, Java-based graphical user interface (GUI) application that allows someone with only a basic understanding of the ISTP/SPDF guidelines to easily create and edit guideline-compliant skeleton CDF files. The editor consists of the following components: a Swing-based Java GUI program, a JavaHelp-based manual/tutorial, image/icon files, and an HTML Web page for distribution. The editor is available as a traditional Java desktop application as well as a Java Network Launching Protocol (JNLP) application. Once started, it functions like a typical Java GUI file editor application for creating and editing application-unique files.
Somatic Cell Nuclear Transfer in the Mouse
NASA Astrophysics Data System (ADS)
Kishigami, Satoshi; Wakayama, Teruhiko
Somatic cell nuclear transfer (SCNT) has become a unique and powerful tool for epigenetic reprogramming research and gene manipulation in animals since "Dolly," the first animal cloned from an adult cell, was reported in 1997. Although the success rates of somatic cloning have been low and the mechanism of reprogramming is still largely unknown, this technique has been proven to work in more than 10 mammalian species. Among them, the mouse provides the best model for both basic and applied research on somatic cloning because of its abundant genetic resources, rapid sexual maturity and propagation, minimal housing requirements, etc. This chapter describes a basic protocol for mouse cloning using cumulus cells, the most popular donor cell type for NT, in which donor nuclei are directly injected into the oocyte using a piezo-actuated micromanipulator. In particular, we focus on a new, more efficient mouse cloning protocol using trichostatin A (TSA), a histone deacetylase (HDAC) inhibitor, which increases both in vitro and in vivo developmental rates by two- to fivefold. This new TSA-based method should help establish mouse cloning in many laboratories.
General purpose optimization software for engineering design
NASA Technical Reports Server (NTRS)
Vanderplaats, G. N.
1990-01-01
The author has developed several general purpose optimization programs over the past twenty years. The earlier programs were developed as research codes and served that purpose reasonably well. However, in taking the formal step from research to industrial application programs, several important lessons have been learned. Among these are the importance of clear documentation, immediate user support, and consistent maintenance. Most important has been the issue of providing software that gives a good, or at least acceptable, design at minimum computational cost. Here, the basic issues in developing optimization software for industrial applications are outlined, and issues of convergence rate, reliability, and relative minima are discussed. Considerable feedback has been received from users, and new software is being developed to respond to identified needs. The basic capabilities of this software are outlined. A major motivation for the development of commercial grade software is ease of use and flexibility, and these issues are discussed with reference to general multidisciplinary applications. It is concluded that design productivity can be significantly enhanced by the more widespread use of optimization as an everyday design tool.
Colour thresholding and objective quantification in bioimaging
NASA Technical Reports Server (NTRS)
Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.
1992-01-01
Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (256 intensity levels, 0-255), the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry for objective quantification of subtle colour differences between experimental and control samples.
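Colour thresholding of the kind described reduces to testing each pixel's channels against per-channel bounds and counting the pixels that pass. A minimal sketch (hypothetical pixel values and thresholds, not the authors' system):

```python
def count_in_threshold(pixels, lo, hi):
    """Count RGB pixels whose every channel lies within the
    corresponding [lo, hi] channel bounds (inclusive)."""
    return sum(
        all(lo[i] <= p[i] <= hi[i] for i in range(3))
        for p in pixels
    )

# Hypothetical field: brownish reaction product vs. bluish counterstain.
pixels = [
    (120, 70, 40), (130, 80, 50),   # brownish (positive stain)
    (60, 70, 160), (55, 65, 150),   # bluish (counterstain)
]
brown = count_in_threshold(pixels, lo=(100, 50, 20), hi=(180, 110, 80))
fraction_positive = brown / len(pixels)
print(brown, fraction_positive)  # 2 0.5
```

A monochrome densitometer sees only one intensity per pixel, so two stains of similar darkness but different hue are indistinguishable; the three-channel test above is what separates them.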
Introducing evidence-based dentistry to dental students using histology.
Lallier, Thomas E
2014-03-01
The expansion of evidence-based dentistry (EBD) is essential to the continued growth and development of the dental profession. Expanding EBD requires increased emphasis on critical thinking skills during dental education, as noted in the American Dental Education Association's Competencies for the New General Dentist. In order to achieve this goal, educational exercises must be introduced to increase the use of critical thinking skills early in the dental curriculum, with continued reinforcement as students progress through subsequent years. Described in this article is one approach to increasing student exposure to critical thinking during the early basic science curriculum, specifically within the confines of a traditional histology course. A method of utilizing the medical and dental research literature to reinforce and enliven the concepts taught in histology is described, along with an approach for using peer-to-peer presentations to demonstrate the tools needed to critically evaluate research studies and their presentation in published articles. This approach, which could be applied to any basic science course, will result in a stronger foundation on which students can build their EBD and critical thinking skills.
The neural mediators of kindness-based meditation: a theoretical model
Mascaro, Jennifer S.; Darcher, Alana; Negi, Lobsang T.; Raison, Charles L.
2015-01-01
Although kindness-based contemplative practices are increasingly employed by clinicians and cognitive researchers to enhance prosocial emotions, social cognitive skills, and well-being, and as a tool to understand the basic workings of the social mind, we lack a coherent theoretical model with which to test the mechanisms by which kindness-based meditation may alter the brain and body. Here, we link contemplative accounts of compassion and loving-kindness practices with research from social cognitive neuroscience and social psychology to generate predictions about how diverse practices may alter brain structure and function and related aspects of social cognition. Contingent on the nuances of the practice, kindness-based meditation may enhance the neural systems related to faster and more basic perceptual or motor simulation processes, simulation of another’s affective body state, slower and higher-level perspective-taking, modulatory processes such as emotion regulation and self/other discrimination, and combinations thereof. This theoretical model will be discussed alongside best practices for testing such a model and potential implications and applications of future work. PMID:25729374
Kohl, Kevin D
2017-10-01
Research on host-associated microbial communities has grown rapidly. Despite the great body of work, inclusion of microbiota-related questions into integrative and comparative biology is still lagging behind other disciplines. The purpose of this paper is to offer an introduction into the basic tools and techniques of host-microbe research. Specifically, what considerations should be made before embarking on such projects (types of samples, types of controls)? How is microbiome data analyzed and integrated with data measured from the hosts? How can researchers experimentally manipulate the microbiome? With this information, integrative and comparative biologists should be able to include host-microbe studies into their research and push the boundaries of both fields. © The Author 2017. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.
Johnston, Sharon; Wong, Sabrina T; Blackman, Stephanie; Chau, Leena W; Grool, Anne M; Hogg, William
2017-11-16
Recruiting family physicians into primary care research studies requires researchers to continually manage information coming in, going out, and coming in again. In many research groups, Microsoft Excel and Access are the usual data management tools, but they are very basic and do not support any automation, linking, or reminder systems to manage and integrate recruitment information and processes. We explored whether a commercial customer relationship management (CRM) software program - designed for sales people in businesses to improve customer relations and communications - could be used to make the research recruitment system faster, more effective, and more efficient. We found that while there was potential for long-term studies, it simply did not adapt effectively enough for our shorter study and recruitment budget. The amount of training required to master the software and our need for ongoing flexible and timely support were greater than the benefit of using CRM software for our study.
Neuroscience in Nigeria: the past, the present and the future.
Balogun, Wasiu Gbolahan; Cobham, Ansa Emmanuel; Amin, Abdulbasit
2018-04-01
The science of the brain and nervous system cuts across almost all aspects of human life and is one of the fastest growing scientific fields worldwide. This necessitates the demand for pragmatic investment by all nations to ensure improved education and quality of research in Neurosciences. Although obvious efforts are being made in advancing the field in developed societies, there is limited data addressing the state of neuroscience in sub-Saharan Africa. Here, we review the state of neuroscience development in Nigeria, Africa's most populous country and its largest economy, critically evaluating the history, the current situation and future projections. This review specifically addresses trends in clinical and basic neuroscience research and education. We conclude by highlighting potentially helpful strategies that will catalyse development in neuroscience education and research in Nigeria, among which are an increase in research funding, provision of tools and equipment for training and research, and upgrading of the infrastructure at hand.
Caenorhabditis elegans: nature and nurture gift to nematode parasitologists.
Salinas, Gustavo; Risi, Gastón
2017-12-06
The free-living nematode Caenorhabditis elegans is the simplest animal model organism to work with. Substantial knowledge and tools have accumulated over 50 years of C. elegans research. The use of C. elegans in relation to parasitic nematodes, from a basic biology standpoint or an applied perspective, has increased in recent years. The wealth of information gained on the model organism, the transfer of the powerful approaches and technologies that have advanced C. elegans research to parasitic nematodes, and the enormous success of the omics fields have helped bridge the divide between C. elegans and parasite nematode researchers. We review key fields, such as genomics, drug discovery and genetics, where C. elegans and nematode parasite research have converged. We advocate the use of C. elegans as a model to study helminth metabolism, a neglected area ready to advance. How emerging technologies being used in C. elegans can pave the way for parasitic nematode research is discussed.
Content analysis of science material in junior school-based inquiry and science process skills
NASA Astrophysics Data System (ADS)
Patonah, S.; Nuvitalia, D.; Saptaningrum, E.
2018-03-01
The purpose of this research is to map the characteristics of science material content in junior school that can be optimized using the inquiry learning model to train science process skills. The research method is a qualitative study of junior school (SMP) science curriculum documents in Indonesia. Documents were reviewed on the basis of the basic competencies at each level, as well as their potential to train science process skills using inquiry learning models. The review was conducted by the research team. The results show that science process skills in grade 7 have the potential to be trained using the inquiry learning model at 74%, grade 8 at 83%, and grade 9 at 75%. The dominant process skill in each chapter and at each level is observing. Follow-up research will develop instructional inquiry tools to train science process skills.
A flexible flight display research system using a ground-based interactive graphics terminal
NASA Technical Reports Server (NTRS)
Hatfield, J. J.; Elkins, H. C.; Batson, V. M.; Poole, W. L.
1975-01-01
Requirements and research areas for the air transportation system of the 1980 to 1990's were reviewed briefly to establish the need for a flexible flight display generation research tool. Specific display capabilities required by aeronautical researchers are listed and a conceptual system for providing these capabilities is described. The conceptual system uses a ground-based interactive graphics terminal driven by real-time radar and telemetry data to generate dynamic, experimental flight displays. These displays are scan converted to television format, processed, and transmitted to the cockpits of evaluation aircraft. The attendant advantages of a Flight Display Research System (FDRS) designed to employ this concept are presented. The detailed implementation of an FDRS is described. The basic characteristics of the interactive graphics terminal and supporting display electronic subsystems are presented and the resulting system capability is summarized. Finally, the system status and utilization are reviewed.
AGU Journals Among Most Cited Publications in Climate Change Research
NASA Astrophysics Data System (ADS)
Sears, Jon
2010-03-01
Geophysical Research Letters (GRL) and Journal of Geophysical Research-Atmospheres (JGR-D) both ranked among the top 10 of the most highly cited research publications on climate change over the past decade in a recent analysis by sciencewatch.com, an Internet tool published by the Thomson Reuters Web of Science® that tracks trends and performances in basic research. Although Nature and Science—the multidisciplinary heavyweights—led the field, GRL ranked fifth and JGR-D ranked sixth. The study was conducted by searching the Web of Science® database for terms such as “global warming,” “climate change,” “human impact,” and other key phrases in journal articles published and cited between 1999 and the spring of 2009. The analysis produced over 28,000 papers, from which sciencewatch.com identified the most cited institutions, authors, and journals. To see the analysis in full, visit http://sciencewatch.com/ana/fea/09novdecFea/.
Geospatial Service Platform for Education and Research
NASA Astrophysics Data System (ADS)
Gong, J.; Wu, H.; Jiang, W.; Guo, W.; Zhai, X.; Yue, P.
2014-04-01
We propose to advance the scientific understanding through applications of geospatial service platforms, which can help students and researchers investigate various scientific problems in a Web-based environment with online tools and services. The platform also offers capabilities for sharing data, algorithms, and problem-solving knowledge. To fulfil this goal, the paper introduces a new course, named "Geospatial Service Platform for Education and Research", to be held in the ISPRS summer school in May 2014 at Wuhan University, China. The course will share cutting-edge achievements of a geospatial service platform with students from different countries, and train them with online tools from the platform for geospatial data processing and scientific research. The content of the course includes the basic concepts of geospatial Web services, service-oriented architecture, geoprocessing modelling and chaining, and problem-solving using geospatial services. In particular, the course will offer a geospatial service platform for hands-on practice. There will be three kinds of exercises in the course: geoprocessing algorithm sharing through service development, geoprocessing modelling through service chaining, and online geospatial analysis using geospatial services. Students can choose one of them, depending on their interests and background. Existing geoprocessing services from OpenRS and GeoPW will be introduced. The summer course offers two service chaining tools, GeoChaining and GeoJModelBuilder, as examples to explain the method for building service chains to meet different demands. After this course, students will know how to use online service platforms for geospatial resource sharing and problem-solving.
Gillies, Katie; Huang, Wan; Skea, Zoë; Brehaut, Jamie; Cotton, Seonaidh
2014-02-18
Informed consent is regarded as a cornerstone of ethical healthcare research and is a requirement for most clinical research studies. Guidelines suggest that prospective randomised controlled trial (RCT) participants should understand a basic amount of key information about the RCTs they are being asked to enrol in, in order to provide valid informed consent. This information is usually provided to potential participants in a patient information leaflet (PIL). There is evidence that some trial participants fail to understand key components of trial processes or rationale. As such, the existing approach to information provision for potential RCT participants may not be optimal. Decision aids have been used for a variety of treatment and screening decisions to improve knowledge, but focus more on overall decision quality, and may be helpful to those making decisions about participating in an RCT. We investigated the feasibility of using a tool to identify which items recommended for good quality decision making are present in UK PILs. PILs were sampled from UK registered Clinical Trials Unit websites across a range of clinical areas. The evaluation tool, which is based on standards for supporting decision making, was applied to 20 PILs. Two researchers independently rated each PIL using the tool. In addition, word count and readability were assessed. PILs scored poorly on the evaluation tool, with the majority of leaflets scoring less than 50%. Specifically, the presenting probabilities, clarifying and expressing values, and structured guidance in deliberation and communication sub-sections scored consistently poorly. Tool score was associated with word count (r=0.802, P<0.01); there was no association between score and readability (r=-0.372, P=0.106). The tool was feasible to use to evaluate PILs for UK RCTs. PILs did not meet current standards of information to support good quality decision making.
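The associations reported above are Pearson correlation coefficients. As an illustration of how such a coefficient is computed (the leaflet scores and word counts below are hypothetical, not the study's data):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical tool scores and word counts for five leaflets.
scores = [30, 42, 48, 55, 61]
words = [1200, 1900, 2300, 2600, 3100]
print(round(pearson_r(scores, words), 3))  # 0.997
```

A coefficient near +1, as here, would mirror the study's finding that longer leaflets tended to score higher on the evaluation tool.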
Writers of information leaflets could use the evaluation tool as a framework during PIL development to help ensure that items are included which promote and support more informed decisions about trial participation. Further research is required to evaluate the inclusion of such information.
Cheminformatics Research at the Unilever Centre for Molecular Science Informatics Cambridge.
Fuchs, Julian E; Bender, Andreas; Glen, Robert C
2015-09-01
The Centre for Molecular Informatics, formerly Unilever Centre for Molecular Science Informatics (UCMSI), at the University of Cambridge is a world-leading driving force in the field of cheminformatics. Since its opening in 2000 more than 300 scientific articles have fundamentally changed the field of molecular informatics. The Centre has been a key player in promoting open chemical data and semantic access. Though mainly focussing on basic research, close collaborations with industrial partners ensured real world feedback and access to high quality molecular data. A variety of tools and standard protocols have been developed and are ubiquitous in the daily practice of cheminformatics. Here, we present a retrospective of cheminformatics research performed at the UCMSI, thereby highlighting historical and recent trends in the field as well as indicating future directions.
ERIC Educational Resources Information Center
Binder, Katherine S.; Snyder, Melissa A.; Ardoin, Scott P.; Morris, Robin K.
2011-01-01
This study examined the reliability and validity of administering Dynamic Indicators of Basic Early Literacy Skills (DIBELS) to adult basic education (ABE) students. Ninety ABE participants were administered DIBELS measures, the Woodcock-Johnson III Broad Reading (WJ III BR) measures, and four orthographic ability tests. Since ABE students are a…
Giraldi, Annamaria; Rellini, Alessandra; Pfaus, James G; Bitzer, Johannes; Laan, Ellen; Jannini, Emmanuele A; Fugl-Meyer, Axel R
2011-10-01
There are many methods to evaluate female sexual function and dysfunction (FSD) in clinical and research settings, including questionnaires, structured interviews, and detailed case histories. Of these, questionnaires have become an easy first choice to screen individuals into different categories of FSD. The aim of this study was to review the strengths and weaknesses of different questionnaires currently available to assess different dimensions of women's sexual function and dysfunction, and to suggest a simple screener for FSD. A literature search of relevant databases, books, and articles in journals was used to identify questionnaires that have been used in basic or epidemiological research, clinical trials, or in clinical settings. Measures were grouped in four levels based on their purposes and degree of development, and were reviewed for their psychometric properties and utility in clinical or research settings. A Sexual Complaints Screener for Women (SCS-W) was then proposed based on epidemiological methods. Although many questionnaires are adequate for their own purposes, our review revealed a serious lack of standardized, internationally (culturally) acceptable questionnaires that are truly epidemiologically validated in general populations and that can be used to assess FSD in women with or without a partner and independent of the partner's gender. The SCS-W is proposed as a 10-item screener to aid clinicians in making a preliminary assessment of FSD. The definition of FSD continues to change and basic screening tools are essential to help advance clinical diagnosis and treatment, or to slate patients adequately into the right diagnostic categories for basic and epidemiological research or clinical trials. © 2011 International Society for Sexual Medicine.
Endodontic Microbiology and Pathobiology: Current State of Knowledge.
Fouad, Ashraf F
2017-01-01
Newer research tools and basic science knowledge base have allowed the exploration of endodontic diseases in the pulp and periapical tissues in novel ways. The use of next generation sequencing, bioinformatics analyses, genome-wide association studies, to name just a few of these innovations, has allowed the identification of hundreds of microorganisms and of host response factors. This review addresses recent advances in endodontic microbiology and the host response and discusses the potential for future innovations in this area. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kluchnikova, O.; Pobegaylov, O.
2017-11-01
The article focuses on basic theory and practical aspects of improving strategic management in terms of enhancing the quality of a technological process; these aspects have been proven experimentally through their introduction into company operations. The authors have worked out proposals for selecting an optimal supplier for building companies, as well as an algorithm for analysing and optimizing a construction company, based on scientific and practical research and on the data obtained in the experiment.
Online Interactive Tutorials for Creating Graphs With Excel 2007 or 2010
Vanselow, Nicholas R; Bourret, Jason C
2012-01-01
Graphic display of clinical data is a useful tool for the behavior-analytic clinician. However, graphs can sometimes be difficult to create. We describe how to access and use an online interactive tutorial that teaches the user to create a variety of graphs often used by behavior analysts. Three tutorials are provided that cover the basics of Microsoft Excel 2007 or 2010, creating graphs for clinical purposes, and creating graphs for research purposes. The uses for this interactive tutorial and other similar programs are discussed. PMID:23326629
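One graph type such tutorials typically cover, the cumulative record, is simply a running total of responses across sessions. A small sketch with hypothetical data (stdlib only) that computes the series and writes a two-column CSV that Excel can open and chart directly:

```python
import csv
import io

def cumulative_record(session_counts):
    """Running total of responses across sessions: the series a
    clinician would plot as a cumulative record."""
    total, out = 0, []
    for n in session_counts:
        total += n
        out.append(total)
    return out

# Hypothetical per-session response counts.
sessions = [4, 7, 5, 9, 6]
cumulative = cumulative_record(sessions)

# Write a two-column CSV suitable for charting in Excel.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["Session", "Cumulative responses"])
for i, c in enumerate(cumulative, start=1):
    writer.writerow([i, c])
print(cumulative)  # [4, 11, 16, 25, 31]
```

Replacing `io.StringIO` with `open("record.csv", "w", newline="")` writes the file to disk for import into Excel.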
The Biotechnology Facility for International Space Station.
Goodwin, Thomas; Lundquist, Charles; Tuxhorn, Jennifer; Hurlbert, Katy
2004-03-01
The primary mission of the Cellular Biotechnology Program is to advance microgravity as a tool in basic and applied cell biology. The microgravity environment can be used to study fundamental principles of cell biology and to achieve specific applications such as tissue engineering. The Biotechnology Facility (BTF) will provide a state-of-the-art facility to perform cellular biotechnology research onboard the International Space Station (ISS). The BTF will support continuous operation, which will allow performance of long-duration experiments and will significantly increase the on-orbit science throughput.
2014-09-18
Under a fatigue stress regime, Paris's Law relates sub-critical crack growth to the stress intensity factor. Paris's Law is one of the most widely used fatigue crack growth models and was used in this research effort (Paris and Erdogan, 1963). After takeoff, the model generates a probability distribution for the crack length in that specific sortie.
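The abstract names Paris's Law without stating it; its standard form (supplied here for reference, not taken from this document) is

```latex
\frac{da}{dN} = C\,(\Delta K)^{m}
```

where $a$ is the crack length, $N$ is the number of load cycles, $\Delta K$ is the stress intensity factor range, and $C$ and $m$ are empirically fitted material constants. Sub-critical growth corresponds to $\Delta K$ remaining below the critical fracture value.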
Catalan, Pilar; Chalhoub, Boulos; Chochois, Vincent; Garvin, David F; Hasterok, Robert; Manzaneda, Antonio J; Mur, Luis A J; Pecchioni, Nicola; Rasmussen, Søren K; Vogel, John P; Voxeur, Aline
2014-07-01
The scientific presentations at the First International Brachypodium Conference (abstracts available at http://www.brachy2013.unimore.it) are evidence of the widespread adoption of Brachypodium distachyon as a model system. Furthermore, the wide range of topics presented (genome evolution, roots, abiotic and biotic stress, comparative genomics, natural diversity, and cell walls) demonstrates that the Brachypodium research community has achieved a critical mass of tools and has transitioned from resource development to addressing biological questions, particularly those unique to grasses. Copyright © 2014 Elsevier Ltd. All rights reserved.
Lee, Ciaran M; Zhu, Haibao; Davis, Timothy H; Deshmukh, Harshahardhan; Bao, Gang
2017-01-01
The CRISPR/Cas9 system is a powerful tool for precision genome editing. The ability to accurately modify genomic DNA in situ with single nucleotide precision opens up new possibilities for not only basic research but also biotechnology applications and clinical translation. In this chapter, we outline the procedures for design, screening, and validation of CRISPR/Cas9 systems for targeted modification of coding sequences in the human genome and how to perform genome editing in induced pluripotent stem cells with high efficiency and specificity.
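The first design step for such systems, finding candidate target sites, amounts to scanning for 20-nt protospacers followed by an NGG PAM. A simplified, forward-strand-only sketch (illustrative, not the authors' pipeline, which would also scan the reverse complement and score off-target risk):

```python
def find_cas9_targets(seq, pam_tail="GG"):
    """Return (position, protospacer) pairs for 20-nt protospacers
    followed by an NGG PAM on the forward strand of `seq`."""
    seq = seq.upper()
    hits = []
    for i in range(len(seq) - 22):
        # PAM is the 3 nt after the 20-nt protospacer: N, then GG.
        if seq[i + 21:i + 23] == pam_tail:
            hits.append((i, seq[i:i + 20]))
    return hits

demo = "A" * 20 + "TGG" + "ACGT"  # one protospacer followed by an NGG PAM
print(find_cas9_targets(demo))  # [(0, 'AAAAAAAAAAAAAAAAAAAA')]
```

Real guide-design tools layer GC-content filters, self-complementarity checks, and genome-wide off-target searches on top of this basic scan.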
Survey of decentralized control methods [for large scale dynamic systems]
NASA Technical Reports Server (NTRS)
Athans, M.
1975-01-01
An overview is presented of the types of problems that are being considered by control theorists in the area of dynamic large scale systems with emphasis on decentralized control strategies. Approaches that deal directly with decentralized decision making for large scale systems are discussed. It is shown that future advances in decentralized system theory are intimately connected with advances in the stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools associated with the latter are summarized, and recommendations concerning future research are presented.
The Biotechnology Facility for International Space Station
NASA Technical Reports Server (NTRS)
Goodwin, Thomas; Lundquist, Charles; Tuxhorn, Jennifer; Hurlbert, Katy
2004-01-01
The primary mission of the Cellular Biotechnology Program is to advance microgravity as a tool in basic and applied cell biology. The microgravity environment can be used to study fundamental principles of cell biology and to achieve specific applications such as tissue engineering. The Biotechnology Facility (BTF) will provide a state-of-the-art facility to perform cellular biotechnology research onboard the International Space Station (ISS). The BTF will support continuous operation, which will allow performance of long-duration experiments and will significantly increase the on-orbit science throughput.
Using artificial intelligence to control fluid flow computations
NASA Technical Reports Server (NTRS)
Gelsey, Andrew
1992-01-01
Computational simulation is an essential tool for the prediction of fluid flow. Many powerful simulation programs exist today. However, using these programs to reliably analyze fluid flow and other physical situations requires considerable human effort and expertise to set up a simulation, determine whether the output makes sense, and repeatedly run the simulation with different inputs until a satisfactory result is achieved. Automating this process is not only of considerable practical importance but will also significantly advance basic artificial intelligence (AI) research in reasoning about the physical world.
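The run-check-rerun cycle the abstract describes might be sketched as a simple control loop. This is a toy, under invented assumptions: `simulate` is a stand-in for a real flow solver, not an actual NASA code.

```python
# Toy sketch of the automation loop described above: run a simulation,
# sanity-check the output, and refine an input until a target is met.

def simulate(mesh_resolution):
    # Placeholder solver: finer meshes reduce a fake discretization error.
    return 1.0 / mesh_resolution

def converge(target_error=0.01, resolution=10, max_runs=20):
    for run in range(1, max_runs + 1):
        error = simulate(resolution)
        if error != error or error < 0:   # NaN / nonsense guard
            raise RuntimeError("simulation output failed sanity check")
        if error <= target_error:
            return resolution, run
        resolution *= 2                   # refine the mesh and rerun
    raise RuntimeError("did not converge")

res, runs = converge()
print(f"converged at resolution {res} after {runs} runs")
```

The AI work the abstract proposes would replace the fixed "double the resolution" rule with reasoning about the physics of the flow being simulated.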
`New insight into statistical hydrology' preface to the special issue
NASA Astrophysics Data System (ADS)
Kochanek, Krzysztof
2018-04-01
Statistical methods are still the basic tool for investigating random, extreme events occurring in the hydrosphere. On 21-22 September 2017, the international Statistical Hydrology (StaHy) 2017 workshop took place in Warsaw (Poland) under the auspices of the International Association of Hydrological Sciences. The authors of the presentations proposed to publish their research results in the Special Issue of Acta Geophysica, 'New Insight into Statistical Hydrology'. Five papers were selected for publication, touching on the most crucial issues of statistical methodology in hydrology.
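As a flavour of the statistical methodology such workshops address, here is a minimal, self-contained sketch: a method-of-moments fit of the Gumbel (EV1) distribution to annual maximum flows and a return-level estimate. The sample data are invented for illustration; this is one of the simplest estimators, not a recommendation.

```python
# Fit a Gumbel (EV1) distribution to annual maxima by method of moments
# and estimate the T-year return level. Sample data are invented.
import math

def gumbel_fit(sample):
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi       # scale parameter
    mu = mean - 0.5772156649 * beta           # location (Euler-Mascheroni)
    return mu, beta

def return_level(mu, beta, T):
    # Gumbel quantile at non-exceedance probability 1 - 1/T.
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

annual_maxima = [412, 530, 368, 645, 498, 572, 433, 610, 389, 551]
mu, beta = gumbel_fit(annual_maxima)
print(f"100-year flood estimate: {return_level(mu, beta, 100):.0f} m^3/s")
```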
Online Learning Tools as Supplements for Basic and Clinical Science Education.
Ellman, Matthew S; Schwartz, Michael L
2016-01-01
Undergraduate medical educators are increasingly incorporating online learning tools into basic and clinical science curricula. In this paper, we explore the diversity of online learning tools and consider the range of applications for these tools in classroom and bedside learning. Particular advantages of these tools are highlighted, such as delivering foundational knowledge as part of the "flipped classroom" pedagogy and for depicting unusual physical examination findings and advanced clinical communication skills. With accelerated use of online learning, educators and administrators need to consider pedagogic and practical challenges posed by integrating online learning into individual learning activities, courses, and curricula as a whole. We discuss strategies for faculty development and the role of school-wide resources for supporting and using online learning. Finally, we consider the role of online learning in interprofessional, integrated, and competency-based applications, among other contemporary trends in medical education.
Lange, Alissa A; Mulhern, Gerry; Wylie, Judith
2009-01-01
The present study investigated the effects of using an assistive software homophone tool on the assisted proofreading performance and unassisted basic skills of secondary-level students with reading difficulties. Students aged 13 to 15 years proofread passages for homophonic errors under three conditions: with the homophone tool, with homophones highlighted only, or with no help. The group using the homophone tool significantly outperformed the other two groups on assisted proofreading and outperformed the others on unassisted spelling, although not significantly. Remedial (unassisted) improvements in automaticity of word recognition, homophone proofreading, and basic reading were found over all groups. Results elucidate the differential contributions of each function of the homophone tool and suggest that with the proper training, assistive software can help not only students with diagnosed disabilities but also those with generally weak reading skills.
Enabling a Scientific Cloud Marketplace: VGL (Invited)
NASA Astrophysics Data System (ADS)
Fraser, R.; Woodcock, R.; Wyborn, L. A.; Vote, J.; Rankine, T.; Cox, S. J.
2013-12-01
The Virtual Geophysics Laboratory (VGL) provides a flexible, web-based environment where researchers can browse data and use a variety of scientific software packaged into toolkits that run in the Cloud. Both data and toolkits are published by multiple researchers and registered with the VGL infrastructure, forming a data and application marketplace. The VGL provides the basic workflow of Discovery and Access to the disparate data sources and a Library for toolkits and scripting to drive the scientific codes. Computation is then performed on the Research or Commercial Clouds. Provenance information is collected throughout the workflow and can be published alongside the results, allowing for experiment comparison and sharing with other researchers. VGL's "mix and match" approach to data, computational resources and scientific codes enables a dynamic approach to scientific collaboration. VGL allows scientists to publish their specific contribution, be it data, code, compute or workflow, knowing the VGL framework will provide the other components needed for a complete application. Other scientists can choose the pieces that suit them best to assemble an experiment. The coarse-grained workflow of the VGL framework, combined with the flexibility of the scripting library and computational toolkits, allows for significant customisation and sharing amongst the community. The VGL utilises the cloud computational and storage resources of the Australian academic research cloud provided by the NeCTAR initiative and a large variety of data accessible from national and state agencies via the Spatial Information Services Stack (SISS - http://siss.auscope.org). VGL v1.2 screenshot - http://vgl.auscope.org
Montecucco, Fabrizio; Carbone, Federico; Dini, Frank Lloyd; Fiuza, Manuela; Pinto, Fausto J; Martelli, Antonietta; Palombo, Domenico; Sambuceti, Gianmario; Mach, François; De Caterina, Raffaele
2014-11-01
Insights from the "-omics" sciences have recently emphasized the need to implement an overall strategy in medical research. Here, the development of Systems Medicine has been indicated as a potential tool for clinical translation of basic research discoveries. Systems Medicine also offers the opportunity to improve different steps in medical practice, from diagnosis to healthcare management, including clinical research. The development of Systems Medicine is still hampered, however, by several challenges, the main one being the development of computational tools adequate to record, analyze and share a large amount of disparate data. In addition, available informatics tools appear not yet fully suitable for the challenge because they are not standardized, not universally available, or raise ethical/legal concerns. Cardiovascular diseases (CVD) are a very promising area for translating Systems Medicine into clinical practice. By developing clinically applied technologies, the collection and analysis of data may improve CV risk stratification and prediction. Standardized models for data recording and analysis can also greatly broaden data exchange, thus promoting a uniform management of CVD patients that is also useful for clinical research. This advance, however, requires a great organizational effort by both physicians and health institutions, as well as the overcoming of ethical problems. This narrative review aims at providing an update on state-of-the-art knowledge in the area of Systems Medicine as applied to CVD, focusing on current critical issues and providing a road map for its practical implementation. Copyright © 2014 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
Methods for transition toward computer assisted cognitive examination.
Jurica, P; Valenzi, S; Struzik, Z R; Cichocki, A
2015-01-01
We present a software framework which enables the extension of current methods for the assessment of cognitive fitness using recent technological advances. Screening for cognitive impairment is becoming more important as the world's population grows older, and current methods could be enhanced by the use of computers. Introduction of new methods to clinics requires basic tools for collection and communication of the collected data. Our objective is to develop tools that, with minimal interference, offer new opportunities for enhancing current interview-based cognitive examinations. We suggest methods and discuss the process by which established cognitive tests can be adapted for data collection through digitization by pen-enabled tablets. We discuss a number of methods for evaluation of collected data, which promise to increase the resolution and objectivity of the common scoring strategy based on visual inspection. By involving computers in the roles of both instructing and scoring, we aim to increase the precision and reproducibility of cognitive examination. The tools provided in the Python framework CogExTools, available at http://bsp.brain.riken.jp/cogextools/, enable the design, application and evaluation of screening tests for assessment of cognitive impairment. The toolbox is a research platform and represents a foundation for further collaborative development by the wider research community and enthusiasts. It is free to download and use, and open-source. We introduce a set of open-source tools that facilitate the design and development of new cognitive tests for modern technology. We provide these tools in order to enable the adaptation of technology for cognitive examination in clinical settings. The tools provide the first step in a possible transition toward standardized mental state examination using computers.
Lopez-Doriga, Adriana; Feliubadaló, Lídia; Menéndez, Mireia; Lopez-Doriga, Sergio; Morón-Duran, Francisco D; del Valle, Jesús; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Campos, Olga; Gómez, Carolina; Pineda, Marta; González, Sara; Moreno, Victor; Capellá, Gabriel; Lázaro, Conxi
2014-03-01
Next-generation sequencing (NGS) has revolutionized genomic research and is set to have a major impact on genetic diagnostics thanks to the advent of benchtop sequencers and flexible kits for targeted libraries. Among the main hurdles in NGS are the difficulty of performing bioinformatic analysis of the huge volume of data generated and the high number of false positive calls that could be obtained, depending on the NGS technology and the analysis pipeline. Here, we present the development of a free and user-friendly Web data analysis tool that detects and filters sequence variants, provides coverage information, and allows the user to customize some basic parameters. The tool has been developed to provide accurate genetic analysis of targeted sequencing of common high-risk hereditary cancer genes using amplicon libraries run in a GS Junior System. The Web resource is linked to our own mutation database, to assist in the clinical classification of identified variants. We believe that this tool will greatly facilitate the use of the NGS approach in routine laboratories.
Multi-Sector Sustainability Browser (MSSB) User Manual: A ...
EPA’s Sustainable and Healthy Communities (SHC) Research Program is developing methodologies, resources, and tools to assist community members and local decision makers in implementing policy choices that facilitate sustainable approaches to managing the resources affecting their built environment, natural environment, and human health. To assist communities and decision makers in implementing sustainable practices, EPA is developing computer-based systems including models, databases, web tools, and web browsers to help communities decide upon approaches that support their desired outcomes. Communities need access to resources that will allow them to achieve their sustainability objectives through intelligent decisions in four key sustainability areas: land use; buildings and infrastructure; transportation; and materials management (i.e., municipal solid waste [MSW] processing and disposal). The Multi-Sector Sustainability Browser (MSSB) is designed to support sustainable decision-making by communities, local and regional planners, and policy and decision makers. This document is an EPA Technical Report serving as the user manual for the MSSB tool; its purpose is to provide basic guidance on use of the tool.
TACT: A Set of MSC/PATRAN- and MSC/NASTRAN- based Modal Correlation Tools
NASA Technical Reports Server (NTRS)
Marlowe, Jill M.; Dixon, Genevieve D.
1998-01-01
This paper describes the functionality and demonstrates the utility of the Test Analysis Correlation Tools (TACT), a suite of MSC/PATRAN Command Language (PCL) tools which automate the process of correlating finite element models to modal survey test data. The initial release of TACT provides a basic yet complete set of tools for performing correlation totally inside the PATRAN/NASTRAN environment. Features include a step-by-step menu structure, pre-test accelerometer set evaluation and selection, analysis and test result export/import in Universal File Format, calculation of frequency percent difference and cross-orthogonality correlation results using NASTRAN, creation and manipulation of mode pairs, and five different ways of viewing synchronized animations of analysis and test modal results. For the PATRAN-based analyst, TACT eliminates the repetitive, time-consuming and error-prone steps associated with transferring finite element data to a third-party modal correlation package, which allows the analyst to spend more time on the more challenging task of model updating. The usefulness of this software is presented using a case history, the correlation for a NASA Langley Research Center (LaRC) low aspect ratio research wind tunnel model. To demonstrate the improvements that TACT offers the MSC/PATRAN- and MSC/NASTRAN-based structural analysis community, a comparison of the modal correlation process using TACT within PATRAN versus external third-party modal correlation packages is presented.
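The two correlation metrics mentioned above (frequency percent difference and cross-orthogonality) can be sketched in a few lines. This is a generic illustration with an invented 2-DOF mass matrix and mode shapes, not TACT's PCL implementation; for mass-normalized modes the cross-orthogonality matrix is Phi_test^T · M · Phi_fem, with diagonal terms near 1 and off-diagonals near 0 indicating good correlation.

```python
# Sketch of two modal correlation checks: frequency percent difference
# and cross-orthogonality. All matrices below are invented 2-DOF examples.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(row) for row in zip(*A)]

def freq_percent_diff(f_test, f_fem):
    return [100.0 * (ff - ft) / ft for ft, ff in zip(f_test, f_fem)]

# Columns are mode shapes; M is the (diagonal) mass matrix.
phi_test = [[1.0, 1.0], [1.0, -1.0]]
phi_fem = [[0.98, 1.02], [1.01, -0.97]]
M = [[0.5, 0.0], [0.0, 0.5]]

xortho = matmul(transpose(phi_test), matmul(M, phi_fem))
print("frequency % diff:", freq_percent_diff([10.0, 25.0], [10.4, 24.2]))
print("cross-orthogonality:", xortho)
```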
Estimation of risk management effects on revenue and purchased feed costs on US dairy farms.
Hadrich, Joleen C; Johnson, Kamina K
2015-09-01
Variations in milk and feed prices directly affect dairy farm risk management decisions. This research used data from the 2010 US Department of Agriculture-Agricultural Resource Management Surveys phase III dairy survey to examine how risk management tools affected revenues and expenses across US dairy farms. The survey was sent to 26 states and collected information on costs and returns to individual dairy farms. This research used the information from milk sales, crops sales, feed expenses, and farm and operator characteristics, as well as the use of risk management tools. Matching methodology was used to evaluate the effect of 5 independent risk management tools on revenues and expenses: selling milk to a cooperative, using a commodity contract to sell grain, feeding homegrown forage at a basic and intensive level, and use of a nutritionist. Results showed that dairy farms located in the Midwest and East benefit from selling milk to a cooperative and using commodity contracts to sell grain. Across the United States, using a nutritionist increased total feed costs, whereas a feeding program that included more than 65% homegrown forages decreased total feed costs. Results point to benefits from educational programming on risk management tools that are region specific rather than a broad generalization to all US dairy farmers. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Science Teacher, 1988
1988-01-01
Reviews two software programs for Apple series computers. Includes "Orbital Mech," a basic planetary orbital simulation for the Macintosh, and "START: Stimulus and Response Tools for Experiments in Memory, Learning, Cognition, and Perception," a program that demonstrates basic psychological principles and experiments. (CW)
An Interior Signage System for the USAF Academy Hospital
1979-08-01
manner. Graphic Design - Graphic design is design for visual communication. Graphic Design Tools - There are four basic graphic design tools available...specializes in the design of two-dimensional visual communication components. The graphic designer utilizes the four graphic design tools in developing
Basic nursing care: The most provided, the least evidence based - A discussion paper.
Zwakhalen, Sandra M G; Hamers, Jan P H; Metzelthin, Silke F; Ettema, Roelof; Heinen, Maud; de Man-Van Ginkel, Janneke M; Vermeulen, Hester; Huisman-de Waal, Getty; Schuurmans, Marieke J
2018-06-01
To describe and discuss the "Basic Care Revisited" (BCR) research programme, a collaborative initiative that contributes to evidence-based basic nursing care and raises awareness about the importance of basic nursing care activities. While basic nursing care serves nearly all people at some point in their lifetime, it is poorly informed by evidence. There is a need to prioritise and evaluate basic nursing care activities to improve patient outcomes and the quality of care. Design: discussion paper. Method: The discussion presented in this paper is based on nursing literature and theory and supported by the authors' clinical and research experiences. We present the developmental process and content of a research programme called "Basic Care Revisited" (BCR) as a solution to move forward and improve basic nursing care. To prioritise basic nursing care, we propose a research programme entitled "Basic Care Revisited" that aims to create awareness and expand knowledge on evidence-based basic nursing care by addressing four basic nursing care themes (bathing and dressing, communication, mobility, and nutrition) in different settings. The paper discusses a pathway to create a sustainable and productive research collaborative on basic nursing care and addresses issues to build research capacity. Revaluation of these important nursing activities will not only positively influence patient outcomes, but also have an impact on staff outcomes and organisational outcomes. © 2018 John Wiley & Sons Ltd.
Time-lapse microscopy and image analysis in basic and clinical embryo development research.
Wong, C; Chen, A A; Behr, B; Shen, S
2013-02-01
Mammalian preimplantation embryo development is a complex process in which the exact timing and sequence of events are as essential as the accurate execution of the events themselves. Time-lapse microscopy (TLM) is an ideal tool to study this process since the ability to capture images over time provides a combination of morphological, dynamic and quantitative information about developmental events. Here, we systematically review the application of TLM in basic and clinical embryo research. We identified all relevant preimplantation embryo TLM studies published in English up to May 2012 using PubMed and Google Scholar. We then analysed the technical challenges involved in embryo TLM studies and how these challenges may be overcome with technological innovations. Finally, we reviewed the different types of TLM embryo studies, with a special focus on how TLM can benefit clinical assisted reproduction. Although new parameters predictive of embryo development potential may be discovered and used clinically to potentially increase the success rate of IVF, adopting TLM to routine clinical practice will require innovations in both optics and image analysis. Combined with such innovations, TLM may provide embryologists and clinicians with an important tool for making critical decisions in assisted reproduction. In this review, we perform a literature search of all published early embryo development studies that used time-lapse microscopy (TLM). From the literature, we discuss the benefits of TLM over traditional time-point analysis, as well as the technical difficulties and solutions involved in implementing TLM for embryo studies. We further discuss research that has successfully derived non-invasive markers that may increase the success rate of assisted reproductive technologies, primarily IVF. Most notably, we extend our discussion to highlight important considerations for the practical use of TLM in research and clinical settings. 
Copyright © 2012 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
Oertel, Bruno Georg; Lötsch, Jörn
2013-01-01
The medical impact of pain is such that much effort is being applied to develop novel analgesic drugs directed towards new targets and to investigate the analgesic efficacy of known drugs. Ongoing research requires cost-saving tools to translate basic science knowledge into clinically effective analgesic compounds. In this review we have re-examined the prediction of clinical analgesia by human experimental pain models as a basis for model selection in phase I studies. The overall prediction of analgesic efficacy or failure of a drug correlated well between experimental and clinical settings. However, correct model selection requires more detailed information about which model predicts a particular clinical pain condition. We hypothesized that if an analgesic drug was effective in an experimental pain model and also a specific clinical pain condition, then that model might be predictive for that particular condition and should be selected for development as an analgesic for that condition. The validity of the prediction increases with an increase in the numbers of analgesic drug classes for which this agreement was shown. From available evidence, only five clinical pain conditions were correctly predicted by seven different pain models for at least three different drugs. Most of these models combine a sensitization method. The analysis also identified several models with low impact with respect to their clinical translation. Thus, the presently identified agreements and non-agreements between analgesic effects on experimental and on clinical pain may serve as a solid basis to identify complex sets of human pain models that bridge basic science with clinical pain research. PMID:23082949
Revisiting the NIH Taskforce on the Research needs of Eosinophil-Associated Diseases (RE-TREAD).
Khoury, Paneez; Akuthota, Praveen; Ackerman, Steven J; Arron, Joseph R; Bochner, Bruce S; Collins, Margaret H; Kahn, Jean-Emmanuel; Fulkerson, Patricia C; Gleich, Gerald J; Gopal-Srivastava, Rashmi; Jacobsen, Elizabeth A; Leiferman, Kristen M; Francesca, Levi-Schaffer; Mathur, Sameer K; Minnicozzi, Michael; Prussin, Calman; Rothenberg, Marc E; Roufosse, Florence; Sable, Kathleen; Simon, Dagmar; Simon, Hans-Uwe; Spencer, Lisa A; Steinfeld, Jonathan; Wardlaw, Andrew J; Wechsler, Michael E; Weller, Peter F; Klion, Amy D
2018-04-19
Eosinophil-associated diseases (EADs) are rare, heterogeneous disorders characterized by the presence of eosinophils in tissues and/or peripheral blood resulting in immunopathology. The heterogeneity of tissue involvement, lack of sufficient animal models, technical challenges in working with eosinophils, and lack of standardized histopathologic approaches have hampered progress in basic research. Additionally, clinical trials and drug development for rare EADs are limited by the lack of primary and surrogate endpoints, biomarkers, and validated patient-reported outcomes. Researchers with expertise in eosinophil biology and eosinophil-related diseases reviewed the state of current eosinophil research, resources, progress, and unmet needs in the field since the 2012 meeting of the NIH Taskforce on the Research of Eosinophil-Associated Diseases (TREAD). RE-TREAD focused on gaps in basic science, translational, and clinical research on eosinophils and eosinophil-related pathogenesis. Improved recapitulation of human eosinophil biology and pathogenesis in murine models was felt to be of importance. Characterization of eosinophil phenotypes, the role of eosinophil subsets in tissues, identification of biomarkers of eosinophil activation and tissue load, and a better understanding of the role of eosinophils in human disease were prioritized. Finally, an unmet need for tools for use in clinical trials was emphasized. Histopathologic scoring, patient- and clinician-reported outcomes, and appropriate coding were deemed of paramount importance for research collaborations, drug development, and approval by regulatory agencies. Further exploration of the eosinophil genome, epigenome, and proteome was also encouraged. Although progress has been made since 2012, unmet needs in eosinophil research remain a priority. ©2018 Society for Leukocyte Biology.
Building a sustainable complementary and alternative medicine research network in Europe.
Reiter, Bettina; Baumhöfener, Franziska; Dlaboha, Meike; Odde Madsen, Jesper; Regenfelder, Stephanie; Weidenhammer, Wolfgang
2012-01-01
Since CAMbrella is a networking project funded by the European Commission explicitly to build and sustain a complementary and alternative medicine (CAM) research network in Europe, communication and dissemination play a large role and form a work package of their own. The present article gives an outline of the communication and dissemination work in the CAMbrella consortium. The intensive building of sound internal communication is an essential part in establishing a functioning structure for collaboration in a diverse group of 16 partner institutions from 12 countries, as exists in the CAMbrella project. The means and tools for dissemination of results to the scientific community and the European public at large, as well as to the European policy makers, are presented. The development of the corporate design and a dissemination strategy are described in detail. In addition, some basic information regarding previous CAM research efforts, which might be interesting for future consortium building in the field of CAM research, is given. Internal communication within a heterogeneous research group, the maintenance of a work-oriented style of communication and a consensus oriented effort in establishing dissemination tools and products will be essential for any future consortium in the CAM field. The outlook shows the necessity for active political encouragement of CAM research and the desideratum of a Pan-European institution analogous to the NIH (National Institutes of Health) in the USA.
Muka, Samantha K
2016-09-01
This paper seeks to contribute to understandings of practice and place in the history of early American neurophysiology by exploring research with jellyfish at marine stations. Jellyfish became a particularly important research tool for experimental physiologists studying neurological subjects at the turn of the twentieth century. But their enthusiasm for the potential of this organism was constrained by its delicacy in captivity. The discovery of hardier species made experimentation at the shore possible and resulted in two epicenters of neurophysiological research on the American East Coast: the Marine Biological Laboratory and the Carnegie Institution's Dry Tortugas Laboratory. Work done in these locations had impacts on a wide range of physiological questions. These centers were short-lived (researchers at the MBL eventually focused on the squid giant axon, and the Tortugas lab closed after the death of Mayer), but the development of basic requirements and best practices to sustain these organisms paints an important picture of early experimental neurophysiology. Marine organisms and locations have played an integral role in the development of experimental life sciences in America. By understanding the earliest experimental research done at these locations, and the organisms that lured researchers from the campus to the coastline, we can begin to integrate marine stations into the larger historical narrative of American physiology.
NASA Astrophysics Data System (ADS)
Wibowo, F. C.; Suhandi, A.; Rusdiana, D.; Darman, D. R.; Ruhiat, Y.; Denny, Y. R.; Suherman; Fatah, A.
2016-08-01
A study area in physics learning concerns the effects of various types of learning interventions that help students construct basic scientific conceptions in physics. Microscopic Virtual Media (MVM) are applications for physics learning that support powerful modelling of microscopic physics concepts and processes. In this study, an experimental group of students aged 18-20 years was studied to determine the role of MVM in the development of functional understanding of the concepts of thermal expansion in heat transfer. The experimental group used MVM in the learning process. The results show that students who learned with virtual media exhibited significantly higher scores on the research tasks. Our findings suggest that MVM may be used as an alternative instructional tool to help students confront and reconstruct their basic scientific conceptions and develop their understanding.
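The physics underlying the lesson is the linear thermal expansion relation ΔL = α·L₀·ΔT. A minimal sketch follows; the coefficient is the standard textbook value for aluminium (about 23×10⁻⁶ per K), while the rod scenario itself is an invented classroom example, not one from the study.

```python
# Linear thermal expansion: dL = alpha * L0 * dT.
# alpha below is the textbook value for aluminium; the scenario is invented.

def expanded_length(L0_m, alpha_per_K, dT_K):
    return L0_m * (1.0 + alpha_per_K * dT_K)

L = expanded_length(L0_m=1.0, alpha_per_K=23e-6, dT_K=50.0)
print(f"1 m aluminium rod heated by 50 K -> {L:.6f} m")
```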
The dynamics of coastal models
Hearn, Clifford J.
2008-01-01
Coastal basins are defined as estuaries, lagoons, and embayments. This book deals with the science of coastal basins using simple models, many of which are presented in either analytical form or in Microsoft Excel or MATLAB. The book introduces simple hydrodynamics and its applications, from the use of simple box and one-dimensional models to flow over coral reefs. The book also emphasizes models as a scientific tool in our understanding of coasts, and introduces the value of the most modern flexible-mesh combined wave-current models. Examples from shallow basins around the world illustrate the wonders of the scientific method and the power of simple dynamics. This book is ideal for use as an advanced textbook for graduate students and as an introduction to the topic for researchers, especially those from other fields of science needing an understanding of the basic ideas of the dynamics of coastal basins.
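A one-box flushing model of the kind the book builds in Excel or MATLAB can be sketched in a few lines: a basin of volume V flushed by exchange flow Q, with tracer concentration relaxing toward the inflow value with e-folding (flushing) time V/Q. All numbers below are invented for illustration.

```python
# One-box model of a coastal basin: dC/dt = (Q/V) * (C_in - C).
# V, Q, and the initial concentration are illustrative, not from the book.
import math

def box_model(C0, C_in, Q, V, t_end, dt):
    C, t = C0, 0.0
    while t < t_end:
        C += dt * (Q / V) * (C_in - C)   # forward Euler step
        t += dt
    return C

V = 1e7      # basin volume, m^3
Q = 50.0     # exchange flow, m^3/s
tau = V / Q  # flushing time, seconds (~2.3 days)
C = box_model(C0=10.0, C_in=0.0, Q=Q, V=V, t_end=tau, dt=60.0)
print(f"flushing time {tau/86400:.1f} days; after one tau C = {C:.2f} "
      f"(analytic {10 * math.exp(-1):.2f})")
```

After one flushing time the numerical solution sits close to the analytic value C₀/e, a quick check that the time step is adequate.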
Scenario for concurrent conceptual assembly line design: A case study
NASA Astrophysics Data System (ADS)
Mas, F.; Ríos, J.; Menéndez, J. L.
2012-04-01
The decision to design and build a new aircraft is preceded by years of research and study. Different disciplines work together throughout the lifecycle to ensure not only a complete functional definition of the product, but also complete industrialization, a marketing plan, a maintenance plan, etc. This case study focuses on the conceptual design phase, during which the design solutions that will meet the functional and industrial requirements, i.e., the basic requirements of industrialization, are defined. Several alternatives are studied, and the most attractive in terms of performance and cost requirements is selected. From the study of these alternatives, an early conceptual design of the assembly line and its basic parameters can be defined. Plant needs, long-cycle jigs and tools or industrial means, and human resources with the necessary skills can then be determined in advance.
Foundations of radiation hydrodynamics
NASA Astrophysics Data System (ADS)
Mihalas, D.; Mihalas, B. W.
This book is the result of an attempt, over the past few years, to gather the basic tools required to do research on radiating flows in astrophysics. The microphysics of gases is discussed, taking into account the equation of state of a perfect gas, the first and second law of thermodynamics, the thermal properties of a perfect gas, the distribution function and Boltzmann's equation, the collision integral, the Maxwellian velocity distribution, Boltzmann's H-theorem, the time of relaxation, and aspects of classical statistical mechanics. Other subjects explored are related to the dynamics of ideal fluids, the dynamics of viscous and heat-conducting fluids, relativistic fluid flow, waves, shocks, winds, radiation and radiative transfer, the equations of radiation hydrodynamics, and radiating flows. Attention is given to small-amplitude disturbances, nonlinear flows, the interaction of radiation and matter, the solution of the transfer equation, acoustic waves, acoustic-gravity waves, basic concepts of special relativity, and equations of motion and energy.
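Of the kinetic-theory topics listed, the Maxwellian velocity distribution is the anchor for most of the others; in its standard form it reads

```latex
f(\mathbf{v})\, d^{3}v \;=\; n \left(\frac{m}{2\pi k T}\right)^{3/2}
\exp\!\left(-\frac{m\,|\mathbf{v}|^{2}}{2 k T}\right) d^{3}v ,
```

where n is the number density, m the particle mass, T the temperature, and k Boltzmann's constant. Boltzmann's H-theorem, also treated in the book, establishes this distribution as the unique equilibrium of the collision integral.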
Johnson, R C; Mason, F O; Sims, R H
1997-01-01
A basic list of 133 book and journal titles in dentistry is presented. The list is intended as a bibliographic selection tool for those libraries and health institutions that support clinical dentistry programs and services in the nondental school environment in the United States and Canada. The book and journal titles were selected by the membership of the Dental Section of the Medical Library Association (MLA). The Dental Section membership represents dental and other health sciences libraries and dental research institutions from the United States and Canada, as well as from other countries. The list was compiled and edited by the Ad Hoc Publications Committee of the Dental Section of MLA. The final list was reviewed and subsequently was approved for publication and distribution by the Dental Section of MLA during the section's 1996 annual meeting in Kansas City, Missouri. PMID:9285122
26 CFR 1.41-5A - Basic research for taxable years beginning before January 1, 1987.
Code of Federal Regulations, 2010 CFR
2010-04-01
... was for basic research performed in the United States). (2) Research in the social sciences or humanities. Basic research does not include research in the social sciences or humanities, within the meaning...
26 CFR 1.41-5A - Basic research for taxable years beginning before January 1, 1987.
Code of Federal Regulations, 2011 CFR
2011-04-01
... was for basic research performed in the United States). (2) Research in the social sciences or humanities. Basic research does not include research in the social sciences or humanities, within the meaning...
26 CFR 1.41-5A - Basic research for taxable years beginning before January 1, 1987.
Code of Federal Regulations, 2014 CFR
2014-04-01
... was for basic research performed in the United States). (2) Research in the social sciences or humanities. Basic research does not include research in the social sciences or humanities, within the meaning...
26 CFR 1.41-5A - Basic research for taxable years beginning before January 1, 1987.
Code of Federal Regulations, 2013 CFR
2013-04-01
... was for basic research performed in the United States). (2) Research in the social sciences or humanities. Basic research does not include research in the social sciences or humanities, within the meaning...
26 CFR 1.41-5A - Basic research for taxable years beginning before January 1, 1987.
Code of Federal Regulations, 2012 CFR
2012-04-01
... was for basic research performed in the United States). (2) Research in the social sciences or humanities. Basic research does not include research in the social sciences or humanities, within the meaning...
Activity Catalog Tool (ACT) user manual, version 2.0
NASA Technical Reports Server (NTRS)
Segal, Leon D.; Andre, Anthony D.
1994-01-01
This report comprises the user manual for version 2.0 of the Activity Catalog Tool (ACT) software program, developed by Leon D. Segal and Anthony D. Andre in cooperation with NASA Ames Aerospace Human Factors Research Division, FLR branch. ACT is a software tool for recording and analyzing sequences of activity over time that runs on the Macintosh platform. It was designed as an aid for professionals who are interested in observing and understanding human behavior in field settings, or from video or audio recordings of the same. Specifically, the program is aimed at two primary areas of interest: human-machine interactions and interactions between humans. The program provides a means by which an observer can record an observed sequence of events, logging such parameters as frequency and duration of particular events. The program goes further by providing the user with a quantified description of the observed sequence, through application of a basic set of statistical routines, and enables merging and appending of several files and more extensive analysis of the resultant data.
Tools for Atmospheric Radiative Transfer: Streamer and FluxNet. Revised
NASA Technical Reports Server (NTRS)
Key, Jeffrey R.; Schweiger, Axel J.
1998-01-01
Two tools for the solution of radiative transfer problems are presented. Streamer is a highly flexible medium spectral resolution radiative transfer model based on the plane-parallel theory of radiative transfer. Capable of computing either fluxes or radiances, it is suitable for studying radiative processes at the surface or within the atmosphere and for the development of remote-sensing algorithms. FluxNet is a fast neural network-based implementation of Streamer for computing surface fluxes. It allows for a sophisticated treatment of radiative processes in the analysis of large data sets and potential integration into geophysical models where computational efficiency is an issue. Documentation and tools for the development of alternative versions of FluxNet are available. Collectively, Streamer and FluxNet solve a wide variety of problems related to radiative transfer: Streamer provides the detail and sophistication needed to perform basic research on most aspects of complex radiative processes, while the efficiency and simplicity of FluxNet make it ideal for operational use.
A systematic writing program as a tool in the grief process: part 1.
Furnes, Bodil; Dysvik, Elin
2010-12-06
The basic aim of this paper is to suggest a flexible and individualized writing program as a tool for use during the grief process of bereaved adults. An open, qualitative approach following distinct steps was taken to gain a broad perspective on the grief and writing processes, as a platform for the writing program. Following several systematic methodological steps, we arrived at suggestions for the initiation of a writing program and its structure and substance, with appropriate guidelines. We believe that open and expressive writing, including free writing and focused writing, may have beneficial effects on a person experiencing grief. These writing forms may be undertaken and systematized through a writing program, with participation in a grief writing group and with diary writing, to achieve optimal results. A structured writing program might be helpful in promoting thought activities and as a tool to increase the coherence and understanding of individuals in the grief process. Our suggested program may also be a valuable guide to future program development and research.
Basic Skills, Basic Writing, Basic Research.
ERIC Educational Resources Information Center
Trimmer, Joseph F.
1987-01-01
Overviews basic writing instruction and research by briefly discussing the history of remediation, results of a survey of basic writing programs in U.S. colleges and universities, and interviews with developmental textbook editors at major publishing houses. Finds that basic writing instruction continues to focus on sentence grammar. (MM)
34 CFR 350.5 - What definitions apply?
Code of Federal Regulations, 2010 CFR
2010-07-01
... classified on a continuum from basic to applied: (1) Basic research is research in which the investigator is... immediate application or utility. (2) Applied research is research in which the investigator is primarily... rehabilitation problem or need. Applied research builds on selected findings from basic research. (Authority: Sec...
General Mission Analysis Tool (GMAT)
NASA Technical Reports Server (NTRS)
Hughes, Steven P. (Compiler)
2016-01-01
This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project.
"The Dilemma That Still Counts": Basic Writing at a Political Crossroads.
ERIC Educational Resources Information Center
Harrington, Susanmarie; Adler-Kassner, Linda
1998-01-01
Reviews definitions of basic writers and basic writing over the last 20 years. Argues that basic writers are not defined only in terms of institutional convenience. Offers future directions for basic writing research, suggesting that to learn more about basic writers, researchers must return to studies of error informed by basic writing's rich…
Casiraghi, Elena; Cossa, Mara; Huber, Veronica; Rivoltini, Licia; Tozzi, Matteo; Villa, Antonello; Vergani, Barbara
2017-11-02
In clinical practice, automatic image analysis methods that quickly quantify histological results in an objective and replicable way are becoming increasingly necessary and widespread. Although several commercial software products are available for this task, they offer little flexibility and are provided as black boxes without modifiable source code. To overcome these problems, we used the widely adopted MATLAB platform to develop an automatic method, MIAQuant, for the analysis of histochemical and immunohistochemical images stained with various methods and acquired with different tools. It automatically extracts and quantifies markers characterized by various colors and shapes; furthermore, it aligns contiguous tissue slices stained with different markers and overlaps them in differing colors for visual comparison of marker localization. Application of MIAQuant in clinical research fields, such as oncology and cardiovascular disease studies, has proven its efficacy, robustness, and flexibility across a variety of problems; we highlight that the flexibility of MIAQuant makes it an important tool for basic research, where needs are constantly changing. The MIAQuant software and its user manual are freely available for clinical studies, pathological research, and diagnosis.
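The core quantification step that tools like MIAQuant automate can be illustrated with a toy sketch. The colour thresholds and pixel data below are invented for illustration and are not MIAQuant's actual algorithm: each pixel is classified against a marker-specific RGB range and the positive fraction of the image is reported.

```python
# Toy marker-quantification sketch (illustrative thresholds, not MIAQuant's):
# classify each pixel by an inclusive per-channel colour range and report the
# marker-positive fraction of the image.
def marker_fraction(pixels, lo, hi):
    """pixels: list of (r, g, b) tuples; lo/hi: inclusive RGB bounds."""
    positive = sum(
        1 for p in pixels
        if all(lo[c] <= p[c] <= hi[c] for c in range(3))
    )
    return positive / len(pixels)

# Synthetic 'slide': half brown-ish marker-positive pixels, half white background
slide = [(150, 90, 40)] * 8 + [(240, 240, 240)] * 8
frac = marker_fraction(slide, lo=(100, 50, 0), hi=(200, 150, 100))
```

Real tools add colour-space conversion, morphological cleanup, and slice alignment on top of this basic masking step, but the output of interest is the same kind of positive-area fraction.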
Chimpanzees create and modify probe tools functionally: A study with zoo-housed chimpanzees
Hopper, Lydia M; Tennie, Claudio; Ross, Stephen R; Lonsdorf, Elizabeth V
2015-01-01
Chimpanzees (Pan troglodytes) use tools to probe for out-of-reach food, both in the wild and in captivity. Beyond gathering appropriately-sized materials to create tools, chimpanzees also perform secondary modifications in order to create an optimized tool. In this study, we recorded the behavior of a group of zoo-housed chimpanzees when presented with opportunities to use tools to probe for liquid foods in an artificial termite mound within their enclosure. Previous research with this group of chimpanzees has shown that they are proficient at gathering materials from within their environment in order to create tools to probe for the liquid food within the artificial mound. Extending beyond this basic question, we first asked whether they only made and modified probe tools when it was appropriate to do so (i.e. when the mound was baited with food). Second, by collecting continuous data on their behavior, we also asked whether the chimpanzees first (intentionally) modified their tools prior to probing for food or whether such modifications occurred after tool use, possibly as a by-product of chewing and eating the food from the tools. Following our predictions, we found that tool modification predicted tool use; the chimpanzees began using their tools within a short delay of creating and modifying them, and the chimpanzees performed more tool modifying behaviors when food was available than when they could not gain food through the use of probe tools. We also discuss our results in terms of the chimpanzees’ acquisition of the skills, and their flexibility of tool use and learning. Am. J. Primatol. 77:162–170, 2015. © 2014 The Authors. American Journal of Primatology Published by Wiley Periodicals Inc. PMID:25220050
Challenges and Opportunities in Interdisciplinary Materials Research Experiences for Undergraduates
NASA Astrophysics Data System (ADS)
Vohra, Yogesh; Nordlund, Thomas
2009-03-01
The University of Alabama at Birmingham (UAB) offers a broad range of interdisciplinary materials research experiences to undergraduate students with diverse backgrounds in physics, chemistry, applied mathematics, and engineering. The research projects offered cover a broad range of topics, including high-pressure physics, microelectronic materials, nano-materials, laser materials, bioceramics and biopolymers, cell-biomaterials interactions, planetary materials, and computer simulation of materials. The students welcome the opportunity to work with an interdisciplinary team of basic science, engineering, and biomedical faculty, but the challenge lies in learning the key vocabulary for interdisciplinary collaborations, mastering experimental tools, and working in an independent capacity. The career development workshops dealing with the graduate school application process and entrepreneurial business activities were found to be most effective. The interdisciplinary university-wide poster session helped students broaden their horizons in research careers. The synergy of the REU program with other concurrently running high school summer programs on the UAB campus will also be discussed.
Translational Research and Medicine at NASA: From Earth to Space and Back Again
NASA Technical Reports Server (NTRS)
Goodwin, Thomas J.; Cohrs, Randall; Crucian, Brian A.; Levine, Benjamin; Otto, Christian; Ploutz-Schneider, Lori; Shackelford, Linda C.
2014-01-01
The space environment poses many challenges to human physiology and therefore to extended habitation and exploration. Translational research and medical strategies are meeting these challenges by combining Earth-based medical solutions with innovative and developmental engineering approaches. Translational methodologies are currently applied to spaceflight-related dysregulations in the areas of: (1) cardiovascular fluid shifts, intracranial hypertension, and neuro-ocular impairment; (2) immune insufficiency and suppression/viral re-expression; (3) bone loss and fragility (osteopenia/osteoporosis) and muscle wasting; and (4) radiation sensitivity and advanced ageing. Over 40 years of research into these areas have met with limited success, owing to a lack of tools and of basic understanding of the central issues that cause physiologic maladaptation and disrupt homeostasis. I will discuss the effects of living in space (reduced gravity, increased radiation, and varying atmospheric conditions [EVA]) during long-duration, exploration-class missions, and how translational research has benefited not only space exploration but also Earth-based medicine. Modern tools such as telemedicine and advances in genomics, proteomics, and metabolomics (omics sciences) have helped address syndromes at the systemic level by enlisting a global approach to assessing spaceflight physiology, thereby permitting our experience in space to be translated to the Earth medical community.
Appendix W. Cost Analysis in Teacher Education Programs.
ERIC Educational Resources Information Center
Sell, G. Roger; And Others
This paper is an introduction to the basic cost-related tools available to management for planning, evaluating, and organizing resources for the purpose of achieving objectives within a teacher education preparation program. Three tools are presented in separate sections. Part I on the cost accounting tool for identifying, categorizing, and…
Problems in Choosing Tools and Methods for Teaching Programming
ERIC Educational Resources Information Center
Vitkute-Adžgauskiene, Davia; Vidžiunas, Antanas
2012-01-01
The paper analyses the problems in selecting and integrating tools for delivering basic programming knowledge at the university level. Discussion and analysis of teaching the programming disciplines, the main principles of study programme design, requirements for teaching tools, methods and corresponding languages is presented, based on literature…
Experimental Evaluation of the Tools of the Mind Preschool Curriculum
ERIC Educational Resources Information Center
Wilson, Sandra Jo; Farran, Dale C.
2012-01-01
The aim of the "Tools of the Mind" prekindergarten curriculum is to enhance children's executive function skills within an instructional context that promotes the basic academic and social skills that prepare them for kindergarten and beyond. To investigate the effectiveness of "Tools" in achieving this aim, the authors are…
Assessment Tools for Adult Education.
ERIC Educational Resources Information Center
Shefrin, Carol; Shafer, Dehra; Forlizzi, Lori
The Assessment Tools for Adult Education project was designed to provide training and support to staff of the Pennsylvania Bureau of Adult Basic and Literacy Education (ABLE) funded programs to help them use assessment tools and procedures to document the learning gains of the adult students they serve. The following candidate assessment…
Educating medical staff about responding to a radiological or nuclear emergency.
McCurley, M Carol; Miller, Charles W; Tucker, Florie E; Guinn, Amy; Donnelly, Elizabeth; Ansari, Armin; Holcombe, Maire; Nemhauser, Jeffrey B; Whitcomb, Robert C
2009-05-01
A growing body of audience research reveals that hospital medical personnel are unprepared for a large-scale radiological emergency, such as a terrorist event involving radioactive or nuclear materials. Medical personnel in hospitals also lack a basic understanding of radiation principles, as well as of diagnostic and treatment guidelines for radiation exposure. Clinicians have indicated that they lack sufficient training in radiological emergency preparedness, that they are potentially unwilling to treat patients perceived to be radiologically contaminated, and that they have major concerns about public panic and the overloading of clinical systems. In response to these findings, the Centers for Disease Control and Prevention (CDC) has developed a tool kit for hospital medical personnel who may be called on to respond to unintentional or intentional mass-casualty radiological and nuclear events. This tool kit includes clinician fact sheets, a clinician pocket guide, a digital video disc (DVD) of just-in-time basic skills training, a CD-ROM training course on mass-casualty management, and a satellite broadcast on the medical management of radiological events. CDC training information emphasizes the key role that medical health physicists can play in the education and support of emergency department activities following a radiological or nuclear mass-casualty event.
Development of the updated system of city underground pipelines based on Visual Studio
NASA Astrophysics Data System (ADS)
Zhang, Jianxiong; Zhu, Yun; Li, Xiangdong
2009-10-01
Our city operates an integrated pipeline-network management system built on ArcGIS Engine 9.1 as the underlying development platform, with Oracle9i as the basic database for data storage. In this system, ArcGIS SDE 9.1 serves as the spatial data engine, and the system is a comprehensive management application developed with the Visual Studio visual development tools. Because the pipeline-update function of the original system suffered from slow updates and occasional data loss, and to ensure that the underground pipeline data can be updated conveniently and frequently in real time while remaining current and complete, we developed a new update module for the system. The module provides powerful data-update functions, including data input and output and rapid bulk updates. The new module was developed with the Visual Studio visual development tools and uses Access as the basic database for data storage. Graphics can be edited in the AutoCAD software, and the database is updated through the link between the graphics and the system. Practice shows that the update module is well compatible with the original system and that database updates are reliable and efficient.
A LabVIEW model incorporating an open-loop arterial impedance and a closed-loop circulatory system.
Cole, R T; Lucas, C L; Cascio, W E; Johnson, T A
2005-11-01
While numerous computer models exist for the circulatory system, many are limited in scope, contain unwanted features or incorporate complex components specific to unique experimental situations. Our purpose was to develop a basic, yet multifaceted, computer model of the left heart and systemic circulation in LabVIEW having universal appeal without sacrificing crucial physiologic features. The program we developed employs Windkessel-type impedance models in several open-loop configurations and a closed-loop model coupling a lumped impedance and ventricular pressure source. The open-loop impedance models demonstrate afterload effects on arbitrary aortic pressure/flow inputs. The closed-loop model catalogs the major circulatory waveforms with changes in afterload, preload, and left heart properties. Our model provides an avenue for expanding the use of the ventricular equations through closed-loop coupling that includes a basic coronary circuit. Tested values used for the afterload components and the effects of afterload parameter changes on various waveforms are consistent with published data. We conclude that this model offers the ability to alter several circulatory factors and digitally catalog the most salient features of the pressure/flow waveforms employing a user-friendly platform. These features make the model a useful instructional tool for students as well as a simple experimental tool for cardiovascular research.
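The Windkessel-type impedance models mentioned above reduce, in their simplest two-element form, to C dP/dt = Q(t) - P/R. The following Python sketch is illustrative only (the parameter values and the half-sine ejection waveform are invented, not those used in the paper): it Euler-integrates arterial pressure over one cardiac cycle.

```python
import math

# Two-element Windkessel sketch: C * dP/dt = Q(t) - P / R.
# Parameter values below are illustrative, not the paper's.
def windkessel_2e(flow, r_peripheral, compliance, p0, dt):
    """Euler-integrate arterial pressure for a given inflow waveform."""
    p = p0
    pressures = []
    for q in flow:
        p += dt * (q - p / r_peripheral) / compliance
        pressures.append(p)
    return pressures

# One 1 s beat: half-sine systolic ejection (0.3 s), then zero diastolic inflow
dt = 0.001  # s
beat = [100.0 * math.sin(math.pi * t * dt / 0.3) if t * dt < 0.3 else 0.0
        for t in range(1000)]  # mL/s
p = windkessel_2e(beat, r_peripheral=1.0, compliance=1.5, p0=80.0, dt=dt)
```

During diastole the inflow is zero, so pressure decays exponentially with time constant RC, which is the characteristic Windkessel behavior the closed-loop LabVIEW model catalogs under changing afterload.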
A Cognitive Approach to the Education of Retarded Children
ERIC Educational Resources Information Center
Haywood, H. Carl
1977-01-01
Moderately mentally retarded children can acquire the necessary basic mental operations through a proper progression of mediated learning experiences; once the basic mental operations have been acquired, complex learning can occur because the necessary cognitive tools are present. (JD)
Focus on Basics: Connecting Research & Practice. Volume 7, Issue D
ERIC Educational Resources Information Center
Garner, Barbara, Ed.
2005-01-01
"Focus on Basics" is the quarterly publication of the National Center for the Study of Adult Learning and Literacy. It presents best practices, current research on adult learning and literacy, and how research is used by adult basic education teachers, counselors, program administrators, and policymakers. "Focus on Basics" is…
Focus on Basics: Connecting Research & Practice. Volume 8, Issue B
ERIC Educational Resources Information Center
Garner, Barbara, Ed.
2006-01-01
"Focus on Basics" is the quarterly publication of the National Center for the Study of Adult Learning and Literacy. It presents best practices, current research on adult learning and literacy, and how research is used by adult basic education teachers, counselors, program administrators, and policymakers. "Focus on Basics" is…
Focus on Basics: Connecting Research & Practice. Volume 6, Issue A
ERIC Educational Resources Information Center
Garner, Barbara, Ed.
2002-01-01
"Focus on Basics" is the quarterly publication of the National Center for the Study of Adult Learning and Literacy. It presents best practices, current research on adult learning and literacy, and how research is used by adult basic education teachers, counselors, program administrators, and policymakers. "Focus on Basics" is…
Focus on Basics: Connecting Research & Practice. Volume 9, Issue B
ERIC Educational Resources Information Center
Garner, Barbara, Ed.
2008-01-01
"Focus on Basics" is a publication of the U.S. Division of World Education, Inc. It presents best practices, current research on adult learning and literacy, and how research is used by adult basic education teachers, counselors, program administrators, and policymakers. "Focus on Basics" is dedicated to connecting research…
Basic science conferences in residency training: a national survey.
Cruz, P D; Charley, M R; Bergstresser, P R
1987-02-01
Basic science teaching is an important component of dermatology residency training, and the basic science conference is the major tool utilized by departments of dermatology for its implementation. To characterize the role of basic science conferences in dermatology training, a national survey of chief residents was conducted. Although the survey confirmed that a high value is placed on basic science conferences, a surprising finding was a significant level of dissatisfaction among chief residents, particularly those from university-based programs. Results of the survey have been used to redefine our own objectives in basic science teaching and to propose elements of methodology and curriculum.
Functional phosphoproteomic mass spectrometry-based approaches
2012-01-01
Mass Spectrometry (MS)-based phosphoproteomics tools are crucial for understanding the structure and dynamics of signaling networks. Approaches such as affinity purification followed by MS have also been used to elucidate relevant biological questions in health and disease. Studying proteomes and phosphoproteomes as linked systems, rather than studying individual proteins, is necessary to understand the functions of phosphorylated and un-phosphorylated proteins under spatial and temporal conditions. Phosphoproteome studies also facilitate the identification of drug target proteins, which may be clinically useful in the near future. Here, we provide an overview of the general principles of signaling pathways versus phosphorylation. We also detail chemical phosphoproteomic tools, including their pros and cons, with examples of where these methods have been applied. In addition, the basic principles of electrospray ionization and collision-induced dissociation fragmentation are explained in a simple manner to support successful clinical phosphoproteomic studies. PMID:23369623
An introduction to Item Response Theory and Rasch Analysis of the Eating Assessment Tool (EAT-10).
Kean, Jacob; Brodke, Darrel S; Biber, Joshua; Gross, Paul
2018-03-01
Item response theory has its origins in educational measurement and is now commonly applied in health-related measurement of latent traits, such as function and symptoms. This application is due in large part to gains in the precision of measurement attributable to item response theory and corresponding decreases in response burden, study costs, and study duration. The purpose of this paper is twofold: introduce basic concepts of item response theory and demonstrate this analytic approach in a worked example, a Rasch model (1PL) analysis of the Eating Assessment Tool (EAT-10), a commonly used measure for oropharyngeal dysphagia. The results of the analysis were largely concordant with previous studies of the EAT-10 and illustrate for brain impairment clinicians and researchers how IRT analysis can yield greater precision of measurement.
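The Rasch (1PL) model underlying the worked example expresses the probability of endorsing an item as a logistic function of the difference between person ability and item difficulty. A minimal Python sketch (the item difficulties below are invented for illustration, not estimates from the EAT-10 analysis):

```python
import math

# Rasch (1PL) model: probability that a person with ability theta endorses
# an item of difficulty b. Both parameters live on the same logit scale.
def rasch_prob(theta, b):
    """P(endorse) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# For a person of average ability (theta = 0), an easier item (b = -1)
# is endorsed more often than a harder one (b = +1).
p_easy = rasch_prob(theta=0.0, b=-1.0)
p_hard = rasch_prob(theta=0.0, b=1.0)
```

Because the 1PL model constrains all items to a common discrimination, fitted difficulties order the items on a single interval scale, which is the source of the measurement-precision gains the paper describes.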
Fundamentals of flow cytometry.
Jaroszeski, M J; Radcliff, G
1999-02-01
Flow cytometers are instruments that are used primarily to measure the physical and biochemical characteristics of biological particles. This technology is used to perform measurements on whole cells as well as prepared cellular constituents, such as nuclei and organelles. Flow cytometers are investigative tools for a broad range of scientific disciplines because they make measurements on thousands of individual cells/particles in a matter of seconds. This is a unique advantage relative to other detection instruments that provide bulk particle measurements. Flow cytometry is a complex and highly technical field; therefore, a basic understanding of the technology is essential for all users. The purpose of this article is to provide fundamental information about the instrumentation used for flow cytometry as well as the methods used to analyze and interpret data. This information will provide a foundation to use flow cytometry effectively as a research tool.
The General Mission Analysis Tool (GMAT): Current Features And Adding Custom Functionality
NASA Technical Reports Server (NTRS)
Conway, Darrel J.; Hughes, Steven P.
2010-01-01
The General Mission Analysis Tool (GMAT) is a software system for trajectory optimization, mission analysis, trajectory estimation, and prediction developed by NASA, the Air Force Research Lab, and private industry. GMAT's design and implementation are based on four basic principles: open source visibility for both the source code and design documentation; platform independence; modular design; and user extensibility. The system, released under the NASA Open Source Agreement, runs on Windows, Mac and Linux. User extensions, loaded at run time, have been built for optimization, trajectory visualization, force model extension, and estimation, by parties outside of GMAT's development group. The system has been used to optimize maneuvers for the Lunar Crater Observation and Sensing Satellite (LCROSS) and ARTEMIS missions and is being used for formation design and analysis for the Magnetospheric Multiscale Mission (MMS).
Grants4Targets - an innovative approach to translate ideas from basic research into novel drugs.
Lessl, Monika; Schoepe, Stefanie; Sommer, Anette; Schneider, Martin; Asadullah, Khusru
2011-04-01
Collaborations between industry and academia are steadily gaining importance. To combine expertise, Bayer Healthcare has set up a novel open-innovation approach called Grants4Targets. Ideas on novel drug targets can easily be submitted to http://www.grants4targets.com. After a review process, grants are provided to perform focused experiments to further validate the proposed targets. In addition to financial support, specific know-how on target validation and drug discovery is provided. Experienced scientists are nominated as project partners and, depending on the project, tools or specific models are provided. Around 280 applications have been received and 41 projects granted. In our experience, this type of bridging fund, combined with joint efforts, provides a valuable tool for fostering drug discovery collaborations. Copyright © 2010 Elsevier Ltd. All rights reserved.
Installation/Removal Tool for Screw-Mounted Components
NASA Technical Reports Server (NTRS)
Ash, J. P.
1984-01-01
Tweezerlike tool simplifies installation of screws in places reached only through narrow openings. With changes in size and shape, basic tool concept applicable to mounting and dismounting of transformers, sockets, terminal strips and mechanical parts. Inexpensive tool fabricated as needed by bending two pieces of steel wire. Exact size and shape selected to suit part manipulated and nature of inaccessible mounting space.
de-Marcos, Luis; García-López, Eva; García-Cabot, Antonio
2017-04-01
This paper reports data on the learning performance of students using four different motivational tools: an educational game, a gamified plugin, a social networking website, and a gamified social networking website, along with a control group. The data pertain to 379 students of an undergraduate course covering basic Information and Communication Technology (ICT) skills in Spain, and correspond to different learning modules of the European Computer Driving License (ECDL) initiative. The data include four pre-test scores, four post-test scores, and a final examination, and were gathered using a quasi-experimental research design during 2014. The data reported here refer to the research paper de-Marcos et al. (2016) [1].
Exploring actinide materials through synchrotron radiation techniques.
Shi, Wei-Qun; Yuan, Li-Yong; Wang, Cong-Zhi; Wang, Lin; Mei, Lei; Xiao, Cheng-Liang; Zhang, Li; Li, Zi-Jie; Zhao, Yu-Liang; Chai, Zhi-Fang
2014-12-10
Synchrotron radiation (SR) based techniques have been utilized with increasing frequency over the past decade to explore the rich and challenging science of actinide-based materials. This trend is driven partly by the basic need for multi-scale actinide speciation and bonding information and partly by the practical needs of nuclear energy research. In this review, recent research progress on actinide-related materials achieved by means of various SR techniques is selectively highlighted and summarized, with emphasis on X-ray absorption spectroscopy and X-ray diffraction and scattering, which are powerful tools for characterizing actinide materials. In addition, advanced SR techniques for exploring future advanced nuclear fuel cycles involving actinides are illustrated as well. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Correia, Walter; Rodrigues, Laís; Campos, Fábio; Soares, Marcelo; Barros, Marina
2012-01-01
To examine the relationship between design and children's emotional development, this article first offers an initial approach to the definition and historical aspects of emotion in product development, citing the main authors on the issue. Based on field research conducted with children from 2 to 6 years of age, it also describes Piaget's basic ideas on child psychology and the pre-operational stage (the age group studied) and the significance of children's toys from the perspective of Vygotsky. Using this theoretical framework and the results of the field research, we can infer some of the emotional design principles advocated by proponents of positive affect in humans and their relationship to children's development and toy choices.
Storytelling: A Qualitative Tool to Promote Health Among Vulnerable Populations.
Palacios, Janelle F; Salem, Benissa; Hodge, Felicia Schanche; Albarrán, Cyndi R; Anaebere, Ann; Hayes-Bautista, Teodocia Maria
2015-09-01
Storytelling is a basic cultural phenomenon that has recently been recognized as a valuable method for collecting research data and developing multidisciplinary interventions. The purpose of this article is to present a collection of nursing scholarship wherein the concept of storytelling, underpinned by cultural phenomena, is explored for data collection and intervention. A conceptual analysis of storytelling reveals key variables. Following a brief review of current research focused on storytelling used within health care, three case studies among three vulnerable populations (American Indian teen mothers, American Indian cancer survivors, and African American women at risk for HIV/AIDS) demonstrate the uses of storytelling for data collection and intervention. Implications for transcultural nursing regarding storytelling are discussed. © The Author(s) 2014.
Expanding the scope of site-specific recombinases for genetic and metabolic engineering.
Gaj, Thomas; Sirk, Shannon J; Barbas, Carlos F
2014-01-01
Site-specific recombinases are tremendously valuable tools for basic research and genetic engineering. By promoting high-fidelity DNA modifications, site-specific recombination systems have empowered researchers with unprecedented control over diverse biological functions, enabling countless insights into cellular structure and function. The rigid target specificities of many site-specific recombinases, however, have limited their adoption in fields that require highly flexible recognition abilities. As a result, intense effort has been directed toward altering the properties of site-specific recombination systems by protein engineering. Here, we review key developments in the rational design and directed molecular evolution of site-specific recombinases, highlighting the numerous applications of these enzymes across diverse fields of study. © 2013 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Fink, Pamela K.
1991-01-01
Two intelligent tutoring systems were developed. These tutoring systems are being used to study the effectiveness of intelligent tutoring systems in training high performance tasks and the interrelationship of high performance and cognitive tasks. The two tutoring systems, referred to as the Console Operations Tutors, were built using the same basic approach to the design of an intelligent tutoring system. This design approach allowed researchers to more rapidly implement the cognitively based tutor, the OMS Leak Detect Tutor, by using the foundation of code generated in the development of the high performance based tutor, the Manual Select Keyboard (MSK). It is believed that the approach can be further generalized to develop a generic intelligent tutoring system implementation tool.
Interpretation of statistical results.
García Garmendia, J L; Maroto Monserrat, F
2018-02-21
The appropriate interpretation of statistical results is crucial to understanding advances in medical science. Statistical tools allow us to transform the uncertainty and apparent chaos of nature into measurable parameters applicable to our clinical practice. Understanding the meaning and actual scope of these instruments is essential for researchers, for the funders of research, and for professionals who require continuous updating based on good evidence and support for decision making. Various aspects of study designs, results, and statistical analysis are reviewed, aiming to facilitate their comprehension from the basics through concepts that are common yet often poorly understood, and offering a constructive, non-exhaustive but realistic view. Copyright © 2018 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.
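As an illustration of turning uncertainty into a measurable parameter, a 95% confidence interval for a sample mean can be sketched as follows. The systolic readings are hypothetical, and the normal approximation stands in for the t-distribution that a small sample would strictly call for:

```python
from statistics import NormalDist, mean, stdev
from math import sqrt

def ci95(sample):
    """Approximate 95% confidence interval for the mean.

    Uses the normal approximation; for n this small a t-quantile
    would widen the interval slightly.
    """
    m = mean(sample)
    se = stdev(sample) / sqrt(len(sample))  # standard error of the mean
    z = NormalDist().inv_cdf(0.975)         # two-sided 95% quantile, ~1.96
    return (m - z * se, m + z * se)

# Hypothetical systolic blood pressure readings (mmHg)
systolic = [128, 131, 125, 140, 133, 129, 137, 126, 135, 130]
lo, hi = ci95(systolic)
print(f"mean = {mean(systolic):.1f}, 95% CI = ({lo:.1f}, {hi:.1f})")
```

The interval, not the point estimate alone, is the "measurable parameter" the abstract refers to: it communicates both the estimate and its uncertainty.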
NASA Astrophysics Data System (ADS)
Schulz, Rachel Corinne
This study investigated the intended teacher use of a technology-enhanced learning tool, the Web-based Inquiry Science Environment (WISE), and the first experiences of teachers new to it and untrained in its use. The purpose of the study was to learn more about the factors embedded in the design of the technology that enabled or hindered its intended use. The qualitative research design applied grounded theory methods. Using theoretical sampling and constant comparative analysis, a document review of the WISE website led to a model of intended teacher use. The experiences of four middle school science teachers as they enacted WISE for the first time were investigated through ethnographic field observations, surveys, and interviews, using thematic analysis to construct narratives of each teacher's use. These narratives were compared to the model of intended teacher use of WISE. This study found two levels of intended teacher use of WISE. Basic intended use involved having students run a project to completion while the teacher provides feedback and assesses student learning. A more optimal intended use involved supplementing the core curriculum with WISE, enhancing the core scope and sequence of instruction, and aligning assessment with the goals of instruction through WISE. Moreover, WISE projects were optimally intended to be facilitated through student-centered teaching practices and inquiry-based instruction in a collaborative learning environment, and to be shared with colleagues for feedback and iterative development toward improving students' Knowledge Integration. Of the four teachers who participated in this study, only one demonstrated the use of WISE as intended in the most basic way. This teacher also demonstrated the use of WISE in a number of optimal ways.
Teacher confusion with certain tools available within WISE suggests that there may be a way to develop the user experience around these touch points and help teachers learn to use the technology as they select and set up a project run. Further research may examine whether improving these touch points improves teachers' use of WISE as intended, both basically and optimally, and whether basic and optimal uses of WISE directly impact student learning outcomes.
Yamazaki, Yuka; Uka, Takanori; Shimizu, Haruhiko; Miyahira, Akira; Sakai, Tatsuo; Marui, Eiji
2013-02-01
The number of physicians engaged in basic sciences and teaching is sharply decreasing in Japan. To alleviate this shortage, the central government has increased the quota of medical students entering the field. This study investigated medical students' interest in the basic sciences in an effort to recruit talent. A questionnaire distributed to 501 medical students in years 2 to 6 of Juntendo University School of Medicine inquired about sex, grade, interest in the basic sciences, interest in research, career path as a basic science physician, faculty efforts to encourage students to conduct research, and increases in the number of lectures and practical training sessions on research. Associations between interest in the basic sciences and other variables were examined using χ² tests. Of the 269 medical students (171 female) who returned the questionnaire (response rate 53.7%), 24.5% were interested in the basic sciences, and half of them considered the basic sciences as their future career. Obstacles to this career were their original aim of becoming a clinician and concerns about salary. Medical students likely to be interested in the basic sciences were fifth- and sixth-year students, were interested in research, considered the basic sciences as their future career, considered that faculty were making efforts to encourage medical students to conduct research, and wanted more research-related lectures. Improving physicians' salaries in the basic sciences is important for securing talent. Moreover, offering continuous opportunities for medical students to experience research, and encouraging advanced-year students during and after bedside learning to engage in the basic sciences, are important for recruiting talent.
Nucleic acids-based tools for ballast water surveillance, monitoring, and research
NASA Astrophysics Data System (ADS)
Darling, John A.; Frederick, Raymond M.
2018-03-01
Understanding the risks of biological invasion posed by ballast water, whether in the context of compliance testing, routine monitoring, or basic research, is fundamentally an exercise in biodiversity assessment, and as such should take advantage of the best tools available for tackling that problem. The past several decades have seen growing application of genetic methods for the study of biodiversity, driven in large part by dramatic technological advances in nucleic acids analysis. Monitoring approaches based on such methods have the potential to dramatically increase sampling throughput for biodiversity assessments, and to improve on the sensitivity, specificity, and taxonomic accuracy of traditional approaches. The application of targeted detection tools (largely focused on PCR but increasingly incorporating novel probe-based methodologies) has led to a paradigm shift in rare species monitoring, and such tools have already been applied for early detection in the context of ballast water surveillance. Rapid improvements in community profiling approaches based on high throughput sequencing (HTS) could similarly impact broader efforts to catalogue the biodiversity present in ballast tanks, and could provide novel opportunities to better understand the risks of biotic exchange posed by ballast water transport, and the effectiveness of attempts to mitigate those risks. These various approaches still face considerable challenges to effective implementation, depending on particular management or research needs. Compliance testing, for instance, remains dependent on accurate quantification of viable target organisms; while tools based on RNA detection show promise in this context, the demands of such testing require considerable additional investment in methods development.
In general surveillance and research contexts, both targeted and community-based approaches are still limited by various factors: quantification remains a challenge (especially for taxa in larger size classes), gaps in nucleic acids reference databases are still considerable, uncertainties in taxonomic assignment methods persist, and many applications have not yet matured sufficiently to offer standardized methods capable of meeting rigorous quality assurance standards. Nevertheless, the potential value of these tools, their growing utilization in biodiversity monitoring, and the rapid methodological advances over the past decade all suggest that they should be seriously considered for inclusion in the ballast water surveillance toolkit.
Hey, Spencer Phillips; Heilig, Charles M; Weijer, Charles
2013-05-30
Maximizing efficiency in drug development is important for drug developers, policymakers, and human subjects. Limited funds and the ethical imperative of risk minimization demand that researchers maximize the knowledge gained per patient-subject enrolled. Yet, despite a common perception that the current system of drug development is beset by inefficiencies, there remain few approaches for systematically representing, analyzing, and communicating the efficiency and coordination of the research enterprise. In this paper, we present the first steps toward developing such an approach: a graph-theoretic tool for representing the Accumulating Evidence and Research Organization (AERO) across a translational trajectory. This initial version of the AERO model focuses on elucidating two dimensions of robustness: (1) the consistency of results among studies with an identical or similar outcome metric; and (2) the concordance of results among studies with qualitatively different outcome metrics. The visual structure of the model is a directed acyclic graph, designed to capture these two dimensions of robustness and their relationship to three basic questions that underlie the planning of a translational research program: What is the accumulating state of total evidence? What has been the translational trajectory? What studies should be done next? We demonstrate the utility of the AERO model with an application to a case study involving the antibacterial agent, moxifloxacin, for the treatment of drug-susceptible tuberculosis. We then consider some possible elaborations for the AERO model and propose a number of ways in which the tool could be used to enhance the planning, reporting, and analysis of clinical trials. The AERO model provides an immediate visual representation of the number of studies done at any stage of research, depicting both the robustness of evidence and the relationship of each study to the larger translational trajectory. 
In so doing, it makes some of the invisible or inchoate properties of the research system explicit - helping to elucidate judgments about the accumulating state of evidence and supporting decision-making for future research.
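The abstract describes the AERO model as a directed acyclic graph over studies, with one robustness dimension read off from the consistency of results among studies sharing an outcome metric. A minimal sketch of such an evidence graph follows; the study names, metrics, and results are hypothetical illustrations, not the authors' actual tool or data:

```python
# Illustrative evidence graph in the spirit of the AERO model:
# nodes are studies, directed edges run from an earlier study to a
# later one that builds on it (yielding an acyclic trajectory).
from collections import defaultdict

class EvidenceGraph:
    def __init__(self):
        self.nodes = {}                 # study name -> metric and result
        self.edges = defaultdict(list)  # earlier study -> later studies

    def add_study(self, name, outcome_metric, result):
        self.nodes[name] = {"metric": outcome_metric, "result": result}

    def add_link(self, earlier, later):
        self.edges[earlier].append(later)

    def consistency(self, metric):
        """Robustness dimension 1: do all studies sharing a metric agree?"""
        results = {s["result"] for s in self.nodes.values()
                   if s["metric"] == metric}
        return len(results) == 1

g = EvidenceGraph()
g.add_study("phase2a", "culture conversion", "positive")
g.add_study("phase2b", "culture conversion", "positive")
g.add_study("phase3", "relapse-free cure", "positive")
g.add_link("phase2a", "phase2b")
g.add_link("phase2b", "phase3")
print(g.consistency("culture conversion"))  # True: same-metric results agree
```

Concordance, the second robustness dimension, would then compare results across the qualitatively different metrics ("culture conversion" versus "relapse-free cure") along the graph's edges.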
Human Disease Insight: An integrated knowledge-based platform for disease-gene-drug information.
Tasleem, Munazzah; Ishrat, Romana; Islam, Asimul; Ahmad, Faizan; Hassan, Md Imtaiyaz
2016-01-01
The scope of the Human Disease Insight (HDI) database is not limited to researchers or physicians as it also provides basic information to non-professionals and creates disease awareness, thereby reducing the chances of patient suffering due to ignorance. HDI is a knowledge-based resource providing information on human diseases to both scientists and the general public. Here, our mission is to provide a comprehensive human disease database containing most of the available useful information, with extensive cross-referencing. HDI is a knowledge management system that acts as a central hub to access information about human diseases and associated drugs and genes. In addition, HDI contains well-classified bioinformatics tools with helpful descriptions. These integrated bioinformatics tools enable researchers to annotate disease-specific genes and perform protein analysis, search for biomarkers and identify potential vaccine candidates. Eventually, these tools will facilitate the analysis of disease-associated data. The HDI provides two types of search capabilities and includes provisions for downloading, uploading and searching disease/gene/drug-related information. The logistical design of the HDI allows for regular updating. The database is designed to work best with Mozilla Firefox and Google Chrome and is freely accessible at http://humandiseaseinsight.com. Copyright © 2015 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberto, J.; Diaz de la Rubia, T.; Gibala, R.
2006-10-01
The global utilization of nuclear energy has come a long way from its humble beginnings in the first sustained nuclear reaction at the University of Chicago in 1942. Today, there are over 440 nuclear reactors in 31 countries producing approximately 16% of the electrical energy used worldwide. In the United States, 104 nuclear reactors currently provide 19% of electrical energy used nationally. The International Atomic Energy Agency projects significant growth in the utilization of nuclear power over the next several decades due to increasing demand for energy and environmental concerns related to emissions from fossil plants. There are 28 new nuclear plants currently under construction, including 10 in China, 8 in India, and 4 in Russia. In the United States, there have been notifications to the Nuclear Regulatory Commission of intentions to apply for combined construction and operating licenses for 27 new units over the next decade. The projected growth in nuclear power has focused increasing attention on issues related to the permanent disposal of nuclear waste, the proliferation of nuclear weapons technologies and materials, and the sustainability of a once-through nuclear fuel cycle. In addition, the effective utilization of nuclear power will require continued improvements in nuclear technology, particularly related to safety and efficiency. In all of these areas, the performance of materials and chemical processes under extreme conditions is a limiting factor. The related basic research challenges represent some of the most demanding tests of our fundamental understanding of materials science and chemistry, and they provide significant opportunities for advancing basic science with broad impacts for nuclear reactor materials, fuels, waste forms, and separations techniques. Of particular importance is the role that new nanoscale characterization and computational tools can play in addressing these challenges.
These tools, which include DOE synchrotron X-ray sources, neutron sources, nanoscale science research centers, and supercomputers, offer the opportunity to transform and accelerate the fundamental materials and chemical sciences that underpin technology development for advanced nuclear energy systems. The fundamental challenge is to understand and control chemical and physical phenomena in multi-component systems from femtoseconds to millennia, at temperatures to 1000 °C, and for radiation doses to hundreds of displacements per atom (dpa). This is a scientific challenge of enormous proportions, with broad implications in the materials science and chemistry of complex systems. New understanding is required for microstructural evolution and phase stability under relevant chemical and physical conditions, chemistry and structural evolution at interfaces, chemical behavior of actinide and fission-product solutions, and nuclear and thermomechanical phenomena in fuels and waste forms. First-principles approaches are needed to describe f-electron systems, design molecules for separations, and explain materials failure mechanisms. Nanoscale synthesis and characterization methods are needed to understand and design materials and interfaces with radiation, temperature, and corrosion resistance. Dynamical measurements are required to understand fundamental physical and chemical phenomena. New multiscale approaches are needed to integrate this knowledge into accurate models of relevant phenomena and complex systems across multiple length and time scales.
Neurological and developmental approaches to poor pitch perception and production
Loui, Psyche; Demorest, Steven M.; Pfordresher, Peter Q.; Iyer, Janani
2014-01-01
Whereas much of research in music and neuroscience is aimed at understanding the mechanisms by which the human brain facilitates music, emerging interest in the neuromusic community aims to translate basic music research into clinical and educational applications. In the present workshop, we explore the problems of poor pitch perception and production from both neurological and developmental/educational perspectives. We begin by reviewing previous and novel findings on the neural regulation of pitch perception and production. We then discuss issues in measuring singing accuracy consistently between the laboratory and educational settings. We review the Seattle Singing Accuracy Protocol—a new assessment tool that we hope can be adopted by cognitive psychologists as well as music educators—and we conclude with some suggestions that the present interdisciplinary approach might offer for future research. PMID:25773643
Iserbyt, Peter; Byra, Mark
2013-11-01
Research investigating the design effects of instructional tools for learning Basic Life Support (BLS) is almost non-existent. The aim was to demonstrate that the design of instructional tools matters. The effect of spatial contiguity, a design principle stating that people learn more deeply when words and corresponding pictures are placed close to (i.e., integrated with) rather than far from each other on a page, was investigated using task cards for learning Cardiopulmonary Resuscitation (CPR) during reciprocal peer learning. A randomized controlled trial was conducted. A total of 111 students (mean age: 13 years) constituting six intact classes learned BLS through reciprocal learning with task cards. Task cards combine a picture of the skill with written instructions on how to perform it. In each class, students were randomly assigned to the experimental group or the control group. In the control group, written instructions were placed under the picture on the task cards. In the experimental group, written instructions were placed close to the corresponding part of the picture, reflecting application of the spatial contiguity principle. One-way analysis of variance found significantly better performance in the experimental group for ventilation volumes (P=.03, ηp²=.10) and flow rates (P=.02, ηp²=.10). For chest compression depth, compression frequency, compressions with correct hand placement, and duty cycles, no significant differences were found. This study shows that the design of instructional tools (i.e., task cards) affects student learning. Research-based design of learning tools can enhance BLS and CPR education. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
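The reported analysis is a one-way ANOVA, with partial eta squared (ηp²) as the effect size. A minimal sketch of how the F statistic and ηp² are computed for two groups follows; the ventilation-volume numbers are hypothetical stand-ins, not the study's measurements:

```python
# One-way ANOVA from first principles: partition total variability into
# between-group and within-group sums of squares.
from statistics import mean

def one_way_anova(groups):
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = sum(len(g) for g in groups) - len(groups)
    f = (ss_between / df_between) / (ss_within / df_within)
    eta_p2 = ss_between / (ss_between + ss_within)  # partial eta squared
    return f, eta_p2

# Hypothetical ventilation volumes (mL) per group
control = [420, 450, 400, 480, 430]
experimental = [510, 540, 495, 530, 520]
f_stat, eta = one_way_anova([control, experimental])
print(f"F = {f_stat:.2f}, partial eta^2 = {eta:.2f}")  # F = 27.89, partial eta^2 = 0.78
```

With only two groups this reduces to a t-test (F = t²), but the same function extends unchanged to three or more conditions.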
Focus on Basics: Connecting Research & Practice. Volume 8, Issue D
ERIC Educational Resources Information Center
Garner, Barbara, Ed.
2007-01-01
Learning disabilities is the theme of the latest issue of "Focus on Basics," the World Education publication that brings together research, policy, and practice in adult basic education. Starting with an update on research on neurobiology and dyslexia, this issue also examines how the adult basic education system supports students with…
Focus on Basics: Connecting Research & Practice. Volume 6, Issue B
ERIC Educational Resources Information Center
Garner, Barbara, Ed.
2003-01-01
"Focus on Basics" is the quarterly publication of the National Center for the Study of Adult Learning and Literacy. It presents best practices, current research on adult learning and literacy, and how research is used by adult basic education teachers, counselors, program administrators, and policymakers. "Focus on Basics" is dedicated to…
Focus on Basics: Connecting Research and Practice. Volume 6, Issue D
ERIC Educational Resources Information Center
National Center for the Study of Adult Learning and Literacy (NCSALL), Harvard University, 2004
2004-01-01
"Focus on Basics" is the quarterly publication of the National Center for the Study of Adult Learning and Literacy. It presents best practices, current research on adult learning and literacy, and how research is used by adult basic education teachers, counselors, program administrators, and policymakers. "Focus on Basics" is dedicated to…
Research to knowledge: promoting the training of physician-scientists in the biology of pregnancy.
Sadovsky, Yoel; Caughey, Aaron B; DiVito, Michelle; D'Alton, Mary E; Murtha, Amy P
2018-01-01
Common disorders of pregnancy, such as preeclampsia, preterm birth, and fetal growth abnormalities, continue to challenge perinatal biologists seeking insights into disease pathogenesis that will result in better diagnosis, therapy, and disease prevention. These challenges have recently been intensified with discoveries that associate gestational diseases with long-term maternal and neonatal outcomes. Whereas modern high-throughput investigative tools enable scientists and clinicians to noninvasively probe the maternal-fetal genome, epigenome, and other analytes, their implications for clinical medicine remain uncertain. Bridging these knowledge gaps depends on strengthening the existing pool of scientists with expertise in basic, translational, and clinical tools to address pertinent questions in the biology of pregnancy. Although PhD researchers are critical in this quest, physician-scientists would facilitate the inquiry by bringing together clinical challenges and investigative tools, promoting a culture of intellectual curiosity among clinical providers, and helping transform discoveries into relevant knowledge and clinical solutions. Uncertainties related to future administration of health care, federal support for research, attrition of physician-scientists, and an inadequate supply of new scholars may jeopardize our ability to address these challenges. New initiatives are necessary to attract current scholars and future generations of researchers seeking expertise in the scientific method and to support them, through mentorship and guidance, in pursuing a career that combines scientific investigation with clinical medicine. These efforts will promote breadth and depth of inquiry into the biology of pregnancy and enhance the pace of translation of scientific discoveries into better medicine and disease prevention. Copyright © 2017 Elsevier Inc. All rights reserved.
Commented review of the Colombian legislation regarding the ethics of health research.
Lopera, Mónica María
2017-12-01
The scope of ethics in health research transcends its legal framework and the regulations established in Resolution 8430 of 1993. These norms represent a fundamental tool for determining the minimum protection standards for research subjects, and, therefore, they should be known, applied properly, and reflected upon by all researchers in the field. Here I present and discuss, from an analytical point of view, the regulations that guide health research. In this framework, health is understood as a multidimensional process, and health research as a multidisciplinary exercise involving basic, clinical, and public health research, collective health, and other related sciences. The main analytical categories relate to the principles and actors involved in research (regulatory authorities, ethics committees, and special or vulnerable subjects and populations) and to professional ethics codes, in addition to informed consent and data management. Despite the contribution of this legislation to the quality of health research, my conclusion is that the national legislation on ethics in health research requires updating with regard to technological and scientific developments, as well as specific provisions for the multiple types of health studies.
Using the General Mission Analysis Tool (GMAT)
NASA Technical Reports Server (NTRS)
Hughes, Steven P.; Conway, Darrel J.; Parker, Joel
2017-01-01
This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT); these slides accompany the demonstration. The demonstration covers GMAT basics, then presents a detailed example of GMAT's application to the Transiting Exoplanet Survey Satellite (TESS) mission. The talk combines existing presentations and material: the system user guide and technical documentation, a GMAT basics and overview, and technical presentations from the TESS project on its application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project. Slides for navigation and optimal control are borrowed from system documentation and training material.
RESLanjut: Learning media to improve students' understanding of embedded systems
NASA Astrophysics Data System (ADS)
Indrianto, Susanti, Meilia Nur Indah; Karina, Djunaidi
2017-08-01
Networking in an embedded system can be implemented in many ways, using mobile phones, Bluetooth, modems, Ethernet cards, wireless technology, and so on. Adding a network to an embedded system enables remote control. In previous research, the researchers found that many students could comprehend the basic concepts of embedded systems and could build embedded system tools, but without network integration. For that reason, the embedded system module needs further development. The design of the embedded system practicum module uses a prototyping method to achieve the desired goal; prototyping is widely used in practice, and a prototype may also form part of a product, consisting of its logic or an external physical interface. The embedded system practicum module is meant to increase student comprehension of the embedded systems course, to encourage students to innovate with technology-based tools, and to help teachers teach embedded system concepts in the course. Student comprehension is expected to increase with the use of the practicum course.
New diagnostic tool for robotic psychology and robotherapy studies.
Libin, Elena; Libin, Alexander
2003-08-01
Robotic psychology and robotherapy, as a new research area, employ a systematic approach in studying the psycho-physiological, psychological, and social aspects of person-robot communication. An analysis of the mechanisms underlying different forms of computer-mediated behavior requires both an adequate methodology and research tools. In this article we discuss the concept, basic principles, structure, and contents of the newly designed Person-Robot Complex Interactive Scale (PRCIS), proposed for the purpose of investigating the psychological specifics and therapeutic potentials of multilevel person-robot interactions. Assuming that human-robot communication has symbolic meaning, each interactive pattern evaluated via the newly developed scale is assigned a certain psychological value associated with the person's past life experiences, likes and dislikes, and emotional, cognitive, and behavioral traits or states. PRCIS includes (1) assessment of a person's individual style of communication with the robotic creature based on direct observations; (2) the participant's evaluation of his/her new experiences with an interactive robot and evaluation of its features, advantages, and disadvantages, as well as past experiences with modern technology; and (3) the instructor's overall evaluation of the session.
Developing a Tool for Measuring the Decision-Making Competence of Older Adults
Finucane, Melissa L.; Gullion, Christina M.
2010-01-01
The authors evaluated the reliability and validity of a tool for measuring older adults’ decision-making competence (DMC). Two-hundred-five younger adults (25-45 years), 208 young-older adults (65-74 years), and 198 old-older adults (75-97 years) made judgments and decisions related to health, finance, and nutrition. Reliable indices of comprehension, dimension weighting, and cognitive reflection were developed. Unlike previous research, the authors were able to compare old-older with young-older adults’ performance. As hypothesized, old-older adults performed more poorly than young-older adults; both groups of older adults performed more poorly than younger adults. Hierarchical regression analyses showed that a large amount of variance in decision performance across age groups (including mean trends) could be accounted for by social variables, health measures, basic cognitive skills, attitudinal measures, and numeracy. Structural equation modeling revealed significant pathways from three exogenous latent factors (crystallized intelligence, other cognitive abilities, and age) to the endogenous DMC latent factor. Further research is needed to validate the meaning of performance on these tasks for real-life decision making. PMID:20545413
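The hierarchical-regression strategy described above, entering predictor blocks in stages and tracking the gain in explained variance, can be sketched with synthetic data. The variables and effect sizes below are illustrative placeholders, not the study's data:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit with intercept (via least squares)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

rng = np.random.default_rng(2)
n = 300
age = rng.uniform(25, 97, n)                      # age range as in the study
cognition = -0.02 * age + rng.normal(size=n)      # cognitive skill declines with age
numeracy = 0.5 * cognition + rng.normal(size=n)   # numeracy tied to cognition
dmc = 0.8 * cognition + 0.4 * numeracy + rng.normal(size=n)  # decision performance

# Enter blocks in stages and watch R^2 grow.
r2_step1 = r_squared(age[:, None], dmc)                       # block 1: age only
r2_step2 = r_squared(np.column_stack([age, cognition]), dmc)  # + cognitive skills
r2_step3 = r_squared(np.column_stack([age, cognition, numeracy]), dmc)  # + numeracy
# R^2 can only increase as blocks are added; the increments show how much
# variance each block accounts for beyond the earlier ones.
```

The increment at each step is the quantity of interest in a hierarchical analysis: it shows whether, say, numeracy explains decision performance beyond what age and cognition already account for.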
On consciousness, resting state fMRI, and neurodynamics
2010-01-01
Background During the last years, functional magnetic resonance imaging (fMRI) of the brain has been introduced as a new tool to measure consciousness, both in clinical settings and in basic neurocognitive research. Moreover, advanced mathematical methods and theories have arrived in the field of fMRI (e.g., computational neuroimaging), and functional and structural brain connectivity can now be assessed non-invasively. Results The present work takes a pluralistic approach to "consciousness", connecting theory and tools from three quite different disciplines: (1) philosophy of mind (emergentism and global workspace theory), (2) functional neuroimaging acquisitions, and (3) the theory of deterministic and statistical neurodynamics, in particular the Wilson-Cowan model and stochastic resonance. Conclusions Based on recent experimental and theoretical work, we believe that the study of the large-scale neuronal processes (activity fluctuations, state transitions) that go on in the living human brain while examined with functional MRI during "resting state" can deepen our understanding of graded consciousness in a clinical setting and clarify the concept of "consciousness" in neurocognitive and neurophilosophical research. PMID:20522270
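The Wilson-Cowan model named in the abstract describes coupled excitatory and inhibitory neural populations. A minimal numerical sketch, with illustrative parameter values not taken from the article, might look like:

```python
import numpy as np

def sigmoid(x, a=1.3, theta=4.0):
    """Sigmoidal population response function."""
    return 1.0 / (1.0 + np.exp(-a * (x - theta)))

def wilson_cowan(E0=0.1, I0=0.05, P=1.25, Q=0.0, dt=0.01, steps=5000):
    """Euler integration of the excitatory (E) and inhibitory (I) populations:
        dE/dt = -E + (1 - r*E) * S(c1*E - c2*I + P)
        dI/dt = -I + (1 - r*I) * S(c3*E - c4*I + Q)
    P and Q are external inputs; c1..c4 are coupling strengths."""
    c1, c2, c3, c4, r = 16.0, 12.0, 15.0, 3.0, 1.0
    E, I = E0, I0
    trace = []
    for _ in range(steps):
        dE = -E + (1 - r * E) * sigmoid(c1 * E - c2 * I + P)
        dI = -I + (1 - r * I) * sigmoid(c3 * E - c4 * I + Q)
        E += dt * dE
        I += dt * dI
        trace.append((E, I))
    return trace

trace = wilson_cowan()
```

Depending on the input P, such a system settles to a fixed point or oscillates, which is the kind of state-transition behavior the article connects to resting-state fMRI fluctuations.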
MATLAB-Based Program for Teaching Autocorrelation Function and Noise Concepts
ERIC Educational Resources Information Center
Jovanovic Dolecek, G.
2012-01-01
An attractive MATLAB-based tool for teaching the basics of autocorrelation function and noise concepts is presented in this paper. This tool enhances traditional in-classroom lecturing. The demonstrations of the tool described here highlight the description of the autocorrelation function (ACF) in a general case for wide-sense stationary (WSS)…
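The concept the MATLAB tool demonstrates, the sample autocorrelation function of a wide-sense stationary signal, can be sketched in a few lines. This illustrates the idea, not the tool itself, and is written in Python rather than MATLAB:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Biased sample ACF estimate, normalized so acf[0] == 1."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    acf = np.array([np.dot(x[: n - k], x[k:]) / n for k in range(max_lag + 1)])
    return acf / acf[0]

rng = np.random.default_rng(0)
noise = rng.standard_normal(10_000)   # white noise: flat power spectrum
acf = sample_acf(noise, 20)
# For white noise the ACF is 1 at lag 0 and near 0 at all other lags,
# the classic picture such teaching demos use to introduce the concept.
```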
Shop Tools. FOS: Fundamentals of Service.
ERIC Educational Resources Information Center
John Deere Co., Moline, IL.
This shop tools manual is one of a series of power mechanics texts and visual aids on servicing of automotive and off-the-road equipment. Materials provide basic information and illustrations for use by vocational students and teachers as well as shop servicemen and laymen. Sections describe the use of the following tools: screwdrivers, hammers,…
A sonic tool for spinal fusion.
Weis, E B
1977-01-01
The application of sonic energy to bone-cutting problems is reported. The basic principle of the resonant tool, its adaptation for surgery, the experimental results of its use in animals, and clinical experience are described. The sonic tool is found to introduce no significant tissue destruction and has several desirable characteristics for routine use in orthopedics.
The Relationship between Basic and Applied Research in Universities
ERIC Educational Resources Information Center
Bentley, Peter James; Gulbrandsen, Magnus; Kyvik, Svein
2015-01-01
What is the central research activity in modern universities? This paper uses a comprehensive survey among individuals from 15 countries to map differences in orientation towards basic/fundamental research, applied/practical research and a combination of the two. Despite some claims in the literature that basic research is no longer a…
NASA Astrophysics Data System (ADS)
Mendoza, A. M.; Bakshi, S.; Berrios, D.; Chulaki, A.; Evans, R. M.; Kuznetsova, M. M.; Lee, H.; MacNeice, P. J.; Maddox, M. M.; Mays, M. L.; Mullinix, R. E.; Ngwira, C. M.; Patel, K.; Pulkkinen, A.; Rastaetter, L.; Shim, J.; Taktakishvili, A.; Zheng, Y.
2012-12-01
The Community Coordinated Modeling Center (CCMC) was established to enhance basic solar-terrestrial research and to aid the development of models for specifying and forecasting conditions in the space environment. Toward this goal, the CCMC has developed and provides a set of innovative tools: the Integrated Space Weather Analysis (iSWA) web-based dissemination system for space weather information; the Runs-On-Request system, providing access to a unique collection of state-of-the-art solar and space physics models (unmatched anywhere in the world); advanced online visualization and analysis tools for more accurate interpretation of model results; standard data formats for simulation data downloads; and, recently, mobile apps (iPhone/Android) that let the scientific community view space weather data anywhere. The number of runs requested and the number of resulting scientific publications and presentations not only indicate broad scientific usage of the CCMC and effective participation by space scientists and researchers, but also sustain active collaboration and coordination within the space weather research community. In the course of its activities, the CCMC also supports community-wide model validation challenges and research focus group projects for a broad range of programs, such as the multi-agency National Space Weather Program and NSF's CEDAR (Coupling, Energetics and Dynamics of Atmospheric Regions), GEM (Geospace Environment Modeling), and SHINE (Solar, Heliospheric, and INterplanetary Environment) programs.
In addition to performing research and model development, the CCMC supports space science education by hosting summer students through local universities; by providing simulations in support of classroom programs such as the Heliophysics Summer School (with a student research contest) and CCMC workshops; by training the next generation of junior scientists in space weather forecasting; and by educating the general public about the importance and impacts of space weather effects. Although the CCMC is organizationally composed of United States federal agencies, its services are open to members of the international science community, and it encourages interagency and international collaboration. In this poster, we provide an overview of using CCMC tools and services to support worldwide space weather scientific communities and networks.
The Basics in Pottery: Clay and Tools.
ERIC Educational Resources Information Center
Larson, Joan
1985-01-01
Art teachers at the middle school or junior high school level usually find themselves in a program teaching ceramics. The most essential tools needed for a ceramics class are discussed. Different kinds of clay are also discussed. (RM)
Web portal on environmental sciences "ATMOS''
NASA Astrophysics Data System (ADS)
Gordov, E. P.; Lykosov, V. N.; Fazliev, A. Z.
2006-06-01
The web portal ATMOS (http://atmos.iao.ru and http://atmos.scert.ru), developed under an INTAS grant, makes available to the international research community, environmental managers, and the interested public a bilingual information source for the domain of atmospheric physics and chemistry and the related application domain of air quality assessment and management. It offers access to integrated thematic information, experimental data, analytical tools and models, case studies, and related information and educational resources compiled, structured, and edited by the partners into a coherent and consistent thematic information resource. While offering the usual components of a thematic site, such as link collections, user group registration, a discussion forum, and a news section, the site is distinguished by its scientific information services and tools: online models and analytical tools, and data collections and case studies together with tutorial material. The portal is organized as a set of interrelated scientific sites that address the basic branches of the atmospheric sciences and climate modeling as well as the applied domains of air quality assessment and management, modeling, and environmental impact assessment. Each scientific site is an information-computational system open to external access, realized by means of Internet technologies. The main basic-science topics are atmospheric chemistry, atmospheric spectroscopy and radiation, atmospheric aerosols, and atmospheric dynamics and atmospheric models, including climate models. The portal ATMOS reflects the current transformation of the environmental sciences into exact (quantitative) sciences and is an effective example of the integration of modern information technologies with the environmental sciences. This makes the portal both an auxiliary instrument supporting interdisciplinary projects on regional environments and an extensive educational resource in this important domain.
Synthetic biology: applying biological circuits beyond novel therapies.
Dobrin, Anton; Saxena, Pratik; Fussenegger, Martin
2016-04-18
Synthetic biology, an engineering, circuit-driven approach to biology, has developed whole new classes of therapeutics. Unfortunately, these advances have thus far been underutilized by basic researchers. As discussed herein, using synthetic circuits, one can undertake exhaustive investigations of the endogenous circuitry found in nature and develop novel detectors and inducers with better temporal and spatial control. One can detect changes in DNA, RNA, protein, or even transient signaling events, in cell-based systems, in live mice, and in humans. Synthetic biology has also developed systems that can be induced chemically, optically, or with radio waves. This induction has been re-wired to lead to changes in gene expression, RNA stability and splicing, protein stability and splicing, and signaling via endogenous pathways. Beyond simple detectors and inducible systems, one can combine these modalities and develop novel signal-integration circuits that react to a very precise pre-programmed set of conditions, or even to multiple sets of precise conditions. In this review, we highlight some tools in which these circuits were combined such that the detection of a particular event automatically triggered a specific output. Furthermore, using novel circuit-design strategies, circuits have been developed that integrate multiple inputs in Boolean logic gates composed of up to 6 inputs. We highlight the tools available and what has been developed thus far, and show how some clinical tools can be very useful in basic science. Most of the systems presented can be integrated together, and the possibilities far exceed the number of currently developed strategies.
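The Boolean signal-integration idea described in the review can be illustrated abstractly: a synthetic circuit behaves as a Boolean function of its input signals. The AND-NOT gate below is a hypothetical stand-in, not a circuit from the literature:

```python
from itertools import product

def and_not_gate(a, b, c):
    """Hypothetical 3-input gate: output only when signals A and B are
    present and signal C is absent (an AND/NOT integration pattern)."""
    return a and b and not c

# Enumerate the gate's full truth table over all input combinations.
truth_table = {
    (a, b, c): and_not_gate(a, b, c)
    for a, b, c in product([False, True], repeat=3)
}
# Exactly one combination (A=1, B=1, C=0) triggers the output; a circuit
# integrating 6 inputs would select among 2**6 = 64 such combinations.
```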
EERE's State & Local Energy Data Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shambarger, Erick; DeCesaro, Jennifer
2014-06-23
EERE's State and Local Energy Data (SLED) Tool provides basic energy market information that can help state and local governments plan and implement clean energy projects, including electricity generation; fuel sources and costs; applicable policies, regulations, and financial incentives; and renewable energy resource potential. Watch this video to learn more about the tool and hear testimonials from real users about the benefits of using this tool.
Freeware eLearning Flash-ECG for learning electrocardiography.
Romanov, Kalle; Kuusi, Timo
2009-06-01
Electrocardiographic (ECG) analysis can be taught in eLearning programmes with suitable software that permits the effective use of basic tools such as a ruler and a magnifier, required for measurements. The Flash-ECG (Research & Development Unit for Medical Education, University of Helsinki, Finland) was developed to enable teachers and students to use scanned and archived ECGs on computer screens and classroom projectors. The software requires only a standard web browser with a Flash plug-in and can be integrated with learning environments (Blackboard/WebCT, Moodle). The Flash-ECG is freeware and is available to medical teachers worldwide.
Tropospheric ozone as a fungal elicitor.
Zuccarini, Paolo
2009-03-01
Tropospheric ozone has been proven to trigger biochemical plant responses that are similar to the ones induced by an attack of fungal pathogens, i.e., it resembles fungal elicitors. This suggests that ozone can represent a valid tool for the study of stress responses and induction of resistance to pathogens. This review provides an overview of the implications of such a phenomenon for basic and applied research. After an introduction about the environmental implications of tropospheric ozone and plant responses to biotic stresses, the biochemistry of ozone stress is analysed, pointing out its similarities with plant responses to pathogens and its possible applications.
Cohen, C D; Kretzler, M
2009-03-01
Histological analysis of kidney biopsies is an essential part of our current diagnostic workup of patients with renal disease. Besides the already established diagnostic tools, new methods allow extensive analysis of the sample tissue's gene expression. Using results from a European multicenter study on gene expression analysis of renal biopsies, in this review we demonstrate that this novel approach not only expands the scope of so-called basic research but also might supplement future biopsy diagnostics. The goals are improved diagnosis and more specific therapy choice and prognosis estimates.
Mountain Plains Learning Experience Guide: Heating, Refrigeration, & Air Conditioning.
ERIC Educational Resources Information Center
Carey, John
This Heating, Refrigeration, and Air Conditioning course is comprised of eleven individualized units: (1) Refrigeration Tools, Materials, and Refrigerant; (2) Basic Heating and Air Conditioning; (3) Sealed System Repairs; (4) Basic Refrigeration Systems; (5) Compression Systems and Compressors; (6) Refrigeration Controls; (7) Electric Circuit…
Thermodynamics--A Practical Subject.
ERIC Educational Resources Information Center
Jones, Hugh G.
1984-01-01
Provides a simplified, synoptic overview of the area of thermodynamics, enumerating and explaining the four basic laws, and introducing the mathematics involved in a stepwise fashion. Discusses such basic tools of thermodynamics as enthalpy, entropy, Helmholtz free energy, and Gibbs free energy, and their uses in problem solving. (JM)
Can Basic Research on Children and Families Be Useful for the Policy Process?
ERIC Educational Resources Information Center
Moore, Kristin A.
Based on the assumption that basic science is the crucial building block for technological and biomedical progress, this paper examines the relevance for public policy of basic demographic and behavioral sciences research on children and families. The characteristics of basic research as they apply to policy making are explored. First, basic…
Kristman, Vicki L; Shaw, William S; Boot, Cécile R L; Delclos, George L; Sullivan, Michael J; Ehrhart, Mark G
2016-12-01
Purpose There is growing research evidence that workplace factors influence disability outcomes, but these variables reflect a variety of stakeholder perspectives, measurement tools, and methodologies. The goal of this article is to summarize existing research on workplace factors in relation to disability, compare this with employer discourse in the grey literature, and recommend future research priorities. Methods The authors participated in a year-long collaboration that ultimately led to an invited 3-day conference, "Improving Research of Employer Practices to Prevent Disability," held October 14-16, 2015, in Hopkinton, Massachusetts, USA. The collaboration included a topical review of the literature, group conference calls to identify key areas and challenges, drafting of initial documents, review of industry publications, and a conference presentation that included feedback from peer researchers and a question/answer session with a special panel of knowledge experts with direct employer experience. Results Predominant factors in the scientific literature were categorized as physical or psychosocial job demands, work organization and support, and workplace beliefs and attitudes. Employees experiencing musculoskeletal disorders in large organizations were the most frequently studied population. Research varied with respect to the basic unit of assessment (e.g., worker, supervisor, policy level) and whether assessments should be based on worker perceptions, written policies, or observable practices. The grey literature suggested that employers focus primarily on defining roles and responsibilities, standardizing management tools and procedures, being prompt and proactive, and attending to the individualized needs of workers. Industry publications reflected a heavy reliance by employers on a strict biomedical model, in contrast to the more psychosocial framework that appears to guide research designs.
Conclusion Assessing workplace factors at multiple levels, within small and medium-sized organizations, and at a more granular level may help to clarify generalizable concepts of organizational support that can be translated to specific employer strategies involving personnel, tools, and practices.
32 CFR 37.1240 - Basic research.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 1 2013-07-01 2013-07-01 false Basic research. 37.1240 Section 37.1240 National... TECHNOLOGY INVESTMENT AGREEMENTS Definitions of Terms Used in This Part § 37.1240 Basic research. Efforts... practical application of that knowledge and understanding. It typically is funded within Research...
Implementation and Evaluation of Microcomputer Systems for the Republic of Turkey’s Naval Ships.
1986-03-01
important database design tool for both logical and physical database design, such as flowcharts or pseudocodes are used for program design. Logical...string manipulation in FORTRAN is difficult but not impossible. BASIC (Beginners All-Purpose Symbolic Instruction Code): BASIC is currently the most...63 APPENDIX B GLOSSARY/ACRONYM LIST AC Alternating Current AP Application Program BASIC Beginners All-purpose Symbolic Instruction Code CCP
Zinc Fingers, TALEs, and CRISPR Systems: A Comparison of Tools for Epigenome Editing.
Waryah, Charlene Babra; Moses, Colette; Arooj, Mahira; Blancafort, Pilar
2018-01-01
The completion of genome, epigenome, and transcriptome mapping in multiple cell types has created a demand for precision biomolecular tools that allow researchers to functionally manipulate DNA, reconfigure chromatin structure, and ultimately reshape gene expression patterns. Epigenetic editing tools provide the ability to interrogate the relationship between epigenetic modifications and gene expression. Importantly, this information can be exploited to reprogram cell fate for both basic research and therapeutic applications. Three different molecular platforms for epigenetic editing have been developed: zinc finger proteins (ZFs), transcription activator-like effectors (TALEs), and the system of Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR) and CRISPR-associated (Cas) proteins. These platforms serve as custom DNA-binding domains (DBDs), which are fused to epigenetic modifying domains to manipulate epigenetic marks at specific sites in the genome. The addition and/or removal of epigenetic modifications reconfigures local chromatin structure, with the potential to provoke long-lasting changes in gene transcription. Here we summarize the molecular structure and mechanism of action of ZF, TALE, and CRISPR platforms and describe their applications for the locus-specific manipulation of the epigenome. The advantages and disadvantages of each platform will be discussed with regard to genomic specificity, potency in regulating gene expression, and reprogramming cell phenotypes, as well as ease of design, construction, and delivery. Finally, we outline potential applications for these tools in molecular biology and biomedicine and identify possible barriers to their future clinical implementation.
Evaluating the Interdisciplinary Discoverability of Data
NASA Astrophysics Data System (ADS)
Gordon, S.; Habermann, T.
2017-12-01
Documentation needs are similar across communities. Communities tend to agree on many of the basic concepts necessary for discovery: shared concepts such as a title or a description of the data exist in most metadata dialects. Many dialects have been designed, and recommendations implemented, to create metadata valuable for data discovery, yet these implementations can create barriers to discovering the right data. How can we ensure that the documentation we curate will be discoverable and understandable by researchers outside of our own disciplines and organizations? Since communities tend to use and understand many of the same documentation concepts, the barriers to interdisciplinary discovery are caused by differences in implementation. Thus, tools and methods designed for the conceptual layer, which evaluate records for documentation concepts regardless of dialect, can be effective in identifying opportunities for improvement and providing guidance. The Metadata Evaluation Web Service, combined with a Jupyter Notebook interface, allows a user to gather insight about a collection of records with respect to different communities' conceptual recommendations. It accomplishes this via data visualizations and provides links to implementation-specific guidance on the ESIP Wiki for each recommendation applied to the collection. By utilizing these curation tools as part of an iterative process, curators can increase the data's impact by making it discoverable to a greater scientific and research community. Because of the conceptual focus of these methods and tools, they can be used by any community or organization regardless of its documentation dialect or tools.
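The conceptual-layer evaluation described above can be sketched as follows. The dialect-to-concept alias mappings here are illustrative assumptions, not the actual mappings used by the Metadata Evaluation Web Service:

```python
# Map each discovery concept to the dialect-specific field names that
# can encode it (illustrative aliases only).
CONCEPT_MAP = {
    "title": {"title", "dc:title", "citation_title"},
    "abstract": {"abstract", "dc:description", "summary"},
    "keywords": {"keywords", "dc:subject", "subject"},
}

def concept_coverage(record):
    """record: dict of dialect-specific field name -> value.
    Returns the set of discovery concepts the record satisfies,
    independent of which dialect's field name encodes each concept."""
    present = {k.lower() for k, v in record.items() if v}
    return {
        concept
        for concept, aliases in CONCEPT_MAP.items()
        if present & aliases
    }

# Two records in different dialects, evaluated against the same concepts.
iso_record = {"title": "Sea surface temperature", "abstract": "Daily SST grids"}
dc_record = {"dc:title": "SST", "dc:subject": "oceanography"}
```

Both records satisfy the "title" concept despite using different field names, which is the point of evaluating at the conceptual layer rather than the dialect layer.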
Cunha, Leonardo Rodrigues; Cudischevitch, Cecília de Oliveira; Carneiro, Alan Brito; Macedo, Gustavo Bartholomeu; Lannes, Denise; Silva-Neto, Mário Alberto Cardoso da
2014-01-01
We evaluate a new approach to teaching the basic biochemistry mechanisms that regulate the biology of Triatominae, major vectors of Trypanosoma cruzi, the causative agent of Chagas disease. We have designed and used a comic book, "Carlos Chagas: 100 years after a hero's discovery" containing scientific information obtained by seven distinguished contemporary Brazilian researchers working with Triatominaes. Students (22) in the seventh grade of a public elementary school received the comic book. The study was then followed up by the use of Concept Maps elaborated by the students. Six Concept Maps elaborated by the students before the introduction of the comic book received an average score of 7. Scores rose to an average of 45 after the introduction of the comic book. This result suggests that a more attractive content can greatly improve the knowledge and conceptual understanding among students not previously exposed to insect biochemistry. In conclusion, this study illustrates an alternative to current strategies of teaching about the transmission of neglected diseases. It also promotes the diffusion of the scientific knowledge produced by Brazilian researchers that may stimulate students to choose a scientific career. © 2014 The International Union of Biochemistry and Molecular Biology.
NASA Technical Reports Server (NTRS)
Hasenstein, Karl H.; Boody, April; Cox, David (Technical Monitor)
2002-01-01
The BioTube/Magnetic Field Apparatus (MFA) research is designed to provide insight into the organization and operation of the gravity sensing systems of plants and other small organisms. This experiment on STS-107 uses magnetic fields to manipulate sensory cells in plant roots, thus using magnetic fields as a tool to study gravity-related phenomena. The experiment will be located in the SPACEHAB module and is about the size of a household microwave oven. The goal of the experiment is to improve our understanding of the basic phenomenon of how plants respond to gravity. The BioTube/MFA experiment specifically examines how gravitational forces serve as a directional signal for growth in the low-gravity environment of space. As with all basic research, this study will contribute to an improved understanding of how plants grow and will have important implications for improving plant growth and productivity on Earth. In BioTube/MFA, magnetic fields will be used to determine whether the distribution of subcellular starch grains, called amyloplasts, within plant cells predicts the direction in which roots will grow and curve in microgravity.
Measuring Networking as an Outcome Variable in Undergraduate Research Experiences
Hanauer, David I.; Hatfull, Graham
2015-01-01
The aim of this paper is to propose, present, and validate a simple survey instrument to measure student conversational networking. The tool consists of five items that cover personal and professional social networks, and its basic principle is the self-reporting of degrees of conversation, with a range of specific discussion partners. The networking instrument was validated in three studies. The basic psychometric characteristics of the scales were established by conducting a factor analysis and evaluating internal consistency using Cronbach’s alpha. The second study used a known-groups comparison and involved comparing outcomes for networking scales between two different undergraduate laboratory courses (one involving a specific effort to enhance networking). The final study looked at potential relationships between specific networking items and the established psychosocial variable of project ownership through a series of binary logistic regressions. Overall, the data from the three studies indicate that the networking scales have high internal consistency (α = 0.88), consist of a unitary dimension, can significantly differentiate between research experiences with low and high networking designs, and are related to project ownership scales. The ramifications of the networking instrument for student retention, the enhancement of public scientific literacy, and the differentiation of laboratory courses are discussed. PMID:26538387
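The internal-consistency statistic reported for the networking scales (Cronbach's alpha = 0.88) can be computed as below. The response matrix here is synthetic, not the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for items: an (n_respondents, n_items) array.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Simulate 5 items driven by one shared latent factor, so the items
# correlate and alpha comes out high, as with the networking scales.
rng = np.random.default_rng(1)
trait = rng.normal(size=(200, 1))                    # shared latent factor
responses = trait + 0.5 * rng.normal(size=(200, 5))  # 5 correlated items
alpha = cronbach_alpha(responses)
```

High alpha indicates the five items measure a common construct, which is consistent with the paper's finding that the networking scales form a unitary dimension.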