Update on the Health Services Research Doctoral Core Competencies.
Burgess, James F; Menachemi, Nir; Maciejewski, Matthew L
2018-03-13
To present revised core competencies for doctoral programs in health services research (HSR), modalities to deliver these competencies, and suggested methods for assessing mastery of them. Core competencies were originally developed in 2005, updated (but unpublished) in 2008, modestly updated for a 2016 HSR workforce conference, and revised based on feedback from attendees. Additional feedback was obtained from doctoral program directors, employer/workforce experts, and attendees of a presentation on these competencies at AcademyHealth's June 2017 Annual Research Meeting. The current version (V2.1) of the competencies covers the ethical conduct of research, conceptual models, development of research questions, study designs, data measurement and collection methods, statistical methods for analyzing data, professional collaboration, and knowledge dissemination. These competencies represent a core that defines what HSR researchers should master in order to address the complexities of the microsystem-to-macrosystem research that HSR entails. There are opportunities to conduct formal evaluation of newer delivery modalities (e.g., flipped classrooms) and to integrate the new Learning Health System Researcher Core Competencies, developed by AHRQ, into the HSR core competencies. Core competencies in HSR are a continually evolving work in progress because new research questions arise, new methods are developed, and the trans-disciplinary nature of the field leads to new multidisciplinary and team-building needs. © Health Research and Educational Trust.
Method for tracking core-contributed publications.
Loomis, Cynthia A; Curchoe, Carol Lynn
2012-12-01
Accurately tracking core-contributed publications is an important and often difficult task. Many core laboratories are supported by programmatic grants (such as Cancer Center Support Grant and Clinical Translational Science Awards) or generate data with instruments funded through S10, Major Research Instrumentation, or other granting mechanisms. Core laboratories provide their research communities with state-of-the-art instrumentation and expertise, elevating research. It is crucial to demonstrate the specific projects that have benefited from core services and expertise. We discuss here the method we developed for tracking core contributed publications.
NASA Astrophysics Data System (ADS)
Fisher, Dahlia; Yaniawati, Poppy; Kusumah, Yaya Sukjaya
2017-08-01
This study aims to analyze the character traits of students taught with the CORE learning model using a metacognitive approach. The study used a mixed-methods design (qualitative and quantitative) with a concurrent embedded strategy. The research was conducted on two groups: an experimental group of students taught with the CORE learning model using a metacognitive approach, and a control group of students taught by conventional learning. The subjects were seventh-grade students at a public junior high school in Bandung. Based on this research, the character traits developed in students through CORE model learning with a metacognitive approach are honesty, hard work, curiosity, conscientiousness, creativity, and communicativeness. Overall, it can be concluded that the CORE learning model is good for developing the character of junior high school students.
Development of the Learning Health System Researcher Core Competencies.
Forrest, Christopher B; Chesley, Francis D; Tregear, Michelle L; Mistry, Kamila B
2017-08-04
To develop core competencies for learning health system (LHS) researchers to guide the development of training programs. Data were obtained from literature review, expert interviews, a modified Delphi process, and consensus development meetings. The competencies were developed from August to December 2016 using qualitative methods. The literature review formed the basis for the initial draft of a competency domain framework. Key informant semi-structured interviews, a modified Delphi survey, and three expert panel (n = 19 members) consensus development meetings produced the final set of competencies. The iterative development process yielded seven competency domains: (1) systems science; (2) research questions and standards of scientific evidence; (3) research methods; (4) informatics; (5) ethics of research and implementation in health systems; (6) improvement and implementation science; and (7) engagement, leadership, and research management. A total of 33 core competencies were prioritized across these seven domains. The real-world milieu of LHS research, the embeddedness of the researcher within the health system, and engagement of stakeholders are distinguishing characteristics of this emerging field. The LHS researcher core competencies can be used to guide the development of learning objectives, evaluation methods, and curricula for training programs. © Health Research and Educational Trust.
NASA Astrophysics Data System (ADS)
Wang, Xuping; Quan, Long; Xiong, Guangyu
2013-11-01
Currently, most research uses signals such as the coil current or voltage of the solenoid to identify parameters; typically, a parameter identification method based on the variation rate of the coil current is applied for position estimation. The problem with these approaches is that the detected signals are prone to interference and difficult to obtain. This paper proposes a new method for detecting the core position using a flux characteristic quantity, in which a new secondary winding is added to the coil of an ordinary switching electromagnet. Electromagnetic coupling theory and magnetic-field simulation of the primary and secondary windings show that, under PWM control, varying core position and winding operating current produce different flux increments in the secondary winding. The flux increment of the electromagnet winding is obtained by integrating the induced voltage of the secondary winding over time, and the core position is calculated from a two-dimensional fitted curve relating the operating winding current and the flux-linkage characteristic quantity of the solenoid. A detection and testing system for the solenoid core position was developed on the basis of this theory. The test results show that the flux characteristic quantity of the switching electromagnet's magnetic circuit effectively indicates the core position, enabling detection of the core position without a displacement transducer, and providing a new theory and method for using a switching solenoid to control a proportional valve.
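The two computational steps the abstract describes, time-domain integration of the secondary winding's induced voltage to obtain the flux increment and then reading the core position off a fitted current/flux curve, can be sketched as follows. The function names, sample values, and the linear fitted surface are illustrative assumptions, not the paper's actual calibration data.

```python
# Hypothetical sketch of the flux-based position estimate; coefficients
# would come from calibrating against known core positions.

def flux_increment(voltages, dt):
    """Trapezoidal time integration of sampled induced voltage (V) of the
    secondary winding, giving the flux-linkage increment in webers."""
    total = 0.0
    for v0, v1 in zip(voltages, voltages[1:]):
        total += 0.5 * (v0 + v1) * dt
    return total

def estimate_position(current_a, d_flux, coeffs):
    """Evaluate an assumed pre-fitted surface position = f(current, dPhi).
    A linear surface is used here purely for illustration."""
    a0, a1, a2 = coeffs
    return a0 + a1 * current_a + a2 * d_flux

# Example: a constant 2 V induced voltage sampled at 1 kHz for 5 ms
dphi = flux_increment([2.0] * 6, dt=0.001)
pos = estimate_position(1.5, dphi, coeffs=(0.0, 1.0, 100.0))
```

In practice the fitted curve would be built from bench measurements across core positions and duty cycles, and the integration reset each PWM period.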
The talk will highlight key aspects and results of analytical methods the EPA National Health and Environmental Effects Research Laboratory (NHEERL) Analytical Chemistry Research Core (ACRC) develops and uses to provide data on disposition, metabolism, and effects of environmenta...
Cores Of Recurrent Events (CORE) | Informatics Technology for Cancer Research (ITCR)
CORE is a statistically supported computational method for finding recurrently targeted regions in massive collections of genomic intervals, such as those arising from DNA copy number analysis of single tumor cells or bulk tumor tissues.
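As a toy illustration of the kind of problem CORE addresses, and not the statistically supported CORE algorithm itself, a simple sweep-line pass can locate the region covered by the most intervals in a collection:

```python
def max_coverage_region(intervals):
    """Return (depth, start, end) of a region covered by the most intervals.
    Intervals are half-open (start, end) pairs; end events sort before start
    events at ties, so abutting intervals do not count as overlapping."""
    events = sorted([(s, 1) for s, e in intervals] +
                    [(e, -1) for s, e in intervals])
    depth = 0
    best = (0, None, None)          # (max depth, region start, region end)
    for pos, delta in events:
        if delta == 1:
            depth += 1
            if depth > best[0]:
                best = (depth, pos, None)       # new deepest region opens
        else:
            if depth == best[0] and best[2] is None:
                best = (best[0], best[1], pos)  # deepest region closes
            depth -= 1
    return best
```

The real method additionally assesses the statistical significance of recurrence and handles massive single-cell and bulk copy-number interval sets.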
Establishing an efficient way to utilize the drought resistance germplasm population in wheat.
Wang, Jiancheng; Guan, Yajing; Wang, Yang; Zhu, Liwei; Wang, Qitian; Hu, Qijuan; Hu, Jin
2013-01-01
Drought resistance breeding provides a hopeful way to improve yield and quality of wheat in arid and semiarid regions. Constructing a core collection is an efficient way to evaluate and utilize drought-resistant germplasm resources in wheat. In the present research, 1,683 wheat varieties were divided into five germplasm groups (high resistant, HR; resistant, R; moderate resistant, MR; susceptible, S; and high susceptible, HS). The least distance stepwise sampling (LDSS) method was adopted to select core accessions. Six commonly used genetic distances (Euclidean distance, Euclid; standardized Euclidean distance, Seuclid; Mahalanobis distance, Mahal; Manhattan distance, Manhat; cosine distance, Cosine; and correlation distance, Correlation) were used to assess genetic distances among accessions. The unweighted pair-group average (UPGMA) method was used to perform hierarchical cluster analysis. The coincidence rate of range (CR) and the variable rate of the coefficient of variation (VR) were adopted to evaluate the representativeness of the core collection. A method for selecting the ideal construction strategy is suggested, and a wheat core collection for drought resistance breeding programs was constructed using the strategy selected in the present research. Principal component analysis showed that genetic diversity was well preserved in that core collection.
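The selection step can be illustrated with a much-simplified, greedy reading of least distance stepwise sampling: repeatedly find the closest pair of accessions under the chosen distance and discard one member until the target core size remains. The trait vectors below are hypothetical, and the paper's LDSS procedure and distance options are richer than this sketch.

```python
import math

def euclid(a, b):
    """Plain Euclidean distance between two trait vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def ldss_core(accessions, target_size, dist=euclid):
    """Greedy sketch: while too many accessions remain, find the closest
    pair under `dist` and discard one of its members."""
    core = list(accessions)
    while len(core) > target_size:
        closest = None
        for i in range(len(core)):
            for j in range(i + 1, len(core)):
                d = dist(core[i], core[j])
                if closest is None or d < closest[0]:
                    closest = (d, j)
        core.pop(closest[1])        # drop one member of the closest pair
    return core

varieties = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (10.0, 10.0)]  # hypothetical
core = ldss_core(varieties, target_size=3)
```

Because near-duplicates are removed first, the retained accessions stay spread out in trait space, which is the intuition behind preserving diversity in a core collection.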
The Social Determinants of Health Core: Taking a Place-Based Approach.
Scribner, Richard A; Simonsen, Neal R; Leonardi, Claudia
2017-01-01
There is growing recognition that health disparities research needs to incorporate social determinants in the local environment into explanatory models. In the transdisciplinary setting of the Mid-South Transdisciplinary Collaborative Center (TCC), the Social Determinants of Health (SDH) Core developed an approach to incorporating SDH across a variety of studies. This place-based approach, which is geographically based, transdisciplinary, and inherently multilevel, is discussed. From 2014 through 2016, the SDH Core consulted on a variety of Mid-South TCC research studies with the goal of incorporating social determinants into their research designs. The approach used geospatial methods (e.g., geocoding) to link individual data files with measures of the physical and social environment in the SDH Core database. Once linked, the method permitted various types of analysis (e.g., multilevel analysis) to determine if racial disparities could be explained in terms of social determinants in the local environment. The SDH Core consulted on five Mid-South TCC research projects. In resulting analyses for all the studies, a significant portion of the variance in one or more outcomes was partially explained by a social determinant from the SDH Core database. The SDH Core approach to addressing health disparities by linking neighborhood social and physical environment measures to an individual-level data file proved to be a successful approach across Mid-South TCC research projects. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
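The core linkage step, attaching neighborhood-level measures to an individual-level file via a geocoded key such as a census tract ID, can be sketched in a few lines. All field names and values below are hypothetical, not the SDH Core database schema.

```python
# Minimal sketch of a geocode-keyed join between an individual-level file
# and a table of neighborhood (tract-level) social-determinant measures.

individuals = [
    {"id": 1, "tract": "22071001700", "outcome": 0.8},
    {"id": 2, "tract": "22071001800", "outcome": 0.5},
]
tract_measures = {
    "22071001700": {"pct_poverty": 0.31, "alcohol_outlets_per_km2": 4.2},
    "22071001800": {"pct_poverty": 0.12, "alcohol_outlets_per_km2": 1.1},
}

# Each person's record gains the measures of the tract they were geocoded to.
linked = [{**person, **tract_measures[person["tract"]]} for person in individuals]
```

Once linked this way, the file has the two-level structure (individuals nested in tracts) that a multilevel model needs.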
Pricing the Services of Scientific Cores. Part I: Charging Subsidized and Unsubsidized Users.
ERIC Educational Resources Information Center
Fife, Jerry; Forrester, Robert
2002-01-01
Explaining that scientific cores at research institutions support shared resources and facilities, discusses devising a method of charging users for core services and controlling and managing the rates. Proposes the concept of program-based management to cover sources of core support that are funding similar work. (EV)
ERIC Educational Resources Information Center
Barron, Kenneth E.; Apple, Kevin J.
2014-01-01
Coursework in statistics and research methods is a core requirement in most undergraduate psychology programs. However, is there an optimal way to structure and sequence methodology courses to facilitate student learning? For example, should statistics be required before research methods, should research methods be required before statistics, or…
Development of a Research Methods and Statistics Concept Inventory
ERIC Educational Resources Information Center
Veilleux, Jennifer C.; Chapman, Kate M.
2017-01-01
Research methods and statistics are core courses in the undergraduate psychology major. To assess learning outcomes, it would be useful to have a measure that assesses research methods and statistical literacy beyond course grades. In two studies, we developed and provided initial validation results for a research methods and statistical knowledge…
The development of optimal lightweight truss-core sandwich panels
NASA Astrophysics Data System (ADS)
Langhorst, Benjamin Robert
Sandwich structures effectively provide lightweight stiffness and strength by sandwiching a low-density core between stiff face sheets. The performance of lightweight truss-core sandwich panels is enhanced through the design of novel truss arrangements and the development of methods by which the panels may be optimized. An introduction to sandwich panels is presented along with an overview of previous research of truss-core sandwich panels. Three alternative truss arrangements are developed and their corresponding advantages, disadvantages, and optimization routines are discussed. Finally, performance is investigated by theoretical and numerical methods, and it is shown that the relative structural efficiency of alternative truss cores varies with panel weight and load-carrying capacity. Discrete truss core sandwich panels can be designed to serve bending applications more efficiently than traditional pyramidal truss arrangements at low panel weights and load capacities. Additionally, discrete-truss cores permit the design of heterogeneous cores, which feature unit cells that vary in geometry throughout the panel according to the internal loads present at each unit cell's location. A discrete-truss core panel may be selectively strengthened to more efficiently support bending loads. Future research is proposed and additional areas for lightweight sandwich panel development are explained.
Optimization of rotor shaft shrink fit method for motor using "Robust design"
NASA Astrophysics Data System (ADS)
Toma, Eiji
2018-01-01
This research is a collaborative investigation with a general-purpose motor manufacturer. To review the construction method used in the production process, we applied the parameter design method of quality engineering and approached the optimization of the construction method. Conventionally, a press-fitting method has been adopted to fit the rotor core and shaft, which are the main components of the motor, but quality defects such as core-shaft deflection occurred at the time of press fitting. In this research, as a result of the optimized design of a "shrink fitting method by high-frequency induction heating" devised as a new construction method, the method was shown to be feasible, and the optimum processing condition was extracted.
Investigating Anomalies in the Output Generated by the Weather Research and Forecasting (WRF) Model
NASA Astrophysics Data System (ADS)
Decicco, Nicholas; Trout, Joseph; Manson, J. Russell; Rios, Manny; King, David
2015-04-01
The Weather Research and Forecasting (WRF) model is an advanced mesoscale numerical weather prediction (NWP) model comprising two numerical cores, the Numerical Mesoscale Modeling (NMM) core and the Advanced Research WRF (ARW) core. An investigation was done to determine the source of erroneous output generated by the NMM core. Of particular interest were the appearance of zero values at regularly spaced grid cells in output fields and the NMM core's evident misuse of static geographic information at a resolution lower than the nesting level for which the core performs computation. A brief discussion of the high-level modular architecture of the model is presented, as well as the methods used to identify the cause of these problems. Presented here are the initial results from a research grant, ``A Pilot Project to Investigate Wake Vortex Patterns and Weather Patterns at the Atlantic City Airport by the Richard Stockton College of NJ and the FAA''.
APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis
ERIC Educational Resources Information Center
Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara
2009-01-01
Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…
2014-01-01
Background The incidence of oropharyngeal cancer is increasing in the developed world. This has led to a large rise in research activity and clinical trials in this area, yet there is no consensus on which outcomes should be measured. As a result, the outcomes measured often differ between trials of comparable interventions, making the combination or comparison of results between trials impossible. Outcomes may also be ‘cherry-picked’, such that favourable results are reported, and less favourable results withheld. The development of a minimum outcome reporting standard, known as a core outcome set, goes some way to addressing these problems. Core outcome sets are ideally developed using a patient-centred approach so that the outcomes measured are relevant to patients and clinical practice. Core outcome sets drive up the quality and relevance of research by ensuring that the right outcomes are consistently measured and reported in trials in specific areas of health or healthcare. Methods/Design This is a mixed methods study involving three phases to develop a core outcome set for oropharyngeal cancer clinical trials. Firstly, a systematic review will establish which outcomes are measured in published oropharyngeal cancer randomised controlled trials (RCTs). Secondly, qualitative interviews with patients and carers in the UK and the USA will aim to establish which outcomes are important to these stakeholders. Data from these first two stages will be used to develop a comprehensive list of outcomes to be considered for inclusion in the core outcome set. In the third stage, patients and clinicians will participate in an iterative consensus exercise known as a Delphi study to refine the contents of the core outcome set. This protocol lays out the methodology to be implemented in the CONSENSUS study. Discussion A core outcome set defines a minimum outcome reporting standard for clinical trials in a particular area of health or healthcare. 
Its consistent implementation in oropharyngeal cancer clinical trials will improve the quality and relevance of research. Trial registration: This study is registered on the National Institute for Health Research (NIHR) Clinical Research Network (CRN) portfolio, ID 13823 (17 January 2013). PMID:24885068
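The Delphi refinement stage described in the protocol can be illustrated with a toy scoring round. The 1-9 scale and the "70% of panellists score 7 or higher" retention rule are common conventions in core outcome set work, assumed here purely for illustration; the CONSENSUS study defines its own consensus criteria.

```python
# Toy sketch of one Delphi scoring round for refining a core outcome set.
# Outcome names and scores are invented for illustration.

def delphi_round(ratings, keep_threshold=7, consensus=0.7):
    """ratings: {outcome_name: [1-9 importance scores from panellists]}.
    Retain outcomes that at least `consensus` of panellists scored at or
    above `keep_threshold`; return the retained outcomes with that fraction."""
    retained = {}
    for outcome, scores in ratings.items():
        frac_high = sum(s >= keep_threshold for s in scores) / len(scores)
        if frac_high >= consensus:
            retained[outcome] = frac_high
    return retained

ratings = {
    "overall survival":        [9, 9, 8, 7, 9, 8],
    "swallowing":              [8, 7, 9, 6, 8, 7],
    "length of hospital stay": [4, 5, 3, 6, 4, 5],
}
kept = delphi_round(ratings)
```

In an iterative Delphi, the retained list and the score distributions are fed back to panellists for the next round until the set stabilizes.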
Pricing the Services of Scientific Cores. Part II: Charging Outside Users.
ERIC Educational Resources Information Center
Fife, Jerry; Forrester, Robert
2002-01-01
Explaining that scientific cores at research institutions support shared resources and facilities, considers pricing of services to users from outside the institution. Proposes a method of allocating charges from the cores to projects with multiple funding sources through program-based management. Describes aspects of an example program: price of…
Optimizing performance by improving core stability and core strength.
Hibbs, Angela E; Thompson, Kevin G; French, Duncan; Wrigley, Allan; Spears, Iain
2008-01-01
Core stability and core strength have been subject to research since the early 1980s. Research has highlighted benefits of training these processes for people with back pain and for carrying out everyday activities. However, less research has been performed on the benefits of core training for elite athletes and how this training should be carried out to optimize sporting performance. Many elite athletes undertake core stability and core strength training as part of their training programme, despite contradictory findings and conclusions as to their efficacy. This is mainly due to the lack of a gold standard method for measuring core stability and strength when performing everyday tasks and sporting movements. A further confounding factor is that because of the differing demands on the core musculature during everyday activities (low load, slow movements) and sporting activities (high load, resisted, dynamic movements), research performed in the rehabilitation sector cannot be applied to the sporting environment and, subsequently, data regarding core training programmes and their effectiveness on sporting performance are lacking. There are many articles in the literature that promote core training programmes and exercises for performance enhancement without providing a strong scientific rationale of their effectiveness, especially in the sporting sector. In the rehabilitation sector, improvements in lower back injuries have been reported by improving core stability. Few studies have observed any performance enhancement in sporting activities despite observing improvements in core stability and core strength following a core training programme. A clearer understanding of the roles that specific muscles have during core stability and core strength exercises would enable more functional training programmes to be implemented, which may result in a more effective transfer of these skills to actual sporting activities.
Improving Defense Health Program Medical Research Processes
2017-08-08
needed for DHP medical research, such as the Army's Clinical and Translational Research Program Office, the Navy's Research Methods Training Program... research stated, "key infrastructure for a learning health system will encompass three core elements: data networks, methods, and workforce." A 2012... Research Methods Training Program, which will be further discussed in Appendix D.2. AIR FORCE: Air Force Instruction 40-402, Protection of
ERIC Educational Resources Information Center
Engbers, Trent A
2016-01-01
The teaching of research methods has been at the core of public administration education for almost 30 years. But since 1990, this journal has published only two articles on the teaching of research methods. Given the increasing emphasis on data-driven decision-making, greater insight is needed into the best practices for teaching public…
Core outcome sets for research and clinical practice.
Chiarotto, Alessandro; Ostelo, Raymond W; Turk, Dennis C; Buchbinder, Rachelle; Boers, Maarten
This masterclass introduces the topic of core outcome sets, describing rationale and methods for developing them, and providing some examples that are relevant for clinical research and practice. A core outcome set is a minimum consensus-based set of outcomes that should be measured and reported in all clinical trials for a specific health condition and/or intervention. Issues surrounding outcome assessment, such as selective reporting and inconsistency across studies, can be addressed by the development of a core set. As suggested by key initiatives in this field (i.e. OMERACT and COMET), the development requires achieving consensus on: (1) core outcome domains and (2) core outcome measurement instruments. Different methods can be used to reach consensus, including: literature systematic reviews to inform the process, qualitative research with clinicians and patients, group discussions (e.g. nominal group technique), and structured surveys (e.g. Delphi technique). Various stakeholders should be involved in the process, with particular attention to patients. Several COSs have been developed for musculoskeletal conditions including a longstanding one for low back pain, IMMPACT recommendations on outcomes for chronic pain, and OMERACT COSs for hip, knee and hand osteoarthritis. There is a lack of COSs for neurological, geriatric, cardio-respiratory and pediatric conditions, therefore, future research could determine the value of developing COSs for these conditions. Copyright © 2017 Associação Brasileira de Pesquisa e Pós-Graduação em Fisioterapia. Publicado por Elsevier Editora Ltda. All rights reserved.
Influence of item distribution pattern and abundance on efficiency of benthic core sampling
Behney, Adam C.; O'Shaughnessy, Ryan; Eichholz, Michael W.; Stafford, Joshua D.
2014-01-01
Core sampling is a commonly used method to estimate benthic item density, but little information exists about factors influencing the accuracy and time-efficiency of this method. We simulated core sampling in a Geographic Information System framework by generating points (benthic items) and polygons (core samplers) to assess how sample size (number of core samples), core sampler size (cm2), distribution of benthic items, and item density affected the bias and precision of estimates of density, the detection probability of items, and the time costs. When items were distributed randomly versus clumped, bias decreased and precision increased with increasing sample size and increased slightly with increasing core sampler size. Bias and precision were only affected by benthic item density at very low values (500–1,000 items/m2). Detection probability (the probability of capturing ≥ 1 item in a core sample if it is available for sampling) was substantially greater when items were distributed randomly as opposed to clumped. Taking more small diameter core samples was always more time-efficient than taking fewer large diameter samples. We are unable to present a single, optimal sample size, but provide information for researchers and managers to derive optimal sample sizes dependent on their research goals and environmental conditions.
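A minimal Monte-Carlo analogue of the simulation described above might scatter items uniformly over a 1 m × 1 m plot, sample with fully contained circular cores, and estimate density from the core counts. Parameter names and values here are illustrative, not the study's configuration.

```python
import math
import random

def simulate_core_sampling(true_density, core_area_cm2, n_cores, seed=0):
    """Scatter `true_density` items uniformly on a 1 m x 1 m plot, draw
    `n_cores` circular core samples of `core_area_cm2`, and return the
    estimated density in items per m^2."""
    rng = random.Random(seed)
    items = [(rng.random(), rng.random()) for _ in range(true_density)]
    core_area_m2 = core_area_cm2 / 1e4            # cm^2 -> m^2
    radius = math.sqrt(core_area_m2 / math.pi)
    total = 0
    for _ in range(n_cores):
        # keep each core fully inside the plot to avoid edge effects
        cx = rng.uniform(radius, 1 - radius)
        cy = rng.uniform(radius, 1 - radius)
        total += sum((x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2
                     for x, y in items)
    return total / (n_cores * core_area_m2)

est = simulate_core_sampling(true_density=2000, core_area_cm2=50, n_cores=100)
```

Repeating the run over many seeds, densities, and clumped (rather than uniform) item patterns would reproduce the bias and precision comparisons the study reports.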
A suggested core content for education scholarship fellowships in emergency medicine.
Yarris, Lalena M; Coates, Wendy C; Lin, Michelle; Lind, Karen; Jordan, Jaime; Clarke, Sam; Guth, Todd A; Santen, Sally A; Hamstra, Stanley J
2012-12-01
A working group at the 2012 Academic Emergency Medicine consensus conference on education research in emergency medicine (EM) convened to develop a curriculum for dedicated postgraduate fellowships in EM education scholarship. This fellowship is intended to create future education scholars, equipped with the skills to thrive in academic careers. This proceedings article reports on the consensus of a breakout session subgroup tasked with defining a common core content for education scholarship fellowships. The authors propose that the core content of an EM education scholarship fellowship can be categorized in four distinct areas: career development, theories of learning and teaching methods, education research methods, and educational program administration. This core content can be incorporated into curricula for education scholarship fellowships in EM or other fields and can also be adapted for use in general medical education fellowships. © 2012 by the Society for Academic Emergency Medicine.
LUMA: A many-core, Fluid-Structure Interaction solver based on the Lattice-Boltzmann Method
NASA Astrophysics Data System (ADS)
Harwood, Adrian R. G.; O'Connor, Joseph; Sanchez Muñoz, Jonathan; Camps Santasmasas, Marta; Revell, Alistair J.
2018-01-01
The Lattice-Boltzmann Method at the University of Manchester (LUMA) project was commissioned to build a collaborative research environment in which researchers of all abilities can study fluid-structure interaction (FSI) problems in engineering applications from aerodynamics to medicine. It is built on the principles of accessibility, simplicity and flexibility. The LUMA software at the core of the project is a capable FSI solver with turbulence modelling and many-core scalability as well as a wealth of input/output and pre- and post-processing facilities. The software has been validated and several major releases benchmarked on supercomputing facilities internationally. The software architecture is modular and arranged logically using a minimal amount of object-orientation to maintain a simple and accessible software.
Beyond the Core: The Hot Topic(al) Alternative to the Survey-Based Introduction to Sociology Course.
Schwartz, Michael; Smith, R Tyson
2010-10-01
In the following paper we argue that the conventional "Introduction to Sociology" survey course should be restructured because such courses try to survey an unsurveyable body of knowledge and they do not teach the application of sociological research. The conventional intro course should be replaced with an intro course that surveys the types of social dynamics that sociologists typically research and the methods they use to do so. We propose a semester-long intro course with four case study learning-units that are chosen for their coverage of the underlying sociological dynamics, methods, and core concepts. We contend that case study learning-units which concentrate on topical issues and core sociological concepts are better suited for an introduction course.
An optical method for characterizing carbon content in ceramic pot filters.
Goodwin, J Y; Elmore, A C; Salvinelli, C; Reidmeyer, Mary R
2017-08-01
Ceramic pot filter (CPF) technology is a relatively common means of household water treatment in developing areas, and performance characteristics of CPFs have been characterized using production CPFs, experimental CPFs fabricated in research laboratories, and ceramic disks intended to be CPF surrogates. There is evidence that CPF manufacturers do not always fire their products according to best practices; the result is incomplete combustion of the pore-forming material and the creation of a carbon core in the final CPFs. Researchers seldom acknowledge the potential existence of carbon cores, and at least one CPF producer has postulated that the carbon may be beneficial in terms of final water quality, given the presence of activated carbon in consumer filters marketed in the Western world. An initial step in assessing the presence and impact of carbon cores is characterizing those cores. An optical method for characterizing carbon content, which may be more practical for producers than off-site laboratory analysis, has been developed and verified. The use of the optical method is demonstrated via preliminary disinfection and flow-rate studies, and the results indicate that the method may be of use in studying production kiln operation.
Teaching Note--Integrating a Social Justice Assignment Into a Research Methods Course
ERIC Educational Resources Information Center
Mapp, Susan C.
2013-01-01
Although social justice is a core value of social work, it can be more difficult to integrate into a research methods class. This article describes an assignment developed for a BSW one-semester research class that served the dual purpose of educating students about social justice as well as qualitative research. Students were instructed to…
ERIC Educational Resources Information Center
Hammad, Waheed; Hallinger, Philip
2017-01-01
This review of research analyzed topics, conceptual models and research methods employed in 62 EDLM studies from Arab societies published between 2000 and 2016. Systematic review methods were used to identify relevant studies published in nine core international EDLM journals. Quantitative analyses identified patterns within this set of Arab…
Brain Jogging Training to Improve Motivation and Learning Result of Tennis Skills
NASA Astrophysics Data System (ADS)
Tafaqur, M.; Komarudin; Mulyana; Saputra, M. Y.
2017-03-01
This research aims to determine the effect of brain jogging training on the improvement of motivation and the learning of tennis skills. The method used is experimental. The population is the 15 tennis athletes of the Core Siliwangi Bandung Tennis Club, from which a purposive sample of 10 athletes was taken. The design is a pretest-posttest group design. Data were analyzed with t-tests, measuring motivation with The Sport Motivation Scale questionnaire (SMS-28) and measuring the learning of tennis skills with a tennis skills test comprising (1) a forehand test, (2) a backhand test, and (3) a service placement test. The results show that brain jogging significantly improves motivation and the learning of tennis skills.
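The pretest-posttest comparison described above can be sketched with a paired-samples t statistic computed from hypothetical scores; the study's actual data are not given in the abstract.

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic and degrees of freedom for a
    pretest-posttest design (t = mean difference / its standard error)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n), n - 1

# Hypothetical skill scores for 10 athletes before and after training
pre  = [12, 15, 11, 14, 13, 10, 16, 12, 14, 13]
post = [15, 18, 14, 16, 15, 13, 18, 15, 17, 15]
t, df = paired_t(pre, post)
```

The resulting t value would then be compared against the critical value for df = n − 1 at the chosen significance level.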
Beyond the Core: The Hot Topic(al) Alternative to the Survey-Based Introduction to Sociology Course
Smith, R. Tyson
2011-01-01
In the following paper we argue that the conventional “Introduction to Sociology” survey course should be restructured because such courses try to survey an unsurveyable body of knowledge and they do not teach the application of sociological research. The conventional intro course should be replaced with an intro course that surveys the types of social dynamics that sociologists typically research and the methods they use to do so. We propose a semester-long intro course with four case study learning-units that are chosen for their coverage of the underlying sociological dynamics, methods, and core concepts. We contend that case study learning-units which concentrate on topical issues and core sociological concepts are better suited for an introduction course. PMID:21709825
Complexity, Methodology and Method: Crafting a Critical Process of Research
ERIC Educational Resources Information Center
Alhadeff-Jones, Michel
2013-01-01
This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…
ERIC Educational Resources Information Center
Arnott, Stephanie
2011-01-01
Over the last decade, almost 4,000 Canadian schools have moved to using the Accelerative Integrated Method (AIM) for core French (CF) instruction. Following researchers' recommendations (Brumfit, 1984; Lapkin, Mady, & Arnott, 2009; Larsen-Freeman, 1996, 2000; Prahbu, 1990), I am shifting the focus in this case study from product to process. In…
Core Training in Low Back Disorders: Role of the Pilates Method.
Joyce, Andrew A; Kotler, Dana H
The Pilates method is a system of exercises developed by Joseph Pilates, which emphasizes recruitment and strengthening of the core muscles, flexibility, and breathing, to promote stability and control of movement. Its focus bears similarity to current evidence-based exercise programs for low back disorders. Spinal stability is a function of three interdependent systems, osseoligamentous, muscular, and neural control; exercise addresses both the muscular and neural function. The "core" typically refers to the muscular control required to maintain functional stability. Prior research has highlighted the importance of muscular strength and recruitment, with debate over the importance of individual muscles in the wider context of core control. Though developed long before the current evidence, the Pilates method is relevant in this setting and clearly relates to current evidence-based exercise interventions. Current literature supports the Pilates method as a treatment for low back disorders, but its benefit when compared with other exercise is less clear.
Field testing of stiffened deep cement mixing piles under lateral cyclic loading
NASA Astrophysics Data System (ADS)
Raongjant, Werasak; Jing, Meng
2013-06-01
Construction of seaside and underground wall bracing often uses stiffened deep cement mixing columns (SDCM). This research investigates methods of improving the bearing capacity of SDCM piles subjected to cyclic lateral loading via various types of stiffening cores. Eight piles, two deep cement mixing (DCM) piles and six stiffened deep cement mixing piles with three different core types, prestressed concrete with an H-shaped cross section, steel pipe, and H-beam steel, were embedded through soft clay into medium-hard clay on site in Thailand. Cyclic horizontal loading was gradually applied until pile failure, and the hysteresis loops of lateral load vs. lateral deformation were recorded. The lateral carrying capacities of the SDCM piles with an H-beam steel core were 3-4 times those of the DCM piles. This field research clearly shows that using H-beam steel as a stiffening core for SDCM piles is the best of the tested methods for improving lateral carrying capacity, ductility, and energy dissipation capacity.
Francesca Monn, M; Wang, Ming-Hsien; Gilson, Marta M; Chen, Belinda; Kern, David; Gearhart, Susan L
2013-01-01
To determine the perceived effectiveness of surgical subspecialty training programs in teaching and assessing the 6 ACGME core competencies, including research. Cross-sectional survey. ACGME-approved training programs in pediatric urology and colorectal surgery. Program directors and recent trainees (2007-2009). A total of 39 program directors (60%) and 57 trainees (64%) responded. Both program directors and recent trainees reported a higher degree of training and mentorship (75%) in patient care and medical knowledge than in the other core competencies (p < 0.0001). Training in practice-based learning and improvement, interpersonal and communication skills, and professionalism was perceived as effective to a lesser degree. Specifically, in the areas of teaching residents and medical students and team building, program directors perceived training to be more effective than recent trainees did (p = 0.004, p = 0.04). Responses to questions assessing training in systems-based practice ubiquitously identified a lack of training, particularly in the financial matters of running a practice. Although recent trainees perceived effective training in research as lacking, 81% reported mentorship in this area. According to program directors and recent trainees, the most effective method of teaching was faculty supervision and feedback. Only 50% or fewer of the recent trainees reported mentorship in career planning, work-life balance, and job satisfaction. Not all 6 core competencies and research are being taught effectively in surgery subspecialty training programs, and mentorship in areas outside of patient care and research is lacking. Emphasis should be placed on faculty supervision and feedback when designing methods to better incorporate all 6 core competencies, research, and mentorship. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Introduction of biotin or folic acid into polypyrrole magnetite core-shell nanoparticles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nan, Alexandrina; Turcu, Rodica; Liebscher, Jürgen
2013-11-13
To contribute to the trend in contemporary research toward magnetic core-shell nanoparticles with better properties (reduced toxicity, high colloidal and chemical stability, wide scope of application) prepared by straightforward and reproducible methods, new core-shell magnetic nanoparticles were developed based on polypyrrole shells functionalized with biotin and folic acid. Magnetite nanoparticles stabilized by sebacic acid were used as magnetic cores. The morphology of the magnetite was determined by transmission electron microscopy (TEM), while the chemical structure was investigated by FT-IR.
Waters, Aoife Mi; Tudur Smith, Catrin; Young, Bridget; Jones, Terry M
2014-05-13
The incidence of oropharyngeal cancer is increasing in the developed world. This has led to a large rise in research activity and clinical trials in this area, yet there is no consensus on which outcomes should be measured. As a result, the outcomes measured often differ between trials of comparable interventions, making the combination or comparison of results between trials impossible. Outcomes may also be 'cherry-picked', such that favourable results are reported, and less favourable results withheld. The development of a minimum outcome reporting standard, known as a core outcome set, goes some way to addressing these problems. Core outcome sets are ideally developed using a patient-centred approach so that the outcomes measured are relevant to patients and clinical practice. Core outcome sets drive up the quality and relevance of research by ensuring that the right outcomes are consistently measured and reported in trials in specific areas of health or healthcare. This is a mixed methods study involving three phases to develop a core outcome set for oropharyngeal cancer clinical trials. Firstly, a systematic review will establish which outcomes are measured in published oropharyngeal cancer randomised controlled trials (RCTs). Secondly, qualitative interviews with patients and carers in the UK and the USA will aim to establish which outcomes are important to these stakeholders. Data from these first two stages will be used to develop a comprehensive list of outcomes to be considered for inclusion in the core outcome set. In the third stage, patients and clinicians will participate in an iterative consensus exercise known as a Delphi study to refine the contents of the core outcome set. This protocol lays out the methodology to be implemented in the CONSENSUS study. A core outcome set defines a minimum outcome reporting standard for clinical trials in a particular area of health or healthcare. 
Its consistent implementation in oropharyngeal cancer clinical trials will improve the quality and relevance of research. This study is registered at the National Institute for Health Research (NIHR) Clinical Research Network (CRN) portfolio, ID 13823 (17 January 2013).
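Delphi surveys of the kind described are commonly scored on a 1-9 scale, with an outcome entering the core set when a consensus threshold is met. A frequently used convention in core outcome set development (not necessarily this study's protocol) is that at least 70% of participants score the outcome 7-9 and no more than 15% score it 1-3. A sketch with made-up ratings:

```python
def consensus_in(scores):
    """Consensus-in rule: >= 70% of raters score 7-9 and <= 15% score 1-3.
    Thresholds follow a common core-outcome-set convention (assumed here)."""
    n = len(scores)
    high = sum(1 for s in scores if 7 <= s <= 9) / n
    low = sum(1 for s in scores if 1 <= s <= 3) / n
    return high >= 0.70 and low <= 0.15

# Illustrative ratings from 10 hypothetical stakeholders for two outcomes
survival = [9, 8, 9, 7, 8, 9, 9, 7, 8, 6]   # broad agreement
taste    = [2, 5, 7, 3, 4, 8, 2, 6, 5, 3]   # opinions spread out
print(consensus_in(survival), consensus_in(taste))
```

In an iterative Delphi, outcomes failing the rule are fed back with group summaries and re-rated in subsequent rounds rather than discarded outright.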
Matos, Sergio; Kapadia, Smiti; Islam, Nadia; Cusack, Arthur; Kwong, Sylvia; Trinh-Shevrin, Chau
2012-01-01
Objectives. Despite the importance of community health workers (CHWs) in strategies to reduce health disparities and the call to enhance their roles in research, little information exists on how to prepare CHWs involved in community–academic initiatives (CAIs). Therefore, the New York University Prevention Research Center piloted a CAI–CHW training program. Methods. We applied a core competency framework to an existing CHW curriculum and bolstered the curriculum to include research-specific sessions. We employed diverse training methods, guided by adult learning principles and popular education philosophy. Evaluation instruments assessed changes related to confidence, intention to use learned skills, usefulness of sessions, and satisfaction with the training. Results. Results demonstrated that a core competency–based training can successfully affect CHWs’ perceived confidence and intentions to apply learned content, and can provide a larger social justice context of their role and work. Conclusions. This program demonstrates that a core competency–based framework coupled with CAI-research–specific skill sessions (1) provides skills that CAI–CHWs intend to use, (2) builds confidence, and (3) provides participants with a more contextualized view of client needs and CHW roles. PMID:22594730
Developing, implementing and disseminating a core outcome set for neonatal medicine.
Webbe, James; Brunton, Ginny; Ali, Shohaib; Duffy, James Mn; Modi, Neena; Gale, Chris
2017-01-01
In high resource settings, 1 in 10 newborn babies requires admission to a neonatal unit. Research evaluating neonatal care involves recording and reporting many different outcomes and outcome measures. Such variation limits the usefulness of research, as studies cannot be compared or combined. To address these limitations, we aim to develop, disseminate and implement a core outcome set for neonatal medicine. A steering group that includes parents and former patients, healthcare professionals and researchers has been formed to guide the development of the core outcome set. We will review neonatal trials systematically to identify previously reported outcomes. Additionally, we will specifically identify outcomes of importance to parents, former patients and healthcare professionals through a systematic review of qualitative studies. Outcomes identified will be entered into an international, multi-perspective eDelphi survey. All key stakeholders will be invited to participate. The Delphi method will encourage individual and group stakeholder consensus to identify a core outcome set. The core outcome set will be mapped to existing, routinely recorded data where these exist. Use of a core set will ensure that outcomes of importance to key stakeholders, including former patients and parents, are recorded and reported in a standard fashion in future research. Embedding the core outcome set within future clinical studies will extend the usefulness of research to inform practice, enhance patient care and ultimately improve outcomes. Using routinely recorded electronic data will facilitate implementation with minimal additional burden. Core Outcome Measures in Effectiveness Trials (COMET) database: 842 (www.comet-initiative.org/studies/details/842).
Thulesius, Hans; Barfod, Toke; Ekström, Helene; Håkansson, Anders
2004-09-30
Grounded theory (GT) is a popular research method for exploring human behavior. GT was developed by the medical sociologists Glaser and Strauss while they studied dying in hospitals in the 1960s resulting in the book "Awareness of dying". The goal of a GT is to generate conceptual theories by using all types of data but without applying existing theories and hypotheses. GT procedures are mostly inductive as opposed to deductive research where hypotheses are tested. A good GT has a core variable that is a central concept connected to many other concepts explaining the main action in the studied area. A core variable answers the question "What's going on?". Examples of core variables are: "Cutting back after a heart attack"--how people adapt to life after a serious illness; and "Balancing in palliative cancer care"--a process of weighing, shifting, compensating and compromising when treating people with a progressive and incurable illness trajectory.
System-level protection and hardware Trojan detection using weighted voting.
Amin, Hany A M; Alkabani, Yousra; Selim, Gamal M I
2014-07-01
The problem of hardware Trojans is becoming more serious, especially with the spread of fabless design houses and design reuse. Hardware Trojans can be embedded on chip during manufacturing or in third-party intellectual property cores (IPs) during the design process. Recent research detects Trojans embedded at manufacturing time by comparing the suspected chip with a golden chip that is fully trusted. However, Trojan detection in third-party IP cores is more challenging than in other logic modules, especially as there is no golden chip. This paper proposes a new methodology to detect/prevent hardware Trojans in third-party IP cores. The method works by gradually building trust in suspected IP cores by comparing the outputs of different untrusted implementations of the same IP core. Simulation results show that our method achieves a higher probability of Trojan detection than a naive implementation of simple voting on the outputs of different IP cores. In addition, experimental results show that the proposed method requires less hardware overhead than a simple voting technique achieving the same degree of security.
Reconstruction of a digital core containing clay minerals based on a clustering algorithm.
He, Yanlong; Pu, Chunsheng; Jing, Cheng; Gu, Xiaoyu; Chen, Qingdong; Liu, Hongzhi; Khan, Nasir; Dong, Qiaoling
2017-10-01
It is difficult to obtain core samples and the information needed for digital core reconstruction of mature sandstone reservoirs around the world, especially for unconsolidated sandstone reservoirs. The reconstruction and division of clay minerals play a vital role in digital core reconstruction, and two-dimensional data-based reconstruction methods are well suited to simulating the microstructure of sandstone reservoirs; however, reconstructing the various clay minerals in digital cores remains a research challenge. In the present work, the content of clay minerals was considered on the basis of two-dimensional information about the reservoir. After application of the hybrid method, and compared with a model reconstructed by the process-based method, the output was a digital core containing clay clusters without labels for the clusters' number, size, and texture. The statistics and geometry of the reconstructed model were similar to those of the reference model. The Hoshen-Kopelman algorithm was then used to label the connected, unclassified clay clusters in the initial model, and the number and size of the clay clusters were recorded. At the same time, the K-means clustering algorithm was applied to divide the labeled, large connected clusters into smaller clusters on the basis of differences in the clusters' characteristics. According to the clay minerals' characteristics, such as types, textures, and distributions, the digital core containing clay minerals was reconstructed by means of the clustering algorithm and a judgment of the clay clusters' structure. The distributions and textures of the clay minerals in the digital core were reasonable. The clustering algorithm improved the digital core reconstruction and provides an alternative method for simulating different clay minerals in digital cores.
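The first stage of the pipeline described, labeling connected clay clusters, is a connected-component labeling problem; the Hoshen-Kopelman algorithm is essentially union-find over a grid. A minimal 2-D sketch in that spirit (toy data, not the paper's implementation, which works on reconstructed core volumes):

```python
def label_clusters(grid):
    """Union-find connected-component labeling of occupied cells
    (4-connectivity), in the spirit of the Hoshen-Kopelman algorithm."""
    parent = {}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    rows, cols = len(grid), len(grid[0])
    for r in range(rows):
        for c in range(cols):
            if grid[r][c]:
                parent[(r, c)] = (r, c)
                if r > 0 and grid[r - 1][c]:
                    union((r, c), (r - 1, c))
                if c > 0 and grid[r][c - 1]:
                    union((r, c), (r, c - 1))

    labels, out = {}, [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c]:
                root = find((r, c))
                labels.setdefault(root, len(labels) + 1)
                out[r][c] = labels[root]
    return out, len(labels)

# Toy binary slice of a digital core: 1 = clay voxel
clay = [[1, 1, 0, 0],
        [0, 1, 0, 1],
        [0, 0, 0, 1],
        [1, 0, 0, 0]]
labeled, n = label_clusters(clay)
print(n)  # number of connected clay clusters
```

In the paper's workflow the resulting labeled clusters (with their sizes and positions recorded) would then be subdivided by K-means where a single label spans clusters with differing characteristics.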
ERIC Educational Resources Information Center
Barton, Erin E.; Pustejovsky, James E.; Maggin, Daniel M.; Reichow, Brian
2017-01-01
The adoption of methods and strategies validated through rigorous, experimentally oriented research is a core professional value of special education. We conducted a systematic review and meta-analysis examining the experimental literature on Technology-Aided Instruction and Intervention (TAII) using research identified as part of the National…
ERIC Educational Resources Information Center
Horner, Jennifer; Minifie, Fred D.
2011-01-01
Purpose: In this series of articles--"Research Ethics I", "Research Ethics II", and "Research Ethics III"--the authors provide a comprehensive review of the 9 core domains for the responsible conduct of research (RCR) as articulated by the Office of Research Integrity. Method: In "Research Ethics III", they review the RCR domains of publication…
Canada's neglected tropical disease research network: who's in the core-who's on the periphery?
Phillips, Kaye; Kohler, Jillian Clare; Pennefather, Peter; Thorsteinsdottir, Halla; Wong, Joseph
2013-01-01
This study designed and applied accessible yet systematic methods to generate baseline information about the patterns and structure of Canada's neglected tropical disease (NTD) research network, a network that, until recently, was formed and functioned on the periphery of strategic Canadian research funding. Multiple methods were used to conduct this study, including: (1) a systematic bibliometric procedure to capture archival NTD publications and co-authorship data; (2) a country-level "core-periphery" network analysis to measure and map the structure of Canada's NTD co-authorship network including its size, density, cliques, and centralization; and (3) a statistical analysis to test the correlation between the position of countries in Canada's NTD network ("k-core measure") and the quantity and quality of research produced. Over the past sixty years (1950-2010), Canadian researchers have contributed to 1,079 NTD publications, specializing in Leishmania, African sleeping sickness, and leprosy. Of this work, 70% of all first authors and co-authors (n = 4,145) have been Canadian. Since the 1990s, however, a network of international co-authorship activity has been emerging, with representation of researchers from 62 different countries; largely researchers from OECD countries (e.g. United States and United Kingdom) and some non-OECD countries (e.g. Brazil and Iran). Canada has a core-periphery NTD international research structure, with a densely connected group of OECD countries and some African nations, such as Uganda and Kenya. Sitting predominantly on the periphery of this research network is a cluster of 16 non-OECD nations that fall within the lowest GDP percentile of the network.
The publication specialties, composition, and position of NTD researchers within Canada's NTD country network provide evidence that, while Canadian researchers currently remain the overall gatekeepers of the NTD research they generate, there is opportunity to leverage existing research collaborations and help advance regions and NTD areas that are currently under-developed.
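The "k-core measure" used above has a standard graph-theoretic definition: a node's core number is the largest k for which it belongs to the k-core, the maximal subgraph in which every node has degree at least k. The classic peeling algorithm computes it directly; here is a sketch on a toy co-authorship network (illustrative data, not the study's):

```python
def core_numbers(adj):
    """Peel minimum-degree nodes to obtain each node's core number:
    the largest k such that the node lies in the k-core."""
    degree = {v: len(ns) for v, ns in adj.items()}
    remaining = set(adj)
    core, k = {}, 0
    while remaining:
        v = min(remaining, key=lambda x: degree[x])
        k = max(k, degree[v])   # core numbers are non-decreasing as we peel
        core[v] = k
        remaining.remove(v)
        for u in adj[v]:
            if u in remaining:
                degree[u] -= 1
    return core

# Toy network: a densely connected trio plus one peripheral partner
adj = {
    "Canada": ["USA", "UK", "Uganda"],
    "USA":    ["Canada", "UK"],
    "UK":     ["Canada", "USA"],
    "Uganda": ["Canada"],
}
cores = core_numbers(adj)
print(cores)  # the trio sits in the 2-core; the pendant node has k = 1
```

Correlating these per-country core numbers with publication counts and citation quality is the kind of statistical test the study's third method describes.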
Digital Core Modelling for Clastic Oil and Gas Reservoir
NASA Astrophysics Data System (ADS)
Belozerov, I.; Berezovsky, V.; Gubaydullin, M.; Yur’ev, A.
2018-05-01
"Digital core" is a multi-purpose tool for solving a variety of tasks in the field of geological exploration and production of hydrocarbons at various stages, designed to improve the accuracy of geological study of subsurface resources, the efficiency of reproduction and use of mineral resources, as well as applying the results obtained in production practice. The actuality of the development of the "Digital core" software is that even a partial replacement of natural laboratory experiments with mathematical modelling can be used in the operative calculation of reserves in exploratory drilling, as well as in the absence of core material from wells. Or impossibility of its research by existing laboratory methods (weakly cemented, loose, etc. rocks). 3D-reconstruction of the core microstructure can be considered as a cheap and least time-consuming method for obtaining petrophysical information about the main filtration-capacitive properties and fluid motion in reservoir rocks.
Instructional Practices: A Qualitative Study on the Response to Common Core Standardized Testing
ERIC Educational Resources Information Center
Hightower, Gabrielle
2017-01-01
The purpose of this qualitative study was to examine the instructional practices implemented by Tennessee elementary teachers in response to Common Core Standardized Testing. This research study utilized a basic qualitative method that included a purposive and convenient sampling. This qualitative study focused on face-to-face interviews, phone…
Writing Bragg Gratings in Multicore Fibers.
Lindley, Emma Y; Min, Seong-Sik; Leon-Saval, Sergio G; Cvetojevic, Nick; Lawrence, Jon; Ellis, Simon C; Bland-Hawthorn, Joss
2016-04-20
Fiber Bragg gratings in multicore fibers can be used as compact and robust filters in astronomical and other research and commercial applications. Strong suppression at a single wavelength requires that all cores have matching transmission profiles. These gratings cannot be inscribed using the same method as for single-core fibers because the curved surface of the cladding acts as a lens, focusing the incoming UV laser beam and causing variations in exposure between cores. Therefore we use an additional optical element to ensure that the beam shape does not change while passing through the cross-section of the multicore fiber. This consists of a glass capillary tube which has been polished flat on one side, which is then placed over the section of the fiber to be inscribed. The laser beam enters the fiber through the flat surface of the capillary tube and hence maintains its original dimensions. This paper demonstrates the improvements in core-to-core uniformity for a 7-core fiber using this method. The technique can be generalized to larger multicore fibers.
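A fiber Bragg grating reflects at the Bragg wavelength λ_B = 2·n_eff·Λ, where n_eff is the core's effective refractive index and Λ the grating period, which is why matched transmission profiles across cores require matched exposure. A quick illustrative computation with typical silica-fiber values (assumed, not taken from the paper):

```python
n_eff = 1.447       # typical effective index of a silica fiber core (assumed)
period_nm = 535.0   # grating period in nm (assumed)

# Bragg condition: lambda_B = 2 * n_eff * period
bragg_nm = 2 * n_eff * period_nm
print(f"Bragg wavelength ~ {bragg_nm:.2f} nm")  # lands near the 1550 nm band
```

A small core-to-core variation in effective index shifts λ_B proportionally, so even per-mille nonuniformity in UV exposure can detune one core's rejection notch from the others'.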
Expedient Spall Repair Methods and Equipment for Airfield Pavements Preprint
2009-08-01
placement (3). RESEARCH OBJECTIVES AND SCOPE The objective of this research was to develop one or more methods that will allow field personnel to...cores were used to perform in-situ tensile pull-off tests to evaluate the bond between the repair material and the substrate. Also, a series of 4...inch diameter cores were cut, and direct shear tests were performed on the repair material/substrate interface. Finally, all spalls were trafficked for
Adaptive control method for core power control in TRIGA Mark II reactor
NASA Astrophysics Data System (ADS)
Sabri Minhat, Mohd; Selamat, Hazlina; Subha, Nurul Adilla Mohd
2018-01-01
The 1 MWth Reactor TRIGA PUSPATI (RTP), a Mark II type, has undergone more than 35 years of operation. The existing core power control uses a feedback control algorithm (FCA). It is challenging to keep the core power stable at the desired value within acceptable error bands to meet the safety demands of RTP, owing to the sensitivity of nuclear research reactor operation. The current power tracking performance is unsatisfactory and can be improved. Therefore, a new core power controller design is important for improving tracking performance and regulating reactor power by controlling the movement of the control rods. In this paper, adaptive controllers, specifically Model Reference Adaptive Control (MRAC) and Self-Tuning Control (STC), were applied to the control of the core power. The model for core power control was based on mathematical models of the reactor core, an adaptive controller model, and control rod selection programming. The mathematical models of the reactor core comprised a point kinetics model, thermal hydraulic models, and reactivity models. The adaptive control law was derived using the Lyapunov method to ensure a stable closed-loop system, and the STC Generalised Minimum Variance (GMV) controller does not require exact knowledge of the plant transfer function when designing the core power control. The performance of the proposed adaptive control and the FCA were compared via computer simulation; the results demonstrate the effectiveness and good performance of the proposed control method for core power control.
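The MRAC idea can be illustrated with the classic MIT-rule adaptation law for a first-order plant: a feedforward gain is adjusted online until the plant output tracks a reference model. This is a generic textbook sketch under assumed plant parameters, not the RTP reactor model or the paper's Lyapunov design:

```python
# MIT-rule MRAC, Euler-integrated (illustrative first-order plant):
#   plant:  dy/dt  = -y + k*u, with unknown gain k
#   model:  dym/dt = -ym + r
#   control: u = theta * r, adapted so y follows ym
k, gamma, dt = 2.0, 0.5, 0.01   # unknown gain, adaptation rate, step (assumed)
y = ym = theta = 0.0
r = 1.0                          # constant reference (stands in for a setpoint)
for _ in range(10000):           # 100 s of simulated time
    u = theta * r
    y += dt * (-y + k * u)
    ym += dt * (-ym + r)
    e = y - ym
    theta += dt * (-gamma * e * ym)   # MIT adaptation law
print(round(theta, 3), round(abs(y - ym), 4))
```

The adapted gain converges toward 1/k, so tracking is achieved without ever identifying k explicitly, which is the property that motivates adaptive control when the plant (here, a reactor core) is hard to model exactly.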
Cone penetration test for facies study: a review
NASA Astrophysics Data System (ADS)
Satriyo, N. A.; Soebowo, E.
2018-02-01
Engineering geology investigation using the cone penetration test with pore-pressure measurement (CPTu) is one of the most effective methods for characterizing subsurface layers. The method is generally used in Late Quaternary and similar deposits and can also be used for sedimentological purposes. CPTu soundings combined with drill cores have been used in much research on high-resolution subsurface stratigraphy. These combined data can also be used to make detailed correlations of subsurface stratigraphy, to identify facies changes, and to support sequence-stratigraphic interpretation. Research determining facies distribution from quantitative CPTu profiles is rarely done, especially in Indonesia, which has a different climate; meanwhile, drill core description using grain size analysis provides validation of the physical lithological characteristics developed in the research area. The interpretation uses the CPTu curve pattern, with the cone resistance parameter correlated against the physical characteristics of the drill core. The cone resistance gives the strength of the sediment layer and separates the data ranges of clay and sand. The review shows that each developing facies presents a specific curve pattern, and each sediment deposit facies can be determined from the CPTu curve profile. Although research using these methods is quite comprehensive, this review considers each method in relation to chronology on the geological timescale and the differing sediment characteristics of different locations.
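The clay/sand discrimination from cone resistance described above can be caricatured as a threshold classification of a qc profile. The cutoffs below are purely illustrative; real practice uses soil behaviour type charts (e.g. Robertson's) that combine cone resistance, sleeve friction, and pore pressure rather than a single qc value:

```python
def classify_layer(qc_mpa):
    """Toy facies label from cone resistance qc (MPa).
    Cutoffs are illustrative assumptions, not a published correlation."""
    if qc_mpa < 1.5:
        return "clay"
    elif qc_mpa < 5.0:
        return "silt/mixed"
    return "sand"

# Hypothetical qc readings every 0.5 m down a sounding
depths = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
qc =     [0.8, 0.9, 2.3, 4.1, 8.5, 9.2]
profile = [(d, classify_layer(q)) for d, q in zip(depths, qc)]
print(profile)
```

Calibrating such thresholds against the grain size analysis of adjacent drill cores is exactly the CPTu-to-core correlation step the review discusses.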
ERIC Educational Resources Information Center
Sabey, Abigail; Horrocks, Sue
2011-01-01
Research methods modules have become a core component of a range of nursing and allied health professional educational programmes both at pre-qualifying, undergraduate level and at post-qualifying and Masters' level, in keeping with requirements of professional bodies. These courses are offered both on a full time basis and part time for qualified…
Khalil, Asma; Perry, Helen; Duffy, James; Reed, Keith; Baschat, Ahmet; Deprest, Jan; Hecher, Kurt; Lewi, Liesbeth; Lopriore, Enrico; Oepkes, Dick
2017-07-14
Twin-Twin Transfusion Syndrome (TTTS) is associated with an increased risk of perinatal mortality and morbidity. Several treatment interventions have been described for TTTS, including fetoscopic laser surgery, amnioreduction, septostomy, expectant management, and pregnancy termination. Over the last decade, fetoscopic laser surgery has become the primary treatment. The literature to date reports on many different outcomes, making it difficult to compare results or combine data from individual studies, limiting the value of research to guide clinical practice. With the advent and ongoing development of new therapeutic techniques, this is more important than ever. The development and use of a core outcome set has been proposed to address these issues, prioritising outcomes important to the key stakeholders, including patients. We aim to produce, disseminate, and implement a core outcome set for TTTS. An international steering group has been established to oversee the development of this core outcome set. This group includes healthcare professionals, researchers and patients. A systematic review is planned to identify previously reported outcomes following treatment for TTTS. Following completion, the identified outcomes will be evaluated by stakeholders using an international, multi-perspective online modified Delphi method to build consensus on core outcomes. This method encourages the participants towards consensus 'core' outcomes. All key stakeholders will be invited to participate. The steering group will then hold a consensus meeting to discuss results and form a core outcome set to be introduced and measured. Once core outcomes have been agreed, the next step will be to determine how they should be measured, disseminated, and implemented within an international context. The development, dissemination, and implementation of a core outcome set in TTTS will enable its use in future clinical trials, systematic reviews and clinical practice guidelines. 
This is likely to advance the quality of research studies and their effective use in guiding clinical practice and improving patient care, maternal and short-term perinatal outcomes, and long-term neurodevelopmental outcomes. Core Outcome Measures in Effectiveness Trials (COMET) database: 921. Registered in July 2016. International Prospective Register of Systematic Reviews (PROSPERO): CRD42016043999. Registered on 2 August 2016.
The application of mixed methods designs to trauma research.
Creswell, John W; Zhang, Wanqing
2009-12-01
Despite the use of quantitative and qualitative data in trauma research and therapy, mixed methods studies in this field have not been analyzed to help researchers designing investigations. This discussion begins by reviewing four core characteristics of mixed methods research in the social and human sciences. Combining these characteristics, the authors focus on four select mixed methods designs that are applicable in trauma research. These designs are defined and their essential elements noted. Applying these designs to trauma research, a search was conducted to locate mixed methods trauma studies. From this search, one sample study was selected, and its characteristics of mixed methods procedures noted. Finally, drawing on other mixed methods designs available, several follow-up mixed methods studies were described for this sample study, enabling trauma researchers to view design options for applying mixed methods research in trauma investigations.
The Temporal Fabric of Research Methods: Posthuman Social Science and the Digital Data Deluge
ERIC Educational Resources Information Center
de Freitas, Elizabeth
2017-01-01
The aim of this paper is to adumbrate methods more suitable to a posthuman social science, so as to better attend to the digital datafication of life. Five core functions of research method are presented. The first three--the desire for origins, the need to exclude, and the establishment of a regime of labour--often reinstate social orders and…
Research on Shock Responses of Three Types of Honeycomb Cores
NASA Astrophysics Data System (ADS)
Peng, Fei; Yang, Zhiguang; Jiang, Liangliang; Ren, Yanting
2018-03-01
The shock responses of three kinds of honeycomb cores have been investigated and analyzed based on explicit dynamics analysis. According to the real geometric configuration and the current main manufacturing methods of aluminum alloy honeycomb cores, finite element models of honeycomb cores with three different cellular configurations (a conventional hexagonal honeycomb core, a rectangular honeycomb core, and an auxetic honeycomb core with negative Poisson's ratio) were established through an FEM parametric modeling method based on Python and Abaqus. To highlight the impact response characteristics of these three honeycomb cores, a 5 mm thick panel of the same mass and material was used for comparison. The analysis showed that, under a longitudinal pulse pressure load with a peak value of 1 MPa and a pulse width of 1 μs, the peak values of the longitudinal acceleration history curves of the three honeycomb cores were lower than those of the aluminum alloy panel at all three reference points. It can be concluded that, due to the complex reflection and diffraction of the shock-induced stress wave in honeycomb structures, the impact energy was redistributed, which lowered the peak longitudinal accelerations at the measuring points of the honeycomb cores relative to the panel.
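The parametric-modelling step named in the abstract (Python-driven geometry for Abaqus) can be illustrated with a hedged, plain-Python sketch of the conventional hexagonal cell geometry. The Abaqus-specific API calls, materials, and mesh definitions are omitted, and all dimensions here are hypothetical:

```python
import math

def hexagon_vertices(side, cx=0.0, cy=0.0):
    """Six (x, y) vertices of a regular hexagonal cell of the
    given side length, centred at (cx, cy) (flat-top orientation)."""
    return [(cx + side * math.cos(math.radians(60 * k)),
             cy + side * math.sin(math.radians(60 * k)))
            for k in range(6)]

def honeycomb_centres(side, n_cols, n_rows):
    """Cell-centre lattice for a flat-top hexagonal tiling:
    columns are spaced 1.5*side apart and odd columns are shifted
    up by half the row pitch of sqrt(3)*side, so that adjacent
    cell centres are sqrt(3)*side apart."""
    pitch_y = math.sqrt(3.0) * side
    centres = []
    for c in range(n_cols):
        for r in range(n_rows):
            x = 1.5 * side * c
            y = pitch_y * r + (pitch_y / 2.0 if c % 2 else 0.0)
            centres.append((x, y))
    return centres
```

A rectangular or auxetic (re-entrant) core would substitute a different vertex function and lattice pitch; the parametric idea, generating cell geometry from a handful of dimensions, is the same.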
Three-dimensional discrete element method simulation of core disking
NASA Astrophysics Data System (ADS)
Wu, Shunchuan; Wu, Haoyan; Kemeny, John
2018-04-01
The phenomenon of core disking is commonly seen in deep drilling of highly stressed regions in the Earth's crust. Given its close relationship with the in situ stress state, the presence and features of core disking can be used to interpret the stresses when traditional in situ stress measuring techniques are not available. The core disking process was simulated in this paper using the three-dimensional discrete element method software PFC3D (particle flow code). In particular, PFC3D is used to examine the evolution of fracture initiation, propagation, and coalescence associated with core disking under various stress states. Four unresolved problems concerning core disking are investigated with a series of numerical simulations, which also provide some verification of existing results by other researchers: (1) Core disking occurs when the maximum principal stress is about 6.5 times the tensile strength. (2) For most stress situations, core disking occurs from the outer surface, except in the thrust faulting stress regime, where the fractures were found to initiate from the inner part. (3) The anisotropy of the two horizontal principal stresses has an effect on the core disking morphology. (4) The thickness of the core disks has a positive relationship with the radial stress and a negative relationship with the axial stress.
Knowledge Economy Core Journals: Identification through LISTA Database Analysis.
Nouri, Rasool; Karimi, Saeed; Ashrafi-rizi, Hassan; Nouri, Azadeh
2013-03-01
Knowledge economy has become an increasingly broad field over the years, and identification of its core journals can be useful to librarians in the journal selection process and to researchers in choosing where to publish their articles. The present research attempts to determine the core journals of the knowledge economy indexed in LISTA (Library, Information Science and Technology Abstracts). The research method was bibliometric, and the research population included the journals indexed in LISTA (from its start until the beginning of 2011) with at least one article about "knowledge economy". For data collection, keywords related to "knowledge economy", extracted from the literature in this area, were searched in LISTA using the title, keyword, and abstract fields, aided by the LISTA thesaurus. Using this search strategy, 1,608 articles from 390 journals were retrieved. The retrieved records were imported into an Excel sheet, the journals were grouped, and Bradford's coefficient was measured for each group. Finally, the average of the Bradford coefficients was calculated, and core journals in the subject area of "knowledge economy" were determined using Bradford's formula. By Bradford's law of scattering, 15 journals with the highest publication rates were identified as "knowledge economy" core journals indexed in LISTA. In this list, "Library and Information Update", with 64 articles, was at the top; "ASLIB Proceedings" and "Serials", with 51 and 40 articles, were next in rank. A further 41 journals were identified as beyond the core, with "Library Hi Tech" (20 articles) at the top of that list. The increased importance of the knowledge economy has led to growth in the production of articles in this subject area, so evaluating and ranking journals becomes a challenging task for librarians; a core journal list provides a useful tool for journal selection and for quick, easy access to information.
The core and beyond-core journal lists obtained from this study can be used by librarians and researchers in this field.
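The zone-partitioning idea behind Bradford's law of scattering can be sketched as follows. This is a simplified, assumed form of the study's procedure (which also averages Bradford coefficients over groups), and the journal names and counts in any example are hypothetical:

```python
def bradford_zones(journal_counts, n_zones=3):
    """Partition journals, ranked by descending article count, into
    n_zones zones that each hold roughly an equal share of the total
    articles (Bradford's law of scattering).  The first zone, which
    contains the fewest, most productive journals, is the candidate
    'core' journal list."""
    ranked = sorted(journal_counts.items(), key=lambda kv: -kv[1])
    total = sum(count for _, count in ranked)
    target = total / n_zones  # articles per zone, ideally
    zones, zone, cumulative = [], [], 0
    for name, count in ranked:
        zone.append(name)
        cumulative += count
        if cumulative >= target and len(zones) < n_zones - 1:
            zones.append(zone)
            zone, cumulative = [], 0
    zones.append(zone)
    return zones
```

With hypothetical counts such as {"A": 64, "B": 51, "C": 40, "D": 20, "E": 15, "F": 10}, the first zone holds the two most productive journals, mirroring how a short core list emerges from a long tail of scattered titles.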
A protocol for developing, disseminating, and implementing a core outcome set for pre-eclampsia.
Duffy, James M N; van 't Hooft, Janneke; Gale, Chris; Brown, Mark; Grobman, William; Fitzpatrick, Ray; Karumanchi, S Ananth; Lucas, Nuala; Magee, Laura; Mol, Ben; Stark, Michael; Thangaratinam, Shakila; Wilson, Mathew; von Dadelszen, Peter; Williamson, Paula; Khan, Khalid S; Ziebland, Sue; McManus, Richard J
2016-10-01
Pre-eclampsia is a serious complication of pregnancy and contributes to maternal and offspring mortality and morbidity. Randomised controlled trials evaluating therapeutic interventions for pre-eclampsia have reported many different outcomes and outcome measures. Such variation contributes to an inability to compare, contrast, and combine individual studies, limiting the usefulness of research to inform clinical practice. The development and use of a core outcome set would help to address these issues, ensuring outcomes important to all stakeholders, including patients, will be collected and reported in a standardised fashion. An international steering group including healthcare professionals, researchers, and patients has been formed to guide the development of this core outcome set. Potential outcomes will be identified through a comprehensive literature review and semi-structured interviews with patients. Potential core outcomes will be entered into an international, multi-perspective online Delphi survey. All key stakeholders, including healthcare professionals, researchers, and patients, will be invited to participate. The modified Delphi method encourages convergence towards consensus 'core' outcomes, both across the whole group and within each stakeholder group. Once core outcomes have been agreed upon, it is important to determine how they should be measured. The truth, discrimination, and feasibility assessment framework will assess the quality of potential outcome measures. High-quality outcome measures will be associated with core outcomes. Mechanisms exist to disseminate and implement the resulting core outcome set within an international context. Embedding the core outcome set within future clinical trials, systematic reviews, and clinical practice guidelines could make a profound contribution to advancing the usefulness of research to inform clinical practice, enhance patient care, and improve maternal and offspring outcomes.
The infrastructure created by developing a core outcome set for pre-eclampsia could be leveraged in other settings, for example selecting research priorities and clinical practice guideline development. PROSPECTIVE REGISTRATION: [1] Core Outcome Measures in Effectiveness Trials (COMET) registration number: 588. [2] International Prospective Register of Systematic Reviews (PROSPERO) registration number: CRD42015015529. Copyright © 2016 International Society for the Study of Hypertension in Pregnancy. Published by Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Deiglmayr, Anne
2018-01-01
Formative peer assessment is an instructional method that offers many opportunities to foster students' learning with respect to both the domain of the core task and students' assessment skills. The contributions to this special issue effectively address earlier calls for more research into instructional scaffolds and the implementation of…
Peer-Assisted Learning in Research Methods and Statistics
ERIC Educational Resources Information Center
Stone, Anna; Meade, Claire; Watling, Rosamond
2012-01-01
Feedback from students on a Level 1 Research Methods and Statistics module, studied as a core part of a BSc Psychology programme, highlighted demand for additional tutorials to help them to understand basic concepts. Students in their final year of study commonly request work experience to enhance their employability. All students on the Level 1…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsunoda, Hirokazu; Sato, Osamu; Okajima, Shigeaki
2002-07-01
In order to achieve fully automated reactor operation of the RAPID-L reactor, the innovative reactivity control systems LEM, LIM, and LRM use lithium-6 as a liquid poison. Because lithium-6 has not been used as a neutron-absorbing material in conventional fast reactors, measurements of the reactivity worth of lithium-6 were performed at the Fast Critical Assembly (FCA) of the Japan Atomic Energy Research Institute (JAERI). The FCA core was composed of highly enriched uranium and stainless steel samples so as to simulate the core spectrum of RAPID-L. Samples of 95% enriched lithium-6 were inserted into the core parallel to the core axis for the measurement of the reactivity worth at each position. It was found that the measured reactivity worth in the core region agreed well with the value calculated by the method used for the core designs of RAPID-L. Bias factors for the core design method were obtained by comparing experimental and calculated results. The factors were used to determine the number of LEMs and LIMs needed in the core to achieve fully automated operation of RAPID-L. (authors)
Developing Students, Developing Faculty: Incompatible or Compatible Goals?
ERIC Educational Resources Information Center
Ware, Mark E.; Davis, Stephen F.; Smith, Randolph A.
Grounding students in research methodology is at the core of the undergraduate curriculum. Students usually conduct individual projects in the experimental psychology or research methods courses, and most undergraduate courses in the psychology curriculum contain a strong research component. The opportunities and benefits for undergraduate student…
Beuscart, Jean-Baptiste; Dalleur, Olivia; Boland, Benoit; Thevelin, Stefanie; Knol, Wilma; Cullinan, Shane; Schneider, Claudio; O'Mahony, Denis; Rodondi, Nicolas; Spinewine, Anne
2017-01-01
Medication review has been advocated to address the challenge of polypharmacy in older patients, yet there is no consensus on how best to evaluate its efficacy. Heterogeneity of outcomes reported in clinical trials can hinder the comparison of clinical trial findings in systematic reviews. Moreover, the outcomes that matter most to older patients might be under-reported or disregarded altogether. A core outcome set can address this issue as it defines a minimum set of outcomes that should be reported in all clinical trials in any particular field of research. As part of the European Commission-funded project, called OPtimising thERapy to prevent Avoidable hospital admissions in the Multimorbid elderly, this paper describes the methods used to develop a core outcome set for clinical trials of medication review in older patients with multimorbidity. The study was designed in several steps. First, a systematic review established which outcomes were measured in published and ongoing clinical trials of medication review in older patients. Second, we undertook semistructured interviews with older patients and carers aimed at identifying additional relevant outcomes. Then, a multilanguage European Delphi survey adapted to older patients was designed. The international Delphi survey was conducted with older patients, health care professionals, researchers, and clinical experts in geriatric pharmacotherapy to validate outcomes to be included in the core outcome set. Consensus meetings were conducted to validate the results. We present the method for developing a core outcome set for medication review in older patients with multimorbidity. This study protocol could be used as a basis to develop core outcome sets in other fields of geriatric research.
Neutronics calculation of RTP core
NASA Astrophysics Data System (ADS)
Rabir, Mohamad Hairie B.; Zin, Muhammad Rawi B. Mohamed; Karim, Julia Bt. Abdul; Bayar, Abi Muttaqin B. Jalal; Usang, Mark Dennis Anak; Mustafa, Muhammad Khairul Ariff B.; Hamzah, Na'im Syauqi B.; Said, Norfarizan Bt. Mohd; Jalil, Muhammad Husamuddin B.
2017-01-01
Reactor calculation and simulation are significantly important to ensure safety and better utilization of a research reactor. Malaysia's PUSPATI TRIGA Reactor (RTP) achieved initial criticality on June 28, 1982. The reactor is designed to effectively support various fields of basic nuclear research, manpower training, and radioisotope production. Since the early 1990s, neutronics modelling has been used as part of routine in-core fuel management activities. Several computer codes have been used in RTP since then, based on 1D neutron diffusion, 2D neutron diffusion, and 3D Monte Carlo neutron transport methods. This paper describes current progress and gives an overview of neutronics modelling development in RTP. Several important parameters were analysed, such as keff, reactivity, neutron flux, power distribution, and fission product build-up for the latest core configuration. The developed core neutronics model was validated by comparison with experimental and measurement data. Along with the RTP core model, a calculation procedure was also developed to establish better prediction capability of RTP's behaviour.
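As a textbook illustration of the kind of quantities such neutronics models report (this is not the actual diffusion or Monte Carlo calculation used for RTP), the one-group infinite-medium multiplication factor and the reactivity derived from keff can be written as:

```python
def k_infinity(nu, sigma_f, sigma_a):
    """One-group infinite-medium multiplication factor,
    k_inf = nu * Sigma_f / Sigma_a: neutrons produced per neutron
    absorbed, for macroscopic fission (Sigma_f) and absorption
    (Sigma_a) cross sections and nu neutrons per fission.
    A textbook relation only, for orientation."""
    return nu * sigma_f / sigma_a

def reactivity(k_eff):
    """Reactivity rho = (k_eff - 1) / k_eff; zero at criticality."""
    return (k_eff - 1.0) / k_eff
```

The numbers below are illustrative, not RTP data.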
Odong, T L; Jansen, J; van Eeuwijk, F A; van Hintum, T J L
2013-02-01
Definition of clear criteria for evaluating the quality of core collections is a prerequisite for selecting high-quality cores. However, a critical examination of the different methods used in the literature for evaluating the quality of core collections shows that there are no clear guidelines on the choice of quality evaluation criteria; as a result, inappropriate analyses are sometimes made, leading to false conclusions regarding the quality of core collections and the methods used to select them. The choice of criteria for evaluating core collections appears to be based mainly on the fact that those criteria have been used in earlier publications, rather than on the actual objectives of the core collection. In this study, we provide insight into the different criteria used for evaluating core collections. We also discuss different types of core collections and relate each type to its respective evaluation criteria. Two new criteria based on genetic distance are introduced. The consequences of the different evaluation criteria are illustrated using simulated and experimental data. We strongly recommend the use of the distance-based criteria, since they not only allow the simultaneous evaluation of all variables describing the accessions, but also provide intuitive and interpretable criteria compared with the univariate criteria generally used for the evaluation of core collections. Our findings will provide genebank curators and researchers with possibilities to make informed choices when creating, comparing, and using core collections.
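Distance-based criteria of the kind recommended above can be illustrated with a minimal sketch: one coverage-style criterion (mean distance from each accession to its nearest core entry) and one diversity-style criterion (mean pairwise distance among core entries). The exact criteria introduced in the study may differ; this is an assumed, simplified form using Euclidean distance over trait vectors:

```python
import math

def accession_to_nearest_entry(all_accessions, core):
    """Mean distance from each accession in the whole collection to
    its nearest entry in the core; small values suggest the core
    'covers' the collection well (a coverage-style criterion)."""
    return sum(min(math.dist(a, c) for c in core)
               for a in all_accessions) / len(all_accessions)

def entry_to_entry(core):
    """Mean pairwise distance among core entries; large values
    suggest a diverse, non-redundant core (a diversity-style
    criterion)."""
    n = len(core)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    return sum(math.dist(core[i], core[j]) for i, j in pairs) / len(pairs)
```

Unlike univariate criteria (e.g. per-trait means or ranges), both functions consume every variable describing an accession at once, which is the property the authors argue for.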
Outcome Studies of Social, Behavioral, and Educational Interventions: Emerging Issues and Challenges
ERIC Educational Resources Information Center
Fraser, Mark W.; Guo, Shenyang; Ellis, Alan R.; Thompson, Aaron M.; Wike, Traci L.; Li, Jilan
2011-01-01
This article describes the core features of outcome research and then explores issues confronting researchers who engage in outcome studies. Using an intervention research perspective, descriptive and explanatory methods are distinguished. Emphasis is placed on the counterfactual causal perspective, designing programs that fit culture and context,…
NASA Astrophysics Data System (ADS)
Tiyapun, K.; Chimtin, M.; Munsorn, S.; Somchit, S.
2015-05-01
The objective of this work is to demonstrate a method for validating the prediction of neutron flux distribution in the irradiation tubes of the TRIGA research reactor (TRR-1/M1) using an MCNP computer model. The reaction rates used in the experiment include the 27Al(n, α)24Na and 197Au(n, γ)198Au reactions. Aluminium (99.9 wt%) and gold (0.1 wt%) foils, as well as gold foils covered with cadmium, were irradiated at 9 locations in the core, referred to as CT, C8, C12, F3, F12, F22, F29, G5, and G33. The experimental results were compared to calculations performed with MCNP using a detailed geometrical model of the reactor core. The experimental and calculated normalized reaction rates in the reactor core are in good agreement for both reactions, showing that the material and geometrical properties of the reactor core are modelled very well. The difference between the experimental measurements and the calculations using the MCNP geometrical model was below 10%. In conclusion, the MCNP computational model used to calculate the neutron flux and reaction rate distribution in the reactor core can also be used for other reactor core parameters, including neutron spectrum calculation, dose rate calculation, power peaking factor calculation, and optimization of research reactor utilization in the future, with confidence in the accuracy and reliability of the calculation.
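The validation step, comparing normalised measured and calculated reaction rates position by position, can be sketched as below. The position labels follow the abstract, but the numerical values in any example are made up, and the actual normalisation used in the study may differ:

```python
def normalised(rates):
    """Normalise reaction rates to their mean over all positions,
    so measured and calculated sets can be compared shape-to-shape
    without an absolute flux calibration."""
    mean = sum(rates.values()) / len(rates)
    return {pos: r / mean for pos, r in rates.items()}

def relative_differences(measured, calculated):
    """Per-position relative difference |C - E| / E between the
    normalised calculated (C) and measured (E) reaction rates; a
    criterion such as 'all below 10%' can then be checked."""
    m, c = normalised(measured), normalised(calculated)
    return {pos: abs(c[pos] - m[pos]) / m[pos] for pos in m}
```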
ZnSe based semiconductor core-shell structures: From preparation to application
NASA Astrophysics Data System (ADS)
Sun, Chengcheng; Gu, Yarong; Wen, Weijia; Zhao, Lijuan
2018-07-01
Inorganic core-shell semiconductor materials have attracted increasing interest in recent years because of their unique structure, stable chemical properties, and high performance in devices. With special properties such as a direct band gap and excellent photoelectrical characteristics, ZnSe based semiconductor core-shell structures are promising materials for applications in fields such as photocatalysis, light-emitting diodes, solar cells, photodetectors, and biomedical science. However, few reviews of ZnSe based semiconductor core-shell structures have been reported so far. This manuscript therefore focuses on research into ZnSe based semiconductor core-shell composites, covering various preparation methods and the applications of these core-shell structures, especially in photocatalysis, light emitting, solar cells, and photodetectors. The possibilities and limitations of studies on ZnSe based semiconductor core-shell composites are also highlighted.
Protocol for developing, disseminating and implementing a core outcome set for endometriosis.
Hirsch, Martin; Duffy, James M N; Barker, Claire; Hummelshoj, Lone; Johnson, Neil P; Mol, Ben; Khan, Khalid S; Farquhar, Cindy
2016-12-21
Endometriosis is a common gynaecological disease characterised by pain and subfertility. Randomised controlled trials evaluating treatments for endometriosis have reported many different outcomes and outcome measures. This variation restricts effective data synthesis, limiting the usefulness of research to inform clinical practice. To address these methodological concerns, we aim to develop, disseminate, and implement a core outcome set for endometriosis, engaging with key stakeholders, including healthcare professionals, researchers, and women with endometriosis. An international steering group has been established, including healthcare professionals, researchers, and patient representatives. Potential outcomes identified from a systematic review of the literature will be entered into a modified Delphi method. Key stakeholders will be invited to participate, including healthcare professionals, researchers, and women with endometriosis. Participants will be invited to score individual outcomes on a nine-point Likert scale anchored between 1 (not important) and 9 (critical). Repeated reflection and rescoring should promote convergence, across the whole group and within individual stakeholder groups, towards consensus 'core' outcomes. High-quality outcome measures will be associated with core outcomes. The implementation of a core outcome set for endometriosis within future clinical trials, systematic reviews, and clinical guidelines will enhance the availability of comparable data to facilitate evidence-based patient care. This study was prospectively registered with the Core Outcome Measures in Effectiveness Trials Initiative; number: 691. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
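The scoring-and-consensus step of such a Delphi survey can be sketched as below. The thresholds are illustrative assumptions in the style of commonly used consensus definitions, not values taken from this protocol:

```python
def consensus_in(scores, high_frac=0.70, low_frac=0.15):
    """Classify one candidate outcome from its 9-point Likert scores.
    Illustrative rule (thresholds are assumptions, not from the
    protocol): 'consensus in' if at least 70% of participants score
    it 7-9 (critical) and fewer than 15% score it 1-3 (not
    important)."""
    n = len(scores)
    high = sum(1 for s in scores if s >= 7) / n
    low = sum(1 for s in scores if s <= 3) / n
    return high >= high_frac and low < low_frac
```

In a multi-stakeholder Delphi, the same rule would typically be checked per stakeholder group as well as overall, so that no single group dominates the consensus.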
ERIC Educational Resources Information Center
Tudela, Ignacio; Bonete, Pedro; Fullana, Andres; Conesa, Juan Antonio
2011-01-01
The unreacted-core shrinking (UCS) model is employed to characterize fluid-particle reactions that are important in industry and research. An approach to understand the UCS model by numerical methods is presented, which helps the visualization of the influence of the variables that control the overall heterogeneous process. Use of this approach in…
Full Core TREAT Kinetics Demonstration Using Rattlesnake/BISON Coupling Within MAMMOTH
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ortensi, Javier; DeHart, Mark D.; Gleicher, Frederick N.
2015-08-01
This report summarizes key aspects of research in evaluation of modeling needs for TREAT transient simulation. Using a measured TREAT critical measurement and a transient for a small, experimentally simplified core, Rattlesnake and MAMMOTH simulations are performed, building from simple infinite media to a full core model. Cross-section processing methods are evaluated, various homogenization approaches are assessed, and the neutronic behavior of the core is studied to determine key modeling aspects. The simulation of the minimum critical core with the diffusion solver shows very good agreement with the reference Monte Carlo simulation and the experiment. The full core transient simulation with thermal feedback shows a significantly lower power peak compared to the documented experimental measurement, which is not unexpected in the early stages of model development.
Ruiz, Yumary; Matos, Sergio; Kapadia, Smiti; Islam, Nadia; Cusack, Arthur; Kwong, Sylvia; Trinh-Shevrin, Chau
2012-12-01
Despite the importance of community health workers (CHWs) in strategies to reduce health disparities and the call to enhance their roles in research, little information exists on how to prepare CHWs involved in community-academic initiatives (CAIs). Therefore, the New York University Prevention Research Center piloted a CAI-CHW training program. We applied a core competency framework to an existing CHW curriculum and bolstered the curriculum to include research-specific sessions. We employed diverse training methods, guided by adult learning principles and popular education philosophy. Evaluation instruments assessed changes related to confidence, intention to use learned skills, usefulness of sessions, and satisfaction with the training. Results demonstrated that a core competency-based training can successfully affect CHWs' perceived confidence and intentions to apply learned content, and can provide a larger social justice context of their role and work. This program demonstrates that a core competency-based framework coupled with CAI-research-specific skill sessions (1) provides skills that CAI-CHWs intend to use, (2) builds confidence, and (3) provides participants with a more contextualized view of client needs and CHW roles.
Manufacturing development for the SAFE 100 kW core
NASA Astrophysics Data System (ADS)
Carter, Robert; Roman, Jose; Salvail, Pat
2002-01-01
In stark contrast to what is sometimes considered the norm in traditional manufacturing processes, engineers at the Marshall Space Flight Center (MSFC) are in the practice of altering the standard in an effort to realize other potential methods in core manufacturing. While remaining within the bounds of the materials database, we are researching core manufacturing techniques that may have been overlooked in the past due to funding and/or time constraints. To augment proven core fabrication capabilities, we are pursuing plating processes as another possible method for core build-up and assembly. Although brazing and a proprietary HIP cycle are used for module assembly (with a proven track record for stability and endurance), it is prudent to pursue secondary or backup methods of module and core assembly. For this reason, heat tube manufacture and module assembly by means of plating are being investigated. Potentially, the plating processes will give engineers the ability to manufacture replacement modules for any module that might fail to perform nominally, and to assemble/disassemble a complete core in much less time than would be required for the conventional braze-HIP process. Another improvement in core manufacturing capabilities is the installation of a sodium and lithium liquid metal heat pipe fill machine. This, along with the ability to electron beam weld heat pipe seals and wet-in the pipes in the necessary vacuum atmosphere, will eliminate the need to ship potentially hazardous components outside for processing. In addition to developing core manufacturing techniques, the SAFE manufacturing team has been evaluating the thermal heat transfer characteristics and manufacturability of several heat exchanger design concepts.
HackaMol: An Object-Oriented Modern Perl Library for Molecular Hacking on Multiple Scales
Riccardi, Demian M.; Parks, Jerry M.; Johs, Alexander; ...
2015-03-20
HackaMol is an open source, object-oriented toolkit written in Modern Perl that organizes atoms within molecules and provides chemically intuitive attributes and methods. The library consists of two components: HackaMol, the core that contains classes for storing and manipulating molecular information, and HackaMol::X, the extensions that use the core. We tested the core; it is well-documented and easy to install across computational platforms. Our goal for the extensions is to provide a more flexible space for researchers to develop and share new methods. In this application note, we provide a description of the core classes and two extensions: HackaMol::X::Calculator, an abstract calculator that uses code references to generalize interfaces with external programs, and HackaMol::X::Vina, a structured class that provides an interface with the AutoDock Vina docking program.
HackaMol: An Object-Oriented Modern Perl Library for Molecular Hacking on Multiple Scales.
Riccardi, Demian; Parks, Jerry M; Johs, Alexander; Smith, Jeremy C
2015-04-27
HackaMol is an open source, object-oriented toolkit written in Modern Perl that organizes atoms within molecules and provides chemically intuitive attributes and methods. The library consists of two components: HackaMol, the core that contains classes for storing and manipulating molecular information, and HackaMol::X, the extensions that use the core. The core is well-tested, well-documented, and easy to install across computational platforms. The goal of the extensions is to provide a more flexible space for researchers to develop and share new methods. In this application note, we provide a description of the core classes and two extensions: HackaMol::X::Calculator, an abstract calculator that uses code references to generalize interfaces with external programs, and HackaMol::X::Vina, a structured class that provides an interface with the AutoDock Vina docking program.
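The code-reference pattern behind HackaMol::X::Calculator (a generic calculator that delegates its program-specific behaviour to user-supplied subroutines) can be sketched as a Python analogue. This is not HackaMol's actual Perl API; the class and attribute names below are hypothetical:

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Calculator:
    """Python analogue of an abstract calculator built on code
    references: the class stays generic and delegates the three
    stages of an external-program run to user-supplied callables."""
    map_input: Callable[[Any], str]    # molecule -> program input text
    run: Callable[[str], str]          # input text -> raw output text
    map_output: Callable[[str], Any]   # raw output text -> result

    def calculate(self, molecule):
        return self.map_output(self.run(self.map_input(molecule)))

# Toy usage: a 'program' that sums the squares of the coordinates.
calc = Calculator(
    map_input=lambda mol: " ".join(str(x) for x in mol),
    run=lambda text: str(sum(float(x) ** 2 for x in text.split())),
    map_output=float,
)
result = calc.calculate([1.0, 2.0])  # 5.0
```

Swapping in callables that write a docking input file, shell out to an external binary, and parse its output would give the Vina-style interface described in the abstract, without changing the Calculator class itself.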
Is Graduate Students' Research Exposure to Business Ethics Comprehensive?
ERIC Educational Resources Information Center
Piotrowski, Chris; Guyette, Roger W., Jr.
2013-01-01
Graduate-level education, at its core, has a focus on specific, in-depth disciplinary subject matter, with a strong emphasis on methods, conceptual framework, and research. For the developing student, exposure to both past and current research developments is mainly achieved by reading and studying articles published in leading journals in their…
Time Is Precious: Variable- and Event-Centred Approaches to Process Analysis in CSCL Research
ERIC Educational Resources Information Center
Reimann, Peter
2009-01-01
Although temporality is a key characteristic of the core concepts of CSCL--interaction, communication, learning, knowledge building, technology use--and although CSCL researchers have privileged access to process data, the theoretical constructs and methods employed in research practice frequently neglect to make full use of information relating…
An Assessment of Intervention Fidelity in Published Social Work Intervention Research Studies
ERIC Educational Resources Information Center
Corley, Nicole A.; Kim, Irang
2016-01-01
Objectives: Intervention fidelity is a critical strategy to help advance the usefulness and integrity of social work research. This study assessed the extent to which a selected sample of published social work intervention researchers reported its intervention protocols. Methods: Six core social work journals were reviewed in this analysis. The…
Teleseismic Array Studies of Earth's Core-Mantle Boundary
NASA Astrophysics Data System (ADS)
Alexandrakis, Catherine
2011-12-01
The core mantle boundary (CMB) is an inaccessible and complex region, knowledge of which is vital to our understanding of many Earth processes. Above it is the heterogeneous lower-mantle. Below the boundary is the outer-core, composed of liquid iron, and/or nickel and some lighter elements. Elucidation of how these two distinct layers interact may enable researchers to better understand the geodynamo, global tectonics, and overall Earth history. One parameter that can be used to study structure and limit potential chemical compositions is seismic-wave velocity. Current global-velocity models have significant uncertainties in the 200 km above and below the CMB. In this thesis, these regions are studied using three methods. The upper outer core is studied using two seismic array methods. First, a modified vespa, or slant-stack method is applied to seismic observations at broadband seismic arrays, and at large, dense groups of broadband seismic stations dubbed 'virtual' arrays. Observations of core-refracted teleseismic waves, such as SmKS, are used to extract relative arrivaltimes. As with previous studies, lower -mantle heterogeneities influence the extracted arrivaltimes, giving significant scatter. To remove raypath effects, a new method was developed, called Empirical Transfer Functions (ETFs). When applied to SmKS waves, this method effectively isolates arrivaltime perturbations caused by outer core velocities. By removing raypath effects, the signals can be stacked further reducing scatter. The results of this work were published as a new 1D outer-core model, called AE09. This model describes a well-mixed outer core. Two array methods are used to detect lower mantle heterogeneities, in particular Ultra-Low Velocity Zones (ULVZs). The ETF method and beam forming are used to isolate a weak P-wave that diffracts along the CMB. 
While neither the ETF method nor beam forming could adequately image the low-amplitude phase, beam forms of two events indicate precursors to the SKS and SKKS phase, which may be ULVZ indicators. Finally, cross-correlated observed and modelled beams indicate a tendency towards a ULVZ-like lower mantle in the study region.
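The slant-stack (vespa) step described above can be sketched as a minimal linear slant stack: for each trial slowness, undo the predicted moveout on every trace and stack, so energy peaks at the slowness that best aligns the arrivals. This is an illustrative sketch only, not the modified method or the ETF correction from the thesis; all names are assumptions.

```python
import numpy as np

def slant_stack(traces, offsets, dt, slownesses):
    """Linear slant stack (vespagram): for each trial slowness p, shift every
    trace back by its moveout p * offset and stack; the stack amplitude peaks
    at the slowness that best aligns the arrivals."""
    n_samp = traces.shape[1]
    t = np.arange(n_samp) * dt
    vespa = np.zeros((len(slownesses), n_samp))
    for i, p in enumerate(slownesses):
        stack = np.zeros(n_samp)
        for trace, x in zip(traces, offsets):
            # sample the trace at t + p*x, i.e. remove the moveout p*x
            stack += np.interp(t + p * x, t, trace, left=0.0, right=0.0)
        vespa[i] = stack / len(traces)
    return vespa
```

On synthetic traces whose arrivals follow a single true slowness, the row of the vespagram with the largest amplitude recovers that slowness.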
NASA Astrophysics Data System (ADS)
Campbell, J. D.; Heilman, P.; Goodrich, D. C.; Sadler, J.
2015-12-01
The objective for the USDA Long-Term Agroecosystem Research (LTAR) network Common Observatory Repository (CORe) is to provide data management services, including archive, discovery, and access, for consistently observed data across all 18 nodes. LTAR members have an average of 56 years of diverse historic data. Each LTAR location has designated a representative 'permanent' site as its common meteorological observatory. CORe implementation is phased, starting with meteorology, then adding hydrology, eddy flux, soil, and biology data. A design goal was to adopt existing best practices while minimizing the additional data management duties for the researchers. LTAR is providing support for data management specialists at the locations, and the National Agricultural Library is providing central data management services. Maintaining continuity with historical observations is essential, so observations from both the legacy and new common methods are included in CORe. International standards are used to store robust descriptive metadata (ISO 19115) for the observation station and surrounding locale (WMO), sensors (SensorML), and activity (e.g., re-calibration, locale changes) to provide sufficient detail for novel data re-use for the next 50 years. To facilitate data submission, a simple text format was designed. Datasets in CORe will receive DOIs to encourage citations that give fair credit to data providers. Data and metadata access are designed to support multiple formats and naming conventions. An automated QC process is being developed to enhance comparability among LTAR locations and to generate QC process metadata. Data provenance is maintained with a permanent record of changes, including those made by local scientists reviewing the automated QC results. Lessons learned so far include an increase in site acceptance of CORe following the decision to store data from both legacy and new common methods.
A larger-than-anticipated variety of currently used methods, with potentially significant differences for future data use, was also found. Cooperative peer support among locations with the same sensors, coupled with central support, has reduced redundancy in procedural and data documentation.
Permeability of sediment cores from methane hydrate deposit in the Eastern Nankai Trough, Japan
NASA Astrophysics Data System (ADS)
Konno, Y.; Yoneda, J.; Egawa, K.; Ito, T.; Jin, Y.; Kida, M.; Suzuki, K.; Nakatsuka, Y.; Nagao, J.
2013-12-01
Effective and absolute permeability are key parameters for gas production from methane-hydrate-bearing sandy sediments. Effective and/or absolute permeability have been measured using methane-hydrate-bearing sandy cores and clayey and silty cores recovered from Daini Atsumi Knoll in the Eastern Nankai Trough during the 2012 JOGMEC/JAPEX pressure coring operation. Liquid-nitrogen-immersed cores were prepared by rapid depressurization of pressure cores recovered by a pressure coring system referred to as the Hybrid PCS. Cores were shaped cylindrically on a lathe while being sprayed with liquid nitrogen to prevent hydrate dissociation. Permeability was measured by a flooding test or a pressure relaxation method under near in-situ pressure and temperature conditions. The measured effective permeability of hydrate-bearing sediments is less than tens of md, which is orders of magnitude less than the absolute permeability. The absolute permeability of clayey cores is approximately tens of μd, suggesting that these layers would perform a sealing function as cap rock. Permeability reduction due to a swelling effect was observed for a silty core during a flooding test with pure water mimicking hydrate dissociation water. This swelling effect may cause formation damage, especially at a later stage of gas production from methane hydrate deposits. This study was financially supported by the Research Consortium for Methane Hydrate Resources in Japan (MH21 Research Consortium), which carries out Japan's Methane Hydrate R&D Program conducted by the Ministry of Economy, Trade and Industry (METI).
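The flooding-test measurement rests on Darcy's law, k = q·μ·L / (A·Δp). A minimal sketch of the calculation and unit conversion follows; the function name and the sample numbers are assumptions for illustration, not values from the study.

```python
DARCY_M2 = 9.869233e-13  # 1 darcy expressed in m^2

def darcy_permeability_md(q_m3s, mu_pas, length_m, area_m2, dp_pa):
    """Absolute permeability in millidarcies from a steady-state flooding
    test, via Darcy's law k = q * mu * L / (A * dp) in SI units."""
    k_m2 = q_m3s * mu_pas * length_m / (area_m2 * dp_pa)
    return k_m2 / DARCY_M2 * 1000.0  # m^2 -> darcy -> millidarcy
```

For example, water (μ = 1e-3 Pa·s) flowing at 2.28e-8 m³/s through a 5 cm core of 11.4 cm² cross-section under 0.1 MPa corresponds to roughly 10 md, the order of magnitude reported for the hydrate-bearing sediments.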
NASA Astrophysics Data System (ADS)
Williams, John; Eames, Chris; Hume, Anne; Lockley, John
2012-11-01
Background: This research addressed the key area of early career teacher education and aimed to explore the use of a 'content representation' (CoRe) as a mediational tool to develop early career secondary teacher pedagogical content knowledge (PCK). This study was situated in the subject areas of science and technology, where sound teacher knowledge is particularly important to student engagement. Purpose: The study was designed to examine whether such a tool (a CoRe), co-designed by an early career secondary teacher with expert content and pedagogy specialists, can enhance the PCK of early career teachers. The research questions were: How can experts in content and pedagogy work together with early career teachers to develop one science topic CoRe and one technology topic CoRe to support the development of PCK for early career secondary teachers? How does the use of a collaboratively designed CoRe affect the planning of an early career secondary teacher in science or technology? How has engagement in the development and use of an expert-informed CoRe developed an early career teacher's PCK? Sample: The research design incorporated a unique partnership between two expert classroom teachers, two content experts, four early career teachers, and four researchers experienced in science and technology education. Design: This study employed an interpretivist-based methodology and an action research approach within a four-case study design. Data were gathered using qualitative research methods focused on semi-structured interviews, observations and document analysis. Results: The study indicated that CoRes, developed through this collaborative process, helped the early career teachers focus on the big picture of the topic, emphasize particularly relevant areas of content and consider alternative ways of planning for their teaching. 
Conclusions: This paper presents an analysis of the process of CoRe development by the teacher-expert partnerships and the effect that had on the early career teachers' PCK. In addition, as the same tools and methodology were applied to both a science and a technology teaching context, differences between the two learning areas are discussed.
Zare-Farashbandi, Firoozeh; Ramezan-Shirazi, Mahtab; Ashrafi-Rizi, Hasan; Nouri, Rasool
2014-01-01
Introduction: Recent progress in providing innovative solutions for the organization of electronic resources shows a global trend toward new strategies, such as metadata, that facilitate the description, organization, and retrieval of resources in the web environment. In this context, library metadata standards have a special place; therefore, the purpose of the present study was a comparative study of the central library websites of Iranian state universities regarding their use of Hyper Text Markup Language (HTML) and Dublin Core metadata elements in 2011. Materials and Methods: The study is applied-descriptive, and the data collection tool was a checklist created by the researchers. The study population includes 98 websites of the Iranian state universities of the Ministry of Health and Medical Education and the Ministry of Science, Research and Technology, sampled by census. Information was collected through observation and direct visits to the websites, and data were analyzed in Microsoft Excel 2011. Results: The results indicate that none of the websites use Dublin Core (DC) metadata and that only a few of them use elements that overlap between HTML meta tags and DC elements. The percentage of overlap with DC elements in the Ministry of Health websites was 56% for both description and keywords; in the Ministry of Science websites, it was 45% for keywords and 39% for description. HTML meta tags have a moderate presence in both ministries: the most-used elements were keywords and description (56%) and the least-used were date and format (0%). Conclusion: It appears that the Ministry of Health and the Ministry of Science will follow the same path in adopting the Dublin Core standard on their websites in the future.
Because central library websites are an example of scientific web pages, special attention to their design can help researchers reach information resources faster and more accurately. Therefore, librarians' input into the awareness of web designers and developers will be important for using metadata elements in general, and for applying such standards in particular. PMID:24741646
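The check behind these results — looking for Dublin Core elements embedded as HTML meta tags (e.g. `DC.Title`) alongside ordinary `keywords`/`description` tags — can be sketched with the standard-library HTML parser. The class and function names are illustrative assumptions, not the authors' checklist.

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect <meta name=...> attributes, separating Dublin Core (DC.*)
    elements from plain HTML meta tags such as keywords and description."""
    def __init__(self):
        super().__init__()
        self.dc_elements = []
        self.html_meta = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        name = dict(attrs).get("name") or ""
        if name.lower().startswith("dc."):
            self.dc_elements.append(name)
        elif name:
            self.html_meta.append(name.lower())

def audit_page(html_text):
    """Return (Dublin Core element names, plain HTML meta names) for a page."""
    parser = MetaAudit()
    parser.feed(html_text)
    return parser.dc_elements, parser.html_meta
```

A page carrying only `keywords` and `description` would register zero DC elements, which is the pattern the study reports for all 98 websites.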
Mixed methods research in music therapy research.
Bradt, Joke; Burns, Debra S; Creswell, John W
2013-01-01
Music therapists have an ethical and professional responsibility to provide the highest quality care possible to their patients. Much of the time, high quality care is guided by evidence-based practice standards that integrate the most current, available research in making decisions. Accordingly, music therapists need research that integrates multiple ways of knowing and forms of evidence. Mixed methods research holds great promise for facilitating such integration. At this time, there have not been any methodological articles published on mixed methods research in music therapy. The purpose of this article is to introduce mixed methods research as an approach to address research questions relevant to music therapy practice. This article describes the core characteristics of mixed methods research, considers paradigmatic issues related to this research approach, articulates major challenges in conducting mixed methods research, illustrates four basic designs, and provides criteria for evaluating the quality of mixed methods articles using examples of mixed methods research from the music therapy literature. Mixed methods research offers unique opportunities for strengthening the evidence base in music therapy. Recommendations are provided to ensure rigorous implementation of this research approach.
Schmitt, Jochen; Apfelbacher, Christian; Spuls, Phyllis I; Thomas, Kim S; Simpson, Eric L; Furue, Masutaka; Chalmers, Joanne; Williams, Hywel C
2015-01-01
Core outcome sets (COSs) are consensus-derived minimum sets of outcomes to be assessed in a specific situation. COSs are being increasingly developed to limit outcome-reporting bias, allow comparisons across trials, and strengthen clinical decision making. Despite the increasing interest in outcomes research, methods to develop COSs have not yet been standardized. The aim of this paper is to present the Harmonizing Outcomes Measures for Eczema (HOME) roadmap for the development and implementation of COSs, which was developed on the basis of our experience in the standardization of outcome measurements for atopic eczema. Following the establishment of a panel representing all relevant stakeholders and a research team experienced in outcomes research, the scope and setting of the core set should be defined. The next steps are the definition of a core set of outcome domains such as symptoms or quality of life, followed by the identification or development and validation of appropriate outcome measurement instruments to measure these core domains. Finally, the consented COS needs to be disseminated, implemented, and reviewed. We believe that the HOME roadmap is a useful methodological framework to develop COSs in dermatology, with the ultimate goal of better decision making and promoting patient-centered health care.
Dating sediment cores from Hudson River marshes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robideau, R.; Bopp, R.F.
1993-03-01
There are several methods for determining sediment accumulation rates in the Hudson River estuary. One involves analyzing the concentration of certain radionuclides in sediment core sections. Radionuclides occur in the Hudson River as a result of natural sources, fallout from nuclear weapons testing, and low-level aqueous releases from the Indian Point nuclear power facility. The following radionuclides have been studied in the authors' work: Cesium-137, which is derived from global fallout that started in the 1950s and peaked in 1963; Beryllium-7, a natural radionuclide with a 53-day half-life found associated with very recently deposited sediments; and Lead-210, another useful natural radionuclide, derived from the decay of Radon-222 in the atmosphere. Lead-210 has a half-life of 22 years and can be used to date sediments up to about 100 years old. In the Hudson River, Cobalt-60 is a marker for Indian Point nuclear reactor discharges. The authors' research involved taking sediment core samples from four sites in the Hudson River Estuarine Research Reserve areas. These core samples were sectioned, dried, ground, and analyzed for the presence of radionuclides by gamma-ray spectroscopy. The strength of each current pulse in the detector is proportional to the energy of the gamma ray absorbed; since different radionuclides produce gamma rays of different energies, several radionuclides can be analyzed simultaneously in each sample. The data obtained from this research will be compared to earlier work to obtain a complete chronology of sediment deposition in these Reserve areas of the river. Core samples may then be analyzed for the presence of PCBs, heavy metals, and other pollutants such as pesticides to construct a pollution history of the river.
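Pb-210 dating follows first-order radioactive decay: with the 22-year half-life quoted above, a core section whose excess Pb-210 activity has halved relative to the surface is one half-life old. A minimal sketch, assuming a constant initial activity (the function name and that simplification are mine):

```python
import math

PB210_HALF_LIFE_YR = 22.0  # half-life quoted in the abstract

def pb210_age_yr(activity_surface, activity_at_depth):
    """Age of a sediment layer from excess Pb-210 decay, assuming the layer
    started with the surface activity: t = ln(A0 / A) / lambda,
    where lambda = ln 2 / half-life."""
    decay_const = math.log(2) / PB210_HALF_LIFE_YR
    return math.log(activity_surface / activity_at_depth) / decay_const
```

After about 100 years (four to five half-lives) the excess activity drops below practical detection, which is why the method is limited to roughly the last century of deposition.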
Understanding the Haling power depletion (HPD) method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levine, S.; Blyth, T.; Ivanov, K.
2012-07-01
The Pennsylvania State University (PSU) is using the university version of the Studsvik Scandpower Code System (CMS) for research and education purposes. Preparations have been made to incorporate the CMS into the PSU Nuclear Engineering graduate course 'Nuclear Fuel Management'. The information presented in this paper was developed during the preparation of material for the course, in which the Haling Power Depletion (HPD) was presented for the first time. The HPD method has been criticized as invalid by many in the field, even though it has been applied successfully at PSU for the past 20 years. It was noticed that the radial power distribution (RPD) for low-leakage cores during depletion remained similar to that of the HPD during most of the cycle; thus, the HPD may be conveniently used mainly for low-leakage cores. Studies were then made to better understand the HPD, and the results are presented in this paper. Many different core configurations can be computed quickly with the HPD, without using burnable poisons (BP), to produce several excellent low-leakage core configurations that are viable for power production. Once an HPD core configuration is chosen for further analysis, techniques are available for establishing the BP design to prevent violating any of the safety constraints in such HPD-calculated cores. In summary, this paper shows that the HPD method can be used to guide the design of low-leakage cores. (authors)
ERIC Educational Resources Information Center
Whittock, Tammy
2013-01-01
Through this mixed-method study, the researcher investigated social reproduction in a student's decision to follow the Louisiana Career/Basic Core Diploma Path. In 2008-2009, Louisiana's cohort graduation rate was 67.3%, which was well below the national average of 75.5%, ranking Louisiana forty-sixth in the country. This rate led to the…
Shearer, Barbara S.; Klatt, Carolyn; Nagy, Suzanne P.
2009-01-01
Objectives: The current study evaluates the results of a previously reported method for creating a core medical electronic journal collection for a new medical school library, validates the core collection created specifically to meet the needs of the new school, and identifies strategies for making cost-effective e-journal selection decisions. Methods: Usage data were extracted for four e-journal packages (Blackwell-Synergy, Cell Press, Lippincott Williams & Wilkins, and ScienceDirect). Usage was correlated with weighted point values assigned to a core list of journal titles, and each package was evaluated for relevancy and cost-effectiveness to the Florida State University College of Medicine (FSU COM) population. Results: The results indicated that the development of the core list was a valid method for creating a new twenty-first century, community-based medical school library. Thirty-seven journals are identified for addition to the FSU COM core list based on use by the COM, and areas of overlapping research interests between the university and the COM are identified based on use of specific journals by each population. Conclusions: The collection development approach that evolved at the FSU COM library was useful during the initial stages of identifying and evaluating journal selections and in assessing the relative value of a particular journal package for the FSU COM after the school was established. PMID:19404499
The influence of pore structure parameters on the digital core recovery degree
NASA Astrophysics Data System (ADS)
Xia, Huifen; Zhao, Ling; Sun, Yanyu; Yuan, Shi
2017-05-01
Constructing a digital core has unique advantages in research on water flooding and polymer flooding oil displacement efficiency. The pore-throat size distribution was measured by mercury injection experiments, the coordination number by CT scanning, and the wettability by an imbibition displacement method. The digital core was then constructed by accounting for the pore-throat ratio and wettability, adapting the porosity, and fitting the permeability. The results show that models with a concentrated throat-size distribution give a higher recovery degree under water flooding, while models with a more dispersed distribution give a higher recovery degree under polymer flooding. For roughly the same number of injected pore volumes of polymer, models with a higher coordination number give a higher recovery degree for both water flooding and polymer flooding.
Learning through Experience: The Transition from Doctoral Student to Social Work Educator
ERIC Educational Resources Information Center
Oktay, Julianne S.; Jacobson, Jodi M.; Fisher, Elizabeth
2013-01-01
The researchers conducted an exploratory study using grounded theory qualitative research methods to examine experiences of social work doctoral students as they learned to teach ("N"?=?14). A core category, "learning through experience," representing a basic social process, was identified. The doctoral students experienced…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rahnema, Farzad; Garimeela, Srinivas; Ougouag, Abderrafi
2013-11-29
This project will develop a 3D, advanced coarse mesh transport method (COMET-Hex) for steady-state and transient analyses in advanced very high-temperature reactors (VHTRs). The project will lead to a coupled neutronics and thermal hydraulic (T/H) core simulation tool with fuel depletion capability. The computational tool will be developed in hexagonal geometry, based solely on transport theory without (spatial) homogenization in complicated 3D geometries. In addition to the hexagonal geometry extension, collaborators will concurrently develop three additional capabilities to increase the code's versatility as an advanced and robust core simulator for VHTRs. First, the project team will develop and implement a depletion method within the core simulator. Second, the team will develop an elementary (proof-of-concept) 1D time-dependent transport method for efficient transient analyses. The third capability will be a thermal hydraulic method coupled to the neutronics transport module for VHTRs. Current advancements in reactor core design are pushing VHTRs toward greater core and fuel heterogeneity to pursue higher burn-ups, efficiently transmute used fuel, maximize energy production, and improve plant economics and safety. As a result, an accurate and efficient neutron transport method, with capabilities to treat heterogeneous burnable poison effects, is highly desirable for predicting VHTR neutronics performance. This research project's primary objective is to advance the state of the art for reactor analysis.
Evaluation and Ranking of Researchers – Bh Index
Bharathi, D. Gnana
2013-01-01
Evaluation and ranking of authors is crucial, as it is widely used to assess researcher performance. This article proposes a new method, called the Bh-Index, to evaluate researchers based on their publications and citations. The method is built on the h-Index, and only the h-core articles are taken into consideration. It assigns value additions to those articles that receive significantly more citations than the researcher's h-Index. It provides a wide range of values for a given h-Index and effective evaluation even over a short period. Use of the Bh-Index along with the h-Index gives a powerful tool for evaluating researchers. PMID:24349183
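Since the Bh-Index builds on the h-index and restricts attention to the h-core, the underlying quantities can be sketched as follows. The abstract does not give the exact value-addition formula, so only the standard h-index and the h-core extraction are shown; the function names are assumptions.

```python
def h_index(citations):
    """h = the largest h such that h papers each have at least h citations."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def h_core(citations):
    """The h most-cited papers (the h-core), the only papers the
    Bh-Index considers for its value additions."""
    h = h_index(citations)
    return sorted(citations, reverse=True)[:h]
```

For a researcher with citation counts [10, 8, 5, 4, 3], the h-index is 4 and the h-core is the four papers with 10, 8, 5, and 4 citations; the Bh-Index would then reward h-core papers cited well above 4.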
NASA Astrophysics Data System (ADS)
Dooley, J.; Courville, Z.; Artinian, E.
2016-12-01
Background: Street Road Artists Space's Summer 2015 show was Sailing Stones. The works presented scenarios on the tension between transience and permanence, highlighting cultural constructs imposed onto landscape and place. Dooley's installation, CryoZen Garden, operated as a visual metaphor, modeling cryospheric processes and exploring the effects of melting polar ice caps in a warming world. A grant from Pennsylvania Partners in the Arts, with a focus on sharing contemporary works that were participatory, conceptual, and polar-science research-based, allowed for a new project to engage community members, particularly students. Methods: In this project students were introduced to the work of Dooley, artist/educator, and Courville, snow/ice researcher. Students created 'Life Cores', a take on the ice and sediment coring scientists use as evidence of Earth's atmospheric and geologic changes. Students were given plastic tubes 2' long and 2" in diameter and were asked to add a daily layer of materials taken from everyday life over a one-month period. Students chose materials important to them personally and kept journals reflecting on the items' significance and/or relationship to life and world events. After creation of the Life Cores, Courville and Dooley visited the students, shared their work on polar research, what it is like to live and work on ice, and ways science and art can intertwine to create better understanding of climate change issues. Students used core logging sheets to make observations of each other's life cores, noting layer colors, textures, and deposition rates, some of the characteristics researchers use in ice and sediment core interpretation. Students' work was exhibited at Street Road and will remain on Street Road's website. Courville and Dooley presented to the general public during the opening. Conclusions: Participants were better able to answer the question, How do we know what we know from coring?
by relating the science to something that is known and personal, such as the passage of time with recognizable indicators. Success of the project was based on attendance, very positive feedback from participants, subsequent visits and the effects this programming had on continued efforts to forge long-term relationships with community groups.
NASA Technical Reports Server (NTRS)
Benton, E. R.
1983-01-01
Instrumentation, analytical methods, and research goals for understanding the behavior and source of geophysical magnetism are reviewed. Magsat, launched in 1979, collected global magnetometer data and identified the main terrestrial magnetic fields. The data have been treated by representing the curl-free field in terms of a scalar potential that is decomposed into a truncated series of spherical harmonics. Solutions to the Laplace equation then extend the field upward or downward from the measurement level through intervening source-free space. Further research is necessary on the interaction between harmonics of various spatial scales. Attempts are also being made to analytically model the main field and its secular variation at the core-mantle boundary. Work is also being done on characterizing the core structure, composition, thermodynamics, energetics, and formation, as well as on designing a new Magsat or a tethered satellite to be flown on the Shuttle.
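The upward/downward continuation mentioned above follows from the source-free potential solution: the field amplitude of a degree-n internal Gauss coefficient scales from the reference radius a to radius r by (a/r)^(n+2). A minimal sketch (the function name and the sample radii are illustrative assumptions):

```python
def continue_coefficient(g, degree, a_ref_km, r_km):
    """Scale a degree-n internal-source Gauss coefficient (field amplitude)
    from the reference sphere a_ref to radius r: factor (a_ref / r)**(n + 2)."""
    return g * (a_ref_km / r_km) ** (degree + 2)
```

Downward continuation to the core-mantle boundary (r ≈ 3480 km from a ≈ 6371 km) amplifies a dipole (n = 1) coefficient by (6371/3480)^3 ≈ 6.1, and higher degrees more strongly, which is one reason the interaction of short-wavelength harmonics needs careful treatment.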
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-23
...; SignalCore Inc., Austin, TX; Modular Methods, LLC, Steamboat Springs, CO; and SELEX Galileo S.p.A., Roma... DEPARTMENT OF JUSTICE Antitrust Division Notice Pursuant to the National Cooperative Research and..., pursuant to Section 6(a) of the National Cooperative Research and Production Act of 1993, 15 U.S.C. 4301 et...
ERIC Educational Resources Information Center
Bailey, Lora Battle
2014-01-01
Although a plethora of research focuses on economically at-risk preschool children in general across the United States, little can be found that investigates methods for improving rural children's academic outcomes. This review of research is intended to provide a contextual understanding of the background and current conditions that exist…
Methods and Resources for Elementary and Middle-School Social Studies.
ERIC Educational Resources Information Center
Stockard, James W., Jr.
Designed for preservice elementary and/or middle school teachers, this methods and resources volume compiles well-researched information on social studies education. It uses the standards recommended by the National Council for the Social Studies (NCSS) as a foundation, thoroughly discussing the core disciplines and thematic strands. The book…
Canada's Neglected Tropical Disease Research Network: Who's in the Core—Who's on the Periphery?
Phillips, Kaye; Kohler, Jillian Clare; Pennefather, Peter; Thorsteinsdottir, Halla; Wong, Joseph
2013-01-01
Background This study designed and applied accessible yet systematic methods to generate baseline information about the patterns and structure of Canada's neglected tropical disease (NTD) research network; a network that, until recently, was formed and functioned on the periphery of strategic Canadian research funding. Methodology Multiple methods were used to conduct this study, including: (1) a systematic bibliometric procedure to capture archival NTD publications and co-authorship data; (2) a country-level “core-periphery” network analysis to measure and map the structure of Canada's NTD co-authorship network including its size, density, cliques, and centralization; and (3) a statistical analysis to test the correlation between the position of countries in Canada's NTD network (“k-core measure”) and the quantity and quality of research produced. Principal Findings Over the past sixty years (1950–2010), Canadian researchers have contributed to 1,079 NTD publications, specializing in Leishmania, African sleeping sickness, and leprosy. Of this work, 70% of all first authors and co-authors (n = 4,145) have been Canadian. Since the 1990s, however, a network of international co-authorship activity has been emerging, with representation of researchers from 62 different countries; largely researchers from OECD countries (e.g. United States and United Kingdom) and some non-OECD countries (e.g. Brazil and Iran). Canada has a core-periphery NTD international research structure, with a densely connected group of OECD countries and some African nations, such as Uganda and Kenya. Sitting predominantly on the periphery of this research network is a cluster of 16 non-OECD nations that fall within the lowest GDP percentile of the network. 
Conclusion/Significance: The publication specialties, composition, and position of NTD researchers within Canada's NTD country network provide evidence that, while Canadian researchers currently remain the overall gatekeepers of the NTD research they generate, there is opportunity to leverage existing research collaborations and help advance regions and NTD areas that are currently under-developed. PMID:24340113
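The "k-core measure" used to place countries in the core or on the periphery comes from standard k-core decomposition: iteratively peel off nodes of degree ≤ k, and a node's core number is the level at which it is removed. A minimal pure-Python sketch on an undirected co-authorship graph (the function name and adjacency representation are assumptions; the study's bibliometric pipeline is not reproduced):

```python
def core_numbers(adj):
    """k-core number of each node via iterative pruning.
    adj maps each node to the set of its neighbours (undirected graph)."""
    adj = {u: set(vs) for u, vs in adj.items()}
    degree = {u: len(vs) for u, vs in adj.items()}
    core = {}
    remaining = set(adj)
    k = 0
    while remaining:
        changed = True
        while changed:
            changed = False
            # remove every node whose current degree is <= k; its core number is k
            for u in list(remaining):
                if u in remaining and degree[u] <= k:
                    core[u] = k
                    remaining.remove(u)
                    for v in adj[u]:
                        if v in remaining:
                            degree[v] -= 1
                    changed = True
        k += 1
    return core
```

On a triangle of countries with one pendant collaborator, the triangle members get core number 2 and the pendant gets 1, mirroring the dense OECD core versus the loosely connected periphery described above.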
Mastery of Content Representation (CoRes) Related to the TPACK of High School Biology Teachers
NASA Astrophysics Data System (ADS)
Nasution, W. R.; Sriyati, S.; Riandi, R.; Safitri, M.
2017-09-01
The purpose of this study was to determine teachers' mastery of Content Representation (CoRes) related to the integration of technology and pedagogy in teaching Biology (TPACK). This research uses a descriptive method. The data were collected using the CoRes instrument as primary data and semi-structured interviews as supporting data. The subjects were biology teachers of class X MIA from four schools in Bandung. The CoRes produced by the teachers were analyzed using a CoRes scoring rubric with codes 1-3 and then categorized into upper, middle, or lower groups. The results showed that two teachers fell into the lower category, meaning that their identification of the essential concepts in the CoRes was not detailed and specific. The two other teachers were in the middle category, meaning that their ability to determine the essential concepts in the CoRes is still inadequate and needs to be improved.
Joosten, Yvonne A; Israel, Tiffany L; Williams, Neely A; Boone, Leslie R; Schlundt, David G; Mouton, Charles P; Dittus, Robert S; Bernard, Gordon R; Wilkins, Consuelo H
2015-12-01
Engaging communities in research increases its relevance and may speed the translation of discoveries into improved health outcomes. Many researchers lack training to effectively engage stakeholders, whereas academic institutions lack infrastructure to support community engagement. In 2009, the Meharry-Vanderbilt Community-Engaged Research Core began testing new approaches for community engagement, which led to the development of the Community Engagement Studio (CE Studio). This structured program facilitates project-specific input from community and patient stakeholders to enhance research design, implementation, and dissemination. Developers used a team approach to recruit and train stakeholders, prepare researchers to engage with stakeholders, and facilitate an in-person meeting with both. The research core has implemented 28 CE Studios that engaged 152 community stakeholders. Participating researchers, representing a broad range of faculty ranks and disciplines, reported that input from stakeholders was valuable and that the CE Studio helped determine project feasibility and enhanced research design and implementation. Stakeholders found the CE Studio to be an acceptable method of engagement and reported a better understanding of research in general. A tool kit was developed to replicate this model and to disseminate this approach. The research core will collect data to better understand the impact of CE Studios on research proposal submissions, funding, research outcomes, patient and stakeholder engagement in projects, and dissemination of results. They will also collect data to determine whether CE Studios increase patient-centered approaches in research and whether stakeholders who participate have more trust and willingness to participate in research.
NASA Astrophysics Data System (ADS)
Dmitriev, S. F.; Ishkov, A. V.; Katasonov, A. O.; Malikov, V. N.; Sagalakov, A. M.
2018-01-01
The research aims to develop a microminiature eddy current transducer for aluminum alloys. The topic is relevant due to the need for evaluation and forecasting of the safe operating life of aluminum. A microminiature transformer-type transducer was designed that enables local investigation of non-ferromagnetic materials using the eddy-current method, based on local conductivity measurements. With the designed transducer as a basis, a hardware-software complex was built to perform experimental studies of aluminium. Cores with different shapes were used in this work. Test results are reported for flaws in the form of hidden slits and apertures inside the slabs at excitation coil frequencies of 300-700 Hz.
High-Speed Computation of the Kleene Star in Max-Plus Algebraic System Using a Cell Broadband Engine
NASA Astrophysics Data System (ADS)
Goto, Hiroyuki
This research addresses a high-speed computation method for the Kleene star of the weighted adjacency matrix in a max-plus algebraic system. We focus on systems whose precedence constraints are represented by a directed acyclic graph and implement the computation on a Cell Broadband Engine™ (CBE) processor. Since the resulting matrix gives the longest travel times between two nodes, it is often utilized in scheduling problem solvers for a class of discrete event systems. This research, in particular, attempts to achieve a speedup by using two approaches: parallelization and SIMDization (Single Instruction, Multiple Data), both of which can be accomplished by a CBE processor. The former refers to a parallel computation using multiple cores, while the latter is a method whereby multiple elements are computed by a single instruction. Using the implementation on a Sony PlayStation 3™ equipped with a CBE processor, we found that the SIMDization is effective regardless of the system's size and the number of processor cores used. We also found that the scalability of using multiple cores is remarkable, especially for systems with a large number of nodes. In a numerical experiment where the number of nodes is 2000, we achieved a speedup of 20 times compared with the method without the above techniques.
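As a rough illustration of the computation being accelerated (a plain, single-threaded numpy sketch under the usual max-plus conventions, without the paper's parallelization or SIMDization; names are illustrative):

```python
import numpy as np

NEG_INF = -np.inf  # the max-plus "zero" (no edge / no path)

def maxplus_mul(A, B):
    # Max-plus matrix product: (A ⊗ B)[i, j] = max_k (A[i, k] + B[k, j])
    n, m = A.shape[0], B.shape[1]
    C = np.full((n, m), NEG_INF)
    for k in range(A.shape[1]):
        C = np.maximum(C, A[:, [k]] + B[[k], :])
    return C

def maxplus_star(A):
    # Kleene star A* = E ⊕ A ⊕ A² ⊕ ...; for a DAG the series is finite,
    # so summing up to A^(n-1) suffices.
    n = A.shape[0]
    E = np.full((n, n), NEG_INF)
    np.fill_diagonal(E, 0.0)          # max-plus identity matrix
    S, P = E.copy(), E.copy()
    for _ in range(n - 1):
        P = maxplus_mul(P, A)         # P = A^k
        S = np.maximum(S, P)          # S = E ⊕ A ⊕ ... ⊕ A^k
    return S
```

Entry (i, j) of the result is then the longest travel time from node i to node j, which is exactly what a scheduling solver for such discrete event systems consumes.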
Moving out of one's comfort zone: developing and teaching an interprofessional research course.
Berman, Rosemarie O
2013-07-01
Teamwork and interprofessional collaboration have long been identified as core competencies for achieving quality, safe, patient-centered care. The shared learning environment of an interprofessional course is one method for developing the foundation for a collaborative practice-ready work force. Developing and teaching a course for students in a variety of health professions can be challenging as faculty move beyond the comfort level of their discipline. This article describes the development of an interprofessional research course to meet the needs of different health disciplines with specific teaching strategies to develop core competencies for interprofessional collaboration and practice. Copyright 2013, SLACK Incorporated.
Building Capacity through Action Research Curricula Reviews
ERIC Educational Resources Information Center
Lee, Vanessa; Coombe, Leanne; Robinson, Priscilla
2015-01-01
In Australia, graduates of Master of Public Health (MPH) programmes are expected to achieve a set of core competencies, including a subset that is specifically related to Indigenous health. This paper reports on the methods utilised in a project which was designed using action research to strengthen Indigenous public health curricula within MPH…
How Social Network Position Relates to Knowledge Building in Online Learning Communities
ERIC Educational Resources Information Center
Wang, Lu
2010-01-01
Social Network Analysis, Statistical Analysis, Content Analysis and other research methods were used to research online learning communities at Capital Normal University, Beijing. Analysis of the two online courses resulted in the following conclusions: (1) Social networks of the two online courses form typical core-periphery structures; (2)…
Assessing Statistical Change Indices in Selected Social Work Intervention Research Studies
ERIC Educational Resources Information Center
Ham, Amanda D.; Huggins-Hoyt, Kimberly Y.; Pettus, Joelle
2016-01-01
Objectives: This study examined how evaluation and intervention research (IR) studies assessed statistical change to ascertain effectiveness. Methods: Studies from six core social work journals (2009-2013) were reviewed (N = 1,380). Fifty-two evaluation (n= 27) and intervention (n = 25) studies met the inclusion criteria. These studies were…
Co-Researching with People Who Have Intellectual Disabilities: Insights from a National Survey
ERIC Educational Resources Information Center
O'Brien, Patricia; McConkey, Roy; García-Iriarte, Edurne
2014-01-01
Background: Inclusive research with people with intellectual disabilities is growing internationally but with few studies examining its feasibility. Methods: In undertaking a national study exploring what life was like in Ireland for people with intellectual disabilities, a community of practice was developed involving a core group of…
ERIC Educational Resources Information Center
Yellowlees, Peter M.; Hogarth, Michael; Hilty, Donald M.
2006-01-01
Objective: This article highlights the importance of distributed broadband networks as part of the core infrastructure necessary to deliver academic research and education programs. Method: The authors review recent developments in the field and present the University of California, Davis, environment as a case study of a future virtual regional…
NASA Astrophysics Data System (ADS)
Kernan, Nicholas Devereux
The Niobrara Formation is a fine-grained marine rock deposited in the Western Interior Seaway during the Late Cretaceous. It is composed of fossil-rich interlayered shales, marls, and chalks. Recent interest in the Niobrara has grown due to the advent of lateral drilling and multi-stage hydraulic fracturing. This technology allows operators to economically extract hydrocarbons from chalkier Niobrara facies. Yet two aspects of the Niobrara Formation have remained enigmatic. The first is the occurrence of abundant, randomly oriented, layer-bound normal faults. The second is the large degree of vertical heterogeneity. This research aimed to improve understanding of both of these aspects of the Niobrara Formation. Randomly oriented normal faults have been observed in Niobrara outcrops for nearly a hundred years. Recent high-resolution 3D seismic data in the Denver Basin have allowed investigators to interpret these faults as part of a polygonal fault system (PFS). PFS are layer-bound extensional structures that typically occur in fine-grained marine sediments. Though their genesis and development are still poorly understood, their almost exclusive occurrence in fine-grained rocks indicates that their origin is linked to lithology. Interpretation of a 3D seismic cube in Southeast Wyoming found one tier of polygonal faulting within the Greenhorn-Carlile formations and another within the Niobrara and Pierre formations. This research also found that underlying structural highs influence fault growth and geometries within both tiers. Core data and thin sections best describe vertical heterogeneity in fine-grained rocks. This investigation interpreted core data and thin sections from a well in Southeast Wyoming and identified 10 different facies. Most of these facies fall within a carbonate/clay spectrum, with clay-rich facies deposited during periods of lower sea level and carbonate-rich facies deposited during periods of higher sea level.
Because the average operator will typically have little core but abundant well logs, this investigation used three different methods of describing facies variability with logs; facies interpreted with these methods are referred to as electrofacies. First, a conventional interpretation of Niobrara sub-units was done using gamma ray and resistivity logs. Then a cluster analysis was conducted on an extensive petrophysical log suite. Finally, a neural network was trained on the previous core interpretation so that it learned to identify facies from logs. The research found that when little core is available, a cluster-analysis method can capture significant amounts of vertical heterogeneity within the Niobrara Formation. If core is available, however, a neural-network method provides more meaningful and higher-resolution interpretations.
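As a rough illustration of the cluster-analysis step (a minimal numpy k-means over standardized log curves, not the study's actual workflow; the inputs, cluster count, and initialization are assumptions of the example), electrofacies labels can be derived like this:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means on a log suite. Rows of X are depth samples, columns
    are log measurements; the returned cluster labels act as electrofacies.
    A minimal sketch only, with random initialization and a fixed iteration
    budget."""
    rng = np.random.default_rng(seed)
    X = (X - X.mean(0)) / X.std(0)    # standardize each log curve
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # squared Euclidean distance of every sample to every center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):            # move centers to their cluster means
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(0)
    return labels
```

In practice the columns would be curves such as gamma ray, resistivity, and density, and k would be chosen to match the expected number of facies.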
Doing qualitative research in dentistry and dental education.
Edmunds, S; Brown, G
2012-05-01
The purpose of this paper is to assist dental researchers to develop their expertise in qualitative research. It sketches the key characteristics of qualitative research; summarises theoretical perspectives; outlines the core skills of qualitative data collection and the procedures which underlie three methods of qualitative research: interviewing, focus groups and concept maps. The paper offers some guidance on writing qualitative research and provides examples of qualitative research drawn from dentistry and dental education. © 2012 John Wiley & Sons A/S.
[Development and research advance of pharmacognosy field based on CNKI].
Hu, Li; Xiao, Hong
2018-02-01
Based on the literature data in CNKI, data mining and analysis technologies were used in this paper to describe the scientific research and development directions of pharmacognosy in the last decade from the perspective of bibliometrics. The analysis of the measured data revealed the core research institutions, excellent research teams, leading scholars, major research aspects, and research progress in the field. Results showed that most of the scholars in the field were from colleges and institutions, accounting for 74.6% of the total research findings and forming a group of core scholars. In terms of frequency and timeliness of citation, pharmacognosy is a discipline in sustained growth and development, since it mainly cites, absorbs, and utilizes literature and knowledge from other disciplines. Over the last few years, molecular identification and genetic diversity have become the research hotspots in pharmacognosy, and techniques and methods such as ISSR, RAPD, DNA barcoding, and DNA molecular markers have been widely used. Copyright© by the Chinese Pharmaceutical Association.
Fabrication of TREAT Fuel with Increased Graphite Loading
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luther, Erik Paul; Leckie, Rafael M.; Dombrowski, David E.
2014-02-05
As part of the feasibility study exploring the replacement of the HEU fuel core of the TREAT reactor at Idaho National Laboratory with LEU fuel, this study demonstrates that it is possible to increase the graphite content of extruded fuel by reformulation. The extrusion process was used to fabricate the “upgrade” core for the TREAT reactor. The graphite content achieved is determined by calculation and has not been measured by any analytical method. In conjunction, Raman spectroscopy has been investigated as a technique for measuring the graphite content. This method shows some promise in differentiating between carbon and graphite; however, standards that would allow the technique to be calibrated to quantify the graphite concentration have yet to be fabricated. Continued research into Raman spectroscopy is ongoing. As part of this study, cracking of graphite extrusions due to volatile evolution during heat treatment has been largely eliminated. Continued research to optimize this extrusion method is required.
The STAR Data Reporting Guidelines for Clinical High Altitude Research.
Brodmann Maeder, Monika; Brugger, Hermann; Pun, Matiram; Strapazzon, Giacomo; Dal Cappello, Tomas; Maggiorini, Marco; Hackett, Peter; Bärtsch, Peter; Swenson, Erik R; Zafren, Ken
2018-03-01
Brodmann Maeder, Monika, Hermann Brugger, Matiram Pun, Giacomo Strapazzon, Tomas Dal Cappello, Marco Maggiorini, Peter Hackett, Peter Bärtsch, Erik R. Swenson, Ken Zafren (STAR Core Group), and the STAR Delphi Expert Group. The STAR data reporting guidelines for clinical high altitude research. High Alt Med Biol. 19:7-14, 2018. The goal of the STAR (STrengthening Altitude Research) initiative was to produce a uniform set of key elements for research and reporting in clinical high-altitude (HA) medicine. The STAR initiative was inspired by research on treatment of cardiac arrest, in which the establishment of the Utstein Style, a uniform data reporting protocol, substantially contributed to improving data reporting and subsequently the quality of scientific evidence. The STAR core group used the Delphi method, in which a group of experts reaches a consensus over multiple rounds using a formal method. We selected experts in the field of clinical HA medicine based on their scientific credentials and identified an initial set of parameters for evaluation by the experts. Of 51 experts in HA research who were identified initially, 21 experts completed both rounds. The experts identified 42 key parameters in 5 categories (setting, individual factors, acute mountain sickness and HA cerebral edema, HA pulmonary edema, and treatment) that were considered essential for research and reporting in clinical HA research. An additional 47 supplemental parameters were identified that should be reported depending on the nature of the research. The STAR initiative, using the Delphi method, identified a set of key parameters essential for research and reporting in clinical HA medicine.
Numerical evaluation of gas core length in free surface vortices
NASA Astrophysics Data System (ADS)
Cristofano, L.; Nobili, M.; Caruso, G.
2014-11-01
The formation and evolution of free surface vortices represent an important topic in many hydraulic intakes, since strong whirlpools introduce swirl flow at the intake and can cause entrainment of floating matter and gas. In particular, gas entrainment phenomena are an important safety issue for Sodium-cooled Fast Reactors, because the introduction of gas bubbles into the core causes dangerous reactivity fluctuations. In this paper, a numerical evaluation of the gas core length in free surface vortices is presented, according to two different approaches. In the first, a prediction method developed by the Japanese researcher Sakai and his team is applied. This method is based on the Burgers vortex model and estimates the gas core length of a free surface vortex from two parameters calculated with single-phase CFD simulations: the circulation and the downward velocity gradient. The other approach consists of performing a two-phase CFD simulation of a free surface vortex in order to numerically reproduce the deformation of the gas-liquid interface. A mapped convergent mesh was used to reduce numerical error, and a VOF (Volume Of Fluid) method was selected to track the gas-liquid interface. Two different turbulence models were tested and analyzed. Experimental measurements of the gas core length of free surface vortices were performed using optical methods, and the numerical results were compared with the experimental measurements. The computational domain and the boundary conditions of the CFD simulations were set consistently with the experimental test conditions.
Niu, Ye; Qi, Lin; Zhang, Fen; Zhao, Yi
2018-07-30
Core/shell hydrogel microcapsules attract increasing research attention due to their potential in tissue engineering, food engineering, and drug delivery. Current approaches for generating core/shell hydrogel microcapsules suffer from large geometric variations, so geometrically defective core/shell microcapsules need to be removed before further use. High-throughput geometric characterization of such core/shell microcapsules is therefore necessary. In this work, a continuous-flow device was developed to measure the geometric properties of microcapsules with a hydrogel shell and an aqueous core. The microcapsules were pumped through a tapered microchannel patterned with an array of interdigitated microelectrodes. The geometric parameters (shell thickness and diameter) were derived from the displacement profiles of the microcapsules. The results show that this approach can successfully distinguish all unencapsulated microparticles. The geometric properties of core/shell microcapsules can be determined with high accuracy. The efficacy of this method was demonstrated through a drug-release experiment in which optimization of the electrospray process based on geometric screening led to controlled and extended drug-release profiles. This method does not require high-speed optical systems, simplifying the system configuration and making it a truly miniaturized device. A throughput of up to 584 microcapsules per minute was achieved. This study provides a powerful tool for screening core/shell hydrogel microcapsules and is expected to facilitate the applications of these microcapsules in various fields. Copyright © 2018 Elsevier B.V. All rights reserved.
Investigating the Effects of a Math-Enhanced Agricultural Teaching Methods Course
ERIC Educational Resources Information Center
Stripling, Christopher T.; Roberts, T. Grady
2013-01-01
Numerous calls have been made for agricultural education to support core academic subject matter including mathematics. Previous research has shown that the incorporation of mathematics content into a teaching methods course had a positive effect on preservice teachers' mathematics content knowledge. The purpose of this study was to investigate…
Improvement of core drill methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gatz, J.L.
1975-07-01
This report documents results of a program to evaluate the effectiveness of more or less conventional subsurface samplers in obtaining representative and undisturbed samples of noncohesive alluvial materials containing large quantities of gravels and cobbles. This is the first phase of a research program to improve core drill methods. Samplers evaluated consisted of the Lawrence Livermore Laboratory membrane sampler, 4-in. Denison sampler, 6-in. Denison sampler, 5-in. Modified Denison sampler, and 3-in. thinwall drive tube. Small representative samples were obtained with the Denison samplers; no undisturbed samples were obtained. The field work was accomplished in the Rhodes Canyon area, White Sands Missile Range, New Mexico.
Toupin-April, Karine; Barton, Jennifer; Fraenkel, Liana; Li, Linda; Grandpierre, Viviane; Guillemin, Francis; Rader, Tamara; Stacey, Dawn; Légaré, France; Jull, Janet; Petkovic, Jennifer; Scholte-Voshaar, Marieke; Welch, Vivian; Lyddiatt, Anne; Hofstetter, Cathie; De Wit, Maarten; March, Lyn; Meade, Tanya; Christensen, Robin; Gaujoux-Viala, Cécile; Suarez-Almazor, Maria E; Boonen, Annelies; Pohl, Christoph; Martin, Richard; Tugwell, Peter S
2015-12-01
Despite the importance of shared decision making for delivering patient-centered care in rheumatology, there is no consensus on how to measure its process and outcomes. The aim of this Outcome Measures in Rheumatology (OMERACT) working group is to determine the core set of domains for measuring shared decision making in intervention studies in adults with osteoarthritis (OA), from the perspectives of patients, health professionals, and researchers. We followed the OMERACT Filter 2.0 method to develop a draft core domain set by (1) forming an OMERACT working group; (2) conducting a review of domains of shared decision making; and (3) obtaining opinions of all those involved using a modified nominal group process held at a session activity at the OMERACT 12 meeting. In all, 26 people from Europe, North America, and Australia, including 5 patient research partners, participated in the session activity. Participants identified the following domains for measuring shared decision making to be included as part of the draft core set: (1) identifying the decision, (2) exchanging information, (3) clarifying views, (4) deliberating, (5) making the decision, (6) putting the decision into practice, and (7) assessing the effect of the decision. Contextual factors were also suggested. We proposed a draft core set of shared decision-making domains for OA intervention research studies. Next steps include a workshop at OMERACT 13 to reach consensus on these proposed domains in the wider OMERACT group, as well as to detail subdomains and assess instruments to develop a core outcome measurement set.
Full-Color Biomimetic Photonic Materials with Iridescent and Non-Iridescent Structural Colors
Kawamura, Ayaka; Kohri, Michinari; Morimoto, Gen; Nannichi, Yuri; Taniguchi, Tatsuo; Kishikawa, Keiki
2016-01-01
The beautiful structural colors in bird feathers are some of the brightest colors in nature, and some of these colors are created by arrays of melanin granules that act as both structural colors and scattering absorbers. Inspired by the color of bird feathers, high-visibility structural colors have been created by altering four variables: size, blackness, refractive index, and arrangement of the nano-elements. To control these four variables, we developed a facile method for the preparation of biomimetic core-shell particles with melanin-like polydopamine (PDA) shell layers. The size of the core-shell particles was controlled by adjusting the core polystyrene (PSt) particles’ diameter and the PDA shell thicknesses. The blackness and refractive index of the colloidal particles could be adjusted by controlling the thickness of the PDA shell. The arrangement of the particles was controlled by adjusting the surface roughness of the core-shell particles. This method enabled the production of both iridescent and non-iridescent structural colors from only one component. This simple and novel process of using core-shell particles containing PDA shell layers can be used in basic research on structural colors in nature and their practical applications. PMID:27658446
Zhao, Xiaojun; Shi, Changxiu
2018-01-01
This study analyzed the mediation effect of a suicidal attitude from regulatory emotional self-efficacy to core self-evaluation. A measurement study was conducted among 438 college students using the Regulatory Emotional Self-Efficacy Scale, the Core Self-Evaluation Scale, and the Suicide Attitude Questionnaire. Results from the plug-in process in SPSS and the bootstrap method showed that the attitude toward suicidal behavior and the attitude toward family members of an individual who has committed suicide played a double-mediation role, from perceived self-efficacy in managing happiness to core self-evaluation. The results also showed that the attitude toward a person who committed suicide or attempted suicide played a mediation effect from perceived self-efficacy in managing curiousness to core self-evaluation. This research has great significance for improving the understanding of college students' sense of happiness and prevention for self-evaluation.
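The bootstrap assessment of mediation effects mentioned above can be illustrated with a minimal numpy sketch: a percentile-bootstrap confidence interval for the product-of-coefficients indirect effect a·b, estimated from two least-squares regressions. This is not the SPSS PROCESS implementation used in the study, and the simulated data and variable names are purely illustrative.

```python
import numpy as np

def indirect_effect(x, m, y):
    """Product-of-coefficients mediation estimate: a is the slope of the
    mediator m on the predictor x; b is the slope of the outcome y on m,
    controlling for x."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), x, m])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a * beta[2]

def bootstrap_ci(x, m, y, n_boot=1000, seed=0):
    """Percentile-bootstrap 95% CI for the indirect effect."""
    rng = np.random.default_rng(seed)
    n = len(x)
    est = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)   # resample cases with replacement
        est.append(indirect_effect(x[idx], m[idx], y[idx]))
    return np.percentile(est, [2.5, 97.5])
```

An interval that excludes zero indicates a significant mediation effect.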
Aircraft Engine Noise Research and Testing at the NASA Glenn Research Center
NASA Technical Reports Server (NTRS)
Elliott, Dave
2015-01-01
The presentation will begin with a brief introduction to the NASA Glenn Research Center as well as an overview of how aircraft engine noise research fits within the organization. Some of the NASA programs and projects with noise content will be covered along with the associated goals of aircraft noise reduction. Topics covered within the noise research being presented will include noise prediction versus experimental results, along with engine fan, jet, and core noise. Details of the acoustic research conducted at NASA Glenn will include the test facilities available, recent test hardware, and data acquisition and analysis methods. Lastly some of the actual noise reduction methods investigated along with their results will be shown.
A framework for managing core facilities within the research enterprise.
Haley, Rand
2009-09-01
Core facilities represent increasingly important operational and strategic components of institutions' research enterprises, especially in biomolecular science and engineering disciplines. With this realization, many research institutions are placing more attention on effectively managing core facilities within the research enterprise. A framework is presented for organizing the questions, challenges, and opportunities facing core facilities and the academic units and institutions in which they operate. This framework is intended to assist in guiding core facility management discussions in the context of a portfolio of facilities and within the overall institutional research enterprise.
NASA Astrophysics Data System (ADS)
Al-garadi, Mohammed Ali; Varathan, Kasturi Dewi; Ravana, Sri Devi
2017-02-01
Online social networks (OSNs) have become a vital part of everyday life. OSNs provide researchers and scientists with unique opportunities to study individuals at scale and to analyze human behavioral patterns. Identifying influential spreaders is an important subject in understanding the dynamics of information diffusion in OSNs. Targeting these influential spreaders is significant in planning techniques for accelerating the propagation of useful information, such as in viral marketing applications, or for blocking the diffusion of harmful information (viruses, rumors, online negative behaviors, and cyberbullying). Existing K-core decomposition methods consider links equally when identifying influential spreaders in unweighted networks, while previously proposed link weights are based only on node degree: a node linked to high-degree nodes receives a high weight and is treated as important. However, node degree in the OSN context does not always accurately reflect user influence. In the present study, we improve the K-core method for OSNs by proposing a novel link-weighting method based on the interaction among users, building on the observation that user interaction is a significant factor in quantifying a user's spreading capability in OSNs. Tracking diffusion links in the real spreading dynamics of information verifies the effectiveness of our proposed method for identifying influential spreaders in OSNs as compared with degree centrality, PageRank, and the original K-core.
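A weighted K-core decomposition of the kind the authors build on can be sketched as follows. The interaction-based link weights are assumed inputs here, and the peeling rule is the standard generalization of K-core to weighted degrees, not the paper's exact algorithm:

```python
def weighted_kcore(adj):
    """Iteratively peel nodes by weighted degree (the sum of their link
    weights), assigning each node a core number. `adj` maps each node to a
    dict of {neighbor: weight}; the weights stand in for interaction-based
    link weights."""
    degree = {u: sum(nbrs.values()) for u, nbrs in adj.items()}
    core = {}
    remaining = set(adj)
    k = 0
    while remaining:
        u = min(remaining, key=lambda v: degree[v])   # weakest node left
        k = max(k, degree[u])                         # current core level
        core[u] = k
        remaining.remove(u)
        for v, w in adj[u].items():                   # update neighbors
            if v in remaining:
                degree[v] -= w
    return core
```

Replacing the unit weights of an unweighted graph with user-interaction counts yields the weighted variant the abstract argues for: heavily interacting users end up in deeper cores.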
Walk Little, Look Lots: Tuning into Teachers' Action Research Rhythm
ERIC Educational Resources Information Center
Eberhardt, Annelie; Heinz, Manuela
2017-01-01
This article is a narrative résumé of a year-long collaborative critical inquiry into teaching methods with teachers of modern languages in Irish secondary schools. Putting myself, a cultural stranger and first-time qualitative researcher, at the core of this self-study, I discuss first the context and methodological framework of the study to…
ERIC Educational Resources Information Center
Brattin, Barbara C.
Content analysis was performed on the top six core journals for 1990 in library and information science to determine the extent of research in the field. Articles (n=186) were examined for descriptive or inferential statistics and separately for the presence of mathematical models. Results show a marked (14%) increase in research for 1990,…
Atom Core Interactive Electronic Book to Develop Self Efficacy and Critical Thinking Skills
ERIC Educational Resources Information Center
Pradina, Luthfia Puspa; Suyatna, Agus
2018-01-01
The purpose of this research is to develop interactive atomic electronic school book (IESB) to cultivate critical thinking skills and confidence of students grade 12. The method used in this research was the ADDIE (Analyze Design Development Implementation Evaluation) development procedure which is limited to the test phase of product design…
Joosten, Yvonne A.; Israel, Tiffany L.; Williams, Neely A.; Boone, Leslie R.; Schlundt, David G.; Mouton, Charles P.; Dittus, Robert S.; Bernard, Gordon R.
2015-01-01
Problem Engaging communities in research increases its relevance and may speed the translation of discoveries into improved health outcomes. Many researchers lack training to effectively engage stakeholders, whereas academic institutions lack infrastructure to support community engagement. Approach In 2009, the Meharry-Vanderbilt Community-Engaged Research Core began testing new approaches for community engagement, which led to the development of the Community Engagement Studio (CE Studio). This structured program facilitates project-specific input from community and patient stakeholders to enhance research design, implementation, and dissemination. Developers used a team approach to recruit and train stakeholders, prepare researchers to engage with stakeholders, and facilitate an in-person meeting with both. Outcomes The research core has implemented 28 CE Studios that engaged 152 community stakeholders. Participating researchers, representing a broad range of faculty ranks and disciplines, reported that input from stakeholders was valuable and that the CE Studio helped determine project feasibility and enhanced research design and implementation. Stakeholders found the CE Studio to be an acceptable method of engagement and reported a better understanding of research in general. A tool kit was developed to replicate this model and to disseminate this approach. Next Steps The research core will collect data to better understand the impact of CE Studios on research proposal submissions, funding, research outcomes, patient and stakeholder engagement in projects, and dissemination of results. They will also collect data to determine whether CE Studios increase patient-centered approaches in research and whether stakeholders who participate have more trust and willingness to participate in research. PMID:26107879
Molecular pathology of prostate cancer.
Cazares, L H; Drake, R R; Esquela-Kirscher, A; Lance, R S; Semmes, O J; Troyer, D A
2010-01-01
This chapter includes discussion of the molecular pathology of tissue, blood, urine, and expressed prostatic secretions. Because we are unable to reliably image the disease in vivo, a 12-core biopsy method that oversamples the peripheral zone is widely used. This generates large numbers of cores that need to be carefully processed and sampled. In spite of the large number of tissue cores, the amount of tumor available for study is often quite limited. This is a particular challenge for research, as new biomarker assays will need to preserve tissue architecture intact for histopathology. Methods of processing and reporting pathology are discussed. With the exception of ductal variants, recognized subtypes of prostate cancer are largely confined to research applications, and most prostate cancers are acinar. Biomarker discovery in urine and expressed prostatic secretions would be useful since these are readily obtained and are proximate fluids. The well-known challenges of biomarker discovery in blood and urine are referenced and discussed. Mediators of carcinogenesis can serve as biomarkers, as exemplified by mutations in PTEN and the TMPRSS2:ERG fusion. The use of proteomics in biomarker discovery, with an emphasis on imaging mass spectrometry of tissues, is discussed. Small RNAs are of great interest; however, their usefulness as biomarkers in clinical decision making remains the subject of ongoing research. The chapter concludes with an overview of blood biomarkers such as circulating nucleic acids and tumor cells and bound/free isoforms of prostate-specific antigen (PSA).
Zare-Farashbandi, Firoozeh; Ramezan-Shirazi, Mahtab; Ashrafi-Rizi, Hasan; Nouri, Rasool
2014-01-01
Recent progress in providing innovative solutions for the organization of electronic resources reflects a global trend toward new strategies, such as metadata, that facilitate the description, placement, organization and retrieval of resources in the web environment. In this context, library metadata standards have a special place; therefore, the purpose of the present study was to compare the usage of Hyper Text Mark-up Language (HTML) meta tags and Dublin Core metadata elements on the central library websites of Iranian state universities in 2011. The study is applied-descriptive, and the data collection tool was a checklist created by the researchers. The statistical population comprised 98 websites of Iranian state universities under the Ministry of Health and Medical Education and the Ministry of Science, Research and Technology, sampled by census. Information was collected through observation and direct visits to the websites, and data were analyzed with Microsoft Excel 2011. The results indicate that none of the websites uses Dublin Core (DC) metadata and that only a few use elements that overlap between HTML meta tags and DC elements. The percentage of overlapping DC elements was 56% for both description and keywords in the Ministry of Health, and 45% for keywords and 39% for description in the Ministry of Science. HTML meta tags, however, had a moderate presence in both ministries: the most-used elements were keywords and description (56%) and the least-used were date and formatter (0%). It appears that the Ministry of Health and the Ministry of Science will follow the same path in using the Dublin Core standard on their websites in the future. Because central library websites are examples of scientific web pages, careful attention to their design can help researchers reach information resources faster and more accurately. Therefore, librarians' input will be important in raising web designers' and developers' awareness of metadata elements in general, and of applying such standards in particular.
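Dublin Core elements are conventionally embedded in ordinary HTML meta tags (e.g. `name="DC.title"`), which is what makes the overlap with the plain keywords/description meta tags measurable. A minimal Python sketch of such a checklist audit; the sample page and the audit logic are illustrative, not the study's actual instrument:

```python
from html.parser import HTMLParser

# Dublin Core Metadata Element Set v1.1: the 15 core elements.
DC_ELEMENTS = {
    "title", "creator", "subject", "description", "publisher",
    "contributor", "date", "type", "format", "identifier",
    "source", "language", "relation", "coverage", "rights",
}

class MetaAudit(HTMLParser):
    """Collect plain HTML meta names and Dublin Core (DC.*) meta names."""
    def __init__(self):
        super().__init__()
        self.html_meta = set()
        self.dc_meta = set()

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        name = (dict(attrs).get("name") or "").lower()
        if name.startswith("dc."):
            element = name[3:]
            if element in DC_ELEMENTS:
                self.dc_meta.add(element)
        elif name:
            self.html_meta.add(name)

# Illustrative page mixing ordinary meta tags with DC-encoded ones.
sample = """
<html><head>
<link rel="schema.DC" href="http://purl.org/dc/elements/1.1/">
<meta name="keywords" content="library, metadata">
<meta name="description" content="Central library website">
<meta name="DC.title" content="Central Library">
<meta name="DC.creator" content="University Library">
</head></html>
"""

audit = MetaAudit()
audit.feed(sample)
print(sorted(audit.html_meta))  # ['description', 'keywords']
print(sorted(audit.dc_meta))    # ['creator', 'title']
```

Running such a script against each site in the sample would reproduce the checklist counts (DC elements present, overlapping elements) without manual inspection.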
Quantum chemical calculation of the equilibrium structures of small metal atom clusters
NASA Technical Reports Server (NTRS)
Kahn, L. R.
1982-01-01
Metal atom clusters are studied through the application of ab initio quantum mechanical approaches. Because these large 'molecular' systems pose special practical computational problems in the application of quantum mechanical methods, there is a particular need for simplifying techniques that do not compromise the reliability of the calculations. Research is therefore directed toward various aspects of the implementation of the effective core potential technique for removing the metal atom core electrons from the calculations.
NASA Astrophysics Data System (ADS)
Febriani, K.; Wahyuni, I.; Setiasih, S.; Hudiyono, S.
2017-07-01
The enzyme can be purified by fractional precipitation, which can be carried out with a salt or an organic solvent. In this research, purification of bromelain from pineapple core by fractional precipitation was performed with two precipitants, ammonium sulfate and ethanol. Fractional precipitation with ammonium sulfate proved to be more effective, as it yielded a higher specific activity: 8.2243 U/mg at 50-80% saturation, versus 4.6480 U/mg at 0-60% saturation for ethanol.
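The effectiveness comparison above rests on specific activity, i.e. enzyme units per milligram of total protein. A small sketch of the calculation; the unit and protein totals are hypothetical, chosen only to reproduce the reported final values:

```python
def specific_activity(total_units: float, total_protein_mg: float) -> float:
    """Specific activity = enzyme activity (U) per mg of total protein."""
    return total_units / total_protein_mg

# Hypothetical fraction totals for the two precipitants:
# ammonium sulfate (50-80 % saturation) vs ethanol (0-60 % saturation).
ammonium_sulfate = specific_activity(total_units=82.243, total_protein_mg=10.0)
ethanol = specific_activity(total_units=46.480, total_protein_mg=10.0)

print(round(ammonium_sulfate, 4))  # 8.2243
print(round(ethanol, 4))           # 4.648
```

A higher specific activity after a purification step means more of the remaining protein is the target enzyme, which is why it serves as the effectiveness criterion here.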
Research on the statically thrusting propeller
NASA Technical Reports Server (NTRS)
Eisenhuth, J. J.
1978-01-01
Methods for calculating the induced flow at propeller blades were analyzed by treating wake formation as an initial-value problem in time. An unsteady vortex lattice technique was applied to the wake formation, and the vortex core size was studied.
ERIC Educational Resources Information Center
Telecommunications Policy Research Conference, Inc., Washington, DC.
The first of two papers presented in this section, "Price-Caps: Theory and Implementation" (Peter B. Linhart and Roy Radner) describes a proposed method of regulation involving price caps on core services and no price regulation of other services. This method is designed to replace rate-of-return regulation during a transition period to…
Main, Barry G; McNair, Angus G K; Huxtable, Richard; Donovan, Jenny L; Thomas, Steven J; Kinnersley, Paul; Blazeby, Jane M
2017-04-26
Consent remains a crucial, yet challenging, cornerstone of clinical practice. The ethical, legal and professional understandings of this construct have evolved away from a doctor-centred act to a patient-centred process that encompasses the patient's values, beliefs and goals. This alignment of consent with the philosophy of shared decision-making was affirmed in a recent high-profile Supreme Court ruling in England. The communication of information is central to this model of health care delivery, but it can be difficult for doctors to gauge the information needs of the individual patient. The aim of this paper is to describe 'core information sets', which are defined as a minimum set of consensus-derived information about a given procedure to be discussed with all patients. Importantly, they are intended to catalyse discussion of subjective importance to individuals. The model described in this paper applies health services research and Delphi consensus-building methods to an idea originally proposed 30 years ago. The hypothesis is that, first, large amounts of potentially important information are distilled down to discrete information domains. These are then, second, rated by key stakeholders in multiple iterations, so that core information of agreed importance can be defined. We argue that this scientific approach is key to identifying information important to all stakeholders, which may otherwise be communicated poorly or omitted from discussions entirely. Our methods apply systematic review, qualitative, survey and consensus-building techniques to define this 'core information'. We propose that such information addresses the 'reasonable patient' standard for information disclosure but, more importantly, can serve as a springboard for high-value discussion of importance to the individual patient. The application of established research methods can define information of core importance to informed consent. 
Further work will establish how best to incorporate this model in routine practice.
A Data‐Rich Recruitment Core to Support Translational Clinical Research
Corregano, Lauren M.; Rainer, Tyler‐Lauren; Melendez, Caroline; Coller, Barry S.
2014-01-01
Background: Underenrollment of clinical studies wastes resources and delays assessment of research discoveries. We describe the organization and impact of a centralized recruitment core delivering comprehensive recruitment support to investigators. Methods: The Rockefeller University Center for Clinical and Translational Science supports a centralized recruitment core, call center, Research Volunteer Repository, data infrastructure, and staff who provide expert recruitment services to investigators. During protocol development, consultations aim to optimize enrollment feasibility, develop recruitment strategy, budget, and advertising. Services during study conduct include advertising placement, repository queries, call management, prescreening, referral, and visit scheduling. Utilization and recruitment outcomes are tracked using dedicated software. Results: For protocols receiving recruitment services during 2009-2013: median time from initiation of recruitment to the first enrolled participant was 10 days; of 4,047 first-time callers to the call center, 92% (n = 3,722) enrolled in the Research Volunteer Repository, with 99% retention; 23% of Repository enrollees subsequently enrolled in ≥1 research studies, with 89% retention. Of volunteers referred by repository queries, 49% (280/537) enrolled into the study, with 92% retained. Conclusions: Provision of robust recruitment infrastructure including expertise, a volunteer repository, data capture and real-time analysis accelerates protocol accrual. Application of recruitment science improves the quality of clinical investigation. PMID:25381717
Is Oral Temperature an Accurate Measurement of Deep Body Temperature? A Systematic Review
Mazerolle, Stephanie M.; Ganio, Matthew S.; Casa, Douglas J.; Vingren, Jakob; Klau, Jennifer
2011-01-01
Context: Oral temperature might not be a valid method to assess core body temperature. However, many clinicians, including athletic trainers, use it rather than criterion standard methods, such as rectal thermometry. Objective: To critically evaluate original research addressing the validity of using oral temperature as a measurement of core body temperature during periods of rest and changing core temperature. Data Sources: In July 2010, we searched the electronic databases PubMed, Scopus, Cumulative Index to Nursing and Allied Health Literature (CINAHL), SPORTDiscus, Academic Search Premier, and the Cochrane Library for the following concepts: core body temperature, oral, and thermometers. Controlled vocabulary was used, when available, as well as key words and variations of those key words. The search was limited to articles focusing on temperature readings and studies involving human participants. Data Synthesis: Original research was reviewed using the Physiotherapy Evidence Database (PEDro). Sixteen studies met the inclusion criteria and subsequently were evaluated by 2 independent reviewers. All 16 were included in the review because they met the minimal PEDro score of 4 points (of 10 possible points), with all but 2 scoring 5 points. A critical review of these studies indicated a disparity between oral and criterion standard temperature methods (eg, rectal and esophageal) specifically as the temperature increased. The difference was −0.50°C ± 0.31°C at rest and −0.58°C ± 0.75°C during a nonsteady state. Conclusions: Evidence suggests that, regardless of whether the assessment is recorded at rest or during periods of changing core temperature, oral temperature is an unsuitable diagnostic tool for determining body temperature because many measures demonstrated differences greater than the predetermined validity threshold of 0.27°C (0.5°F). In addition, the differences were greatest at the highest rectal temperatures. 
Oral temperature cannot accurately reflect core body temperature, probably because it is influenced by factors such as ambient air temperature, probe placement, and ingestion of fluids. Any reliance on oral temperature in an emergency, such as exertional heat stroke, might grossly underestimate temperature and delay proper diagnosis and treatment. PMID:22488144
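The review's decision rule (a measurement method is considered invalid when its mean deviation from a criterion method exceeds 0.27°C) can be sketched as a simple bias check. The paired readings below are illustrative, not data from the review:

```python
from statistics import mean

# Paired readings (°C); values are invented for illustration.
oral   = [36.5, 36.8, 37.9, 38.6, 39.1]
rectal = [37.0, 37.3, 38.6, 39.3, 39.9]

VALIDITY_THRESHOLD = 0.27  # °C (0.5 °F), the threshold used in the review

diffs = [o - r for o, r in zip(oral, rectal)]
bias = mean(diffs)                     # mean oral-minus-rectal difference
print(round(bias, 2))                  # -0.64
print(abs(bias) > VALIDITY_THRESHOLD)  # True -> oral reading invalid here
```

Note the systematic direction of the bias: oral readings run low, which is why the review warns about underestimating temperature in heat emergencies.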
Sackett, Penelope C.; McConnell, Vicki S.; Roach, Angela L.; Priest, Susan S.; Sass, John H.
1999-01-01
Phase III of the Long Valley Exploratory Well, the Long Valley Coring Project, obtained continuous core between the depths of 7,180 and 9,831 ft (2,188 to 2,996 meters) during the summer of 1998. This report contains a compendium of information designed to facilitate post-drilling research focussed on the study of the core. Included are a preliminary stratigraphic column compiled primarily from field observations and a general description of well lithology for the Phase III drilling interval. Also included are high-resolution digital photographs of every core box (10 feet per box) as well as scanned images of pieces of recovered core. The user can easily move from the stratigraphic column to corresponding core box photographs for any depth. From there, compressed, "unrolled" images of the individual core pieces (core scans) can be accessed. Those interested in higher-resolution core scans can go to archive CD-ROMs stored at a number of locations specified herein. All core is stored at the USGS Core Research Center in Denver, Colorado where it is available to researchers following the protocol described in this report. Preliminary examination of core provided by this report and the archive CD-ROMs should assist researchers in narrowing their choices when requesting core splits.
Mixed Methods in Biomedical and Health Services Research
Curry, Leslie A.; Krumholz, Harlan M.; O’Cathain, Alicia; Plano Clark, Vicki L.; Cherlin, Emily; Bradley, Elizabeth H.
2013-01-01
Mixed methods studies, in which qualitative and quantitative methods are combined in a single program of inquiry, can be valuable in biomedical and health services research, where the complementary strengths of each approach can yield greater insight into complex phenomena than either approach alone. Although interest in mixed methods is growing among science funders and investigators, written guidance on how to conduct and assess rigorous mixed methods studies is not readily accessible to the general readership of peer-reviewed biomedical and health services journals. Furthermore, existing guidelines for publishing mixed methods studies are not well known or applied by researchers and journal editors. Accordingly, this paper is intended to serve as a concise, practical resource for readers interested in core principles and practices of mixed methods research. We briefly describe mixed methods approaches and present illustrations from published biomedical and health services literature, including in cardiovascular care, summarize standards for the design and reporting of these studies, and highlight four central considerations for investigators interested in using these methods. PMID:23322807
Tsen, Edward W J; Sitzia, Tommaso; Webber, Bruce L
2016-11-01
Trees are natural repositories of valuable environmental information that is preserved in the growth and structure of their stems, branches and roots. Dendrochronological analyses, based on the counting, crossdating and characterisation of incrementally formed wood rings, offer powerful insights for diverse fields including ecology, climatology and archaeology. The application of this toolset is likely to increase in popularity over coming decades due to advances in the field and a reduction in the cost of analyses. In research settings where the continued value of living trees subject to dendrochronological investigation is important, the use of an increment bore corer to extract trunk tissue is considered the best option to minimise negative impacts on tree health (e.g. stress and fitness). A small and fragmented body of literature, however, reports significant after-effects, and in some cases fatal outcomes, from this sampling technique. As it stands, the literature documenting increment bore coring (IBC) impacts lacks experimental consistency and is poorly replicated, making it difficult for prospective users of the method to assess likely tree responses to coring. This paucity of information has the potential to lead to destructive misuse of the method and also limits its safe implementation in circumstances where the risk of impacts may be appropriate. If IBC is to fulfil its potential as a method of choice across research fields, then we must first address our limited understanding of IBC impacts and provide a framework for its appropriate future use. Firstly, we review the historical context of studies examining the impacts of IBC on trees to identify known patterns, focal issues and biases in existing knowledge. IBC wound responses, particularly those that impact on lumber quality, have been the primary focus of prior studies. 
No universal treatment was identified that conclusively improved wound healing and few studies have linked wound responses to tree health impacts. Secondly, we build on literature insights using a theoretical approach to identify the most important factors to guide future research involving implementation of IBC, including innate tree characteristics and environmental factors. Thirdly, we synthesise and interrogate the quantitative data available through meta-analysis to identify risk factors for wound reactions. Although poor reporting standards, restricted scopes and a bias towards temperate ecosystems limited quantitative insight, we found that complete cambial wound closure could still harbour high rates of internal trunk decay, and that conditions favouring faster growth generally correlated with reduced indices of internal and external damage in broadleaved taxa. Finally, we propose a framework for guiding best-practice application of IBC to address knowledge gaps and maximise the utility of this method, including standardised reporting indices for identifying and minimising negative impacts on tree health. While IBC is an underutilised tool of ecological enquiry with broad applicability, the method will always incur some risk of negative impacts on the cored tree. We caution that the decision to core, or not to core, must be given careful consideration on a case-by-case basis. In time, we are confident that this choice will be better informed by evidence-based insight. © 2015 Cambridge Philosophical Society.
Leroy, Thierry; De Bellis, Fabien; Legnate, Hyacinthe; Musoli, Pascal; Kalonji, Adrien; Loor Solórzano, Rey Gastón; Cubry, Philippe
2014-06-01
The management of diversity for conservation and breeding is of great importance for all plant species and is particularly true in perennial species, such as the coffee Coffea canephora. This species exhibits a large genetic and phenotypic diversity with six different diversity groups. Large field collections are available in the Ivory Coast, Uganda and other Asian, American and African countries but are very expensive and time consuming to establish and maintain in large areas. We propose to improve coffee germplasm management through the construction of genetic core collections derived from a set of 565 accessions that are characterized with 13 microsatellite markers. Core collections of 12, 24 and 48 accessions were defined using two methods aimed to maximize the allelic diversity (Maximization strategy) or genetic distance (Maximum-Length Sub-Tree method). A composite core collection of 77 accessions is proposed for both objectives of an optimal management of diversity and breeding. This core collection presents a gene diversity value of 0.8 and exhibits the totality of the major alleles (i.e., 184) that are present in the initial set. The seven proposed core collections constitute a valuable tool for diversity management and a foundation for breeding programs. The use of these collections for collection management in research centers and breeding perspectives for coffee improvement are discussed.
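The Maximization strategy mentioned above is commonly implemented as a greedy search that repeatedly adds the accession contributing the most alleles not yet captured by the core. A toy sketch under that assumption (the genotypes are fabricated; the actual study scored 13 microsatellite markers across 565 accessions):

```python
def greedy_core(genotypes: dict, size: int) -> list:
    """Greedy Maximization strategy: repeatedly add the accession that
    contributes the largest number of alleles not yet in the core."""
    core, captured = [], set()
    candidates = dict(genotypes)
    while candidates and len(core) < size:
        best = max(candidates, key=lambda a: len(candidates[a] - captured))
        core.append(best)
        captured |= candidates.pop(best)
    return core

# Toy data: accession -> set of observed marker alleles ("locus:allele").
genotypes = {
    "acc1": {"m1:180", "m2:210", "m3:95"},
    "acc2": {"m1:180", "m2:214", "m3:95"},   # adds 1 new allele after acc1
    "acc3": {"m1:184", "m2:210", "m3:99"},   # adds 2 new alleles after acc1
    "acc4": {"m1:180", "m2:210", "m3:95"},   # duplicate genotype, adds none
}

core = greedy_core(genotypes, size=2)
print(core)  # ['acc1', 'acc3']
```

The alternative Maximum-Length Sub-Tree method instead prunes a genetic-distance tree to retain the most divergent accessions; both aim to keep a small core representative of the full collection's diversity.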
Shearer, Barbara S; Klatt, Carolyn; Nagy, Suzanne P
2009-04-01
The current study evaluates the results of a previously reported method for creating a core medical electronic journal collection for a new medical school library, validates the core collection created specifically to meet the needs of the new school, and identifies strategies for making cost-effective e-journal selection decisions. Usage data were extracted for four e-journal packages (Blackwell-Synergy, Cell Press, Lippincott Williams & Wilkins, and ScienceDirect). Usage was correlated with weighted point values assigned to a core list of journal titles, and each package was evaluated for relevancy and cost-effectiveness to the Florida State University College of Medicine (FSU COM) population. The results indicated that the development of the core list was a valid method for creating a new twenty-first century, community-based medical school library. Thirty-seven journals are identified for addition to the FSU COM core list based on use by the COM, and areas of overlapping research interests between the university and the COM are identified based on use of specific journals by each population. The collection development approach that evolved at the FSU COM library was useful during the initial stages of identifying and evaluating journal selections and in assessing the relative value of a particular journal package for the FSU COM after the school was established.
Selker, Harry P.; Leslie, Laurel K.
2015-01-01
Abstract There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in‐person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. PMID:26332869
The use of qualitative methods to inform Delphi surveys in core outcome set development.
Keeley, T; Williamson, P; Callery, P; Jones, L L; Mathers, J; Jones, J; Young, B; Calvert, M
2016-05-04
Core outcome sets (COS) help to minimise bias in trials and facilitate evidence synthesis. Delphi surveys are increasingly being used as part of a wider process to reach consensus about what outcomes should be included in a COS. Qualitative research can be used to inform the development of Delphi surveys. This is an advance in the field of COS development, and one which is potentially valuable; however, little guidance exists for COS developers on how best to use qualitative methods and what the challenges are. This paper aims to provide early guidance on the potential role and contribution of qualitative research in this area. We hope the ideas we present will be challenged, critiqued and built upon by others exploring the role of qualitative research in COS development. This paper draws upon the experiences of using qualitative methods in the pre-Delphi stage of the development of three different COS. Using these studies as examples, we identify some of the ways that qualitative research might contribute to COS development, the challenges in using such methods and areas where future research is required. Qualitative research can help to identify what outcomes are important to stakeholders; facilitate understanding of why some outcomes may be more important than others; determine the scope of outcomes; identify appropriate language for use in the Delphi survey; and inform comparisons between stakeholder data and other sources, such as systematic reviews. Developers need to consider a number of methodological points when using qualitative research: specifically, which stakeholders to involve, how to sample participants, which data collection methods are most appropriate, how to consider outcomes with stakeholders and how to analyse these data. A number of areas for future research are identified. 
Qualitative research has the potential to increase the research community's confidence in COS, although this will be dependent upon using rigorous and appropriate methodology. We have begun to identify some issues for COS developers to consider in using qualitative methods to inform the development of Delphi surveys in this article.
Reproducibility in Data-Scarce Environments
NASA Astrophysics Data System (ADS)
Darch, P. T.
2016-12-01
Among the usual requirements for reproducibility are large volumes of data and computationally intensive methods. Many fields within earth sciences, however, do not meet these requirements. Data are scarce and data-intensive methods are not well established. How can science be reproducible under these conditions? What changes, both infrastructural and cultural, are needed to advance reproducibility? This paper presents findings from a long-term social scientific case study of an emergent and data-scarce field, the deep subseafloor biosphere. This field studies interactions between microbial communities living in the seafloor and the physical environments they inhabit. Factors such as these make reproducibility seem a distant goal for this community: - The relative newness of the field. Serious study began in the late 1990s; - The highly multidisciplinary nature of the field. Researchers come from a range of physical and life science backgrounds; - Data scarcity. Domain researchers produce much of these data in their own onshore laboratories by analyzing cores from international ocean drilling expeditions. Allocation of cores is negotiated between researchers from many fields. These factors interact in multiple ways to inhibit reproducibility: - Incentive structures emphasize producing new data and new knowledge rather than reanalysing extant data; - Only a few steps of laboratory analyses can be reproduced (such as analysis of DNA sequences, but not extraction of DNA from cores), due to scarcity of cores; - Methodological heterogeneity is a consequence of multidisciplinarity, as researchers bring different techniques from diverse fields; - Few standards for data collection or analysis are available at this early stage of the field; - While datasets from multiple biological and physical phenomena can be integrated into a single workflow, curation tends to be divergent. 
Each type of dataset may be subject to disparate policies and contributed to different databases. Our study demonstrates that data scarcity can be particularly acute in emerging scientific fields, and often results from resource scarcity more generally. Reproducibility tends to be a low priority among the many other scientific challenges these fields face.
The MARTE VNIR imaging spectrometer experiment: design and analysis.
Brown, Adrian J; Sutter, Brad; Dunagan, Stephen
2008-10-01
We report on the design, operation, and data analysis methods employed on the VNIR imaging spectrometer instrument that was part of the Mars Astrobiology Research and Technology Experiment (MARTE). The imaging spectrometer is a hyperspectral scanning pushbroom device sensitive to VNIR wavelengths from 400-1000 nm. During the MARTE project, the spectrometer was deployed to the Río Tinto region of Spain. We analyzed subsets of three cores from Río Tinto using a new band modeling technique. We found most of the MARTE drill cores to contain predominantly goethite, though spatially coherent areas of hematite were identified in Core 23. We also distinguished non Fe-bearing minerals that were subsequently analyzed by X-ray diffraction (XRD) and found to be primarily muscovite. We present drill core maps that include spectra of goethite, hematite, and non Fe-bearing minerals.
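Distinguishing goethite from hematite in VNIR spectra typically relies on the position of the iron crystal-field absorption band (hematite roughly 860-880 nm, goethite roughly 900-940 nm). A simplified sketch of band-center classification follows; the thresholds and the synthetic spectrum are assumptions for illustration, not the MARTE band modeling technique itself:

```python
def band_center(wavelengths, reflectance):
    """Locate the wavelength of minimum reflectance (the absorption band
    center); a crude stand-in for continuum-removed band fitting."""
    return min(zip(reflectance, wavelengths))[1]

def classify_iron_oxide(center_nm):
    # Approximate VNIR crystal-field band positions (assumed thresholds).
    if 840 <= center_nm < 890:
        return "hematite"
    if 890 <= center_nm <= 950:
        return "goethite"
    return "non-Fe-bearing?"

wl = list(range(700, 1001, 20))
# Synthetic spectrum with a broad absorption dipping at 920 nm.
refl = [0.40 - 0.15 * max(0.0, 1 - abs(w - 920) / 120) for w in wl]

center = band_center(wl, refl)
print(center, classify_iron_oxide(center))  # 920 goethite
```

Applied pixel by pixel across a hyperspectral core scan, a rule of this kind yields the sort of mineral maps (goethite vs. spatially coherent hematite patches) described for Core 23.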
Research in disaster settings: a systematic qualitative review of ethical guidelines.
Mezinska, Signe; Kakuk, Péter; Mijaljica, Goran; Waligóra, Marcin; O'Mathúna, Dónal P
2016-10-21
Conducting research during or in the aftermath of disasters poses many specific practical and ethical challenges. This is particularly the case with research involving human subjects. The extraordinary circumstances of research conducted in disaster settings require appropriate regulations to ensure the protection of human participants. The goal of this study is to systematically and qualitatively review the existing ethical guidelines for disaster research by using the constant comparative method (CCM). We performed a systematic qualitative review of disaster research ethics guidelines to collect and compare existing regulations. Guidelines were identified by a three-tiered search strategy: 1) searching databases (PubMed and Google Scholar), 2) an Internet search (Google), and 3) a search of the references in the included documents from the first two searches. We used the constant comparative method (CCM) for analysis of included guidelines. Fourteen full text guidelines were included for analysis. The included guidelines covered the period 2000-2014. Qualitative analysis of the included guidelines revealed two core themes: vulnerability and research ethics committee review. Within each of the two core themes, various categories and subcategories were identified. Some concepts and terms identified in analyzed guidelines are used in an inconsistent manner and applied in different contexts. Conceptual clarity is needed in this area as well as empirical evidence to support the statements and requirements included in analyzed guidelines.
Lang, Robert; Leinenbach, Andreas; Karl, Johann; Swiatek-de Lange, Magdalena; Kobold, Uwe; Vogeser, Michael
2018-05-01
Recently, site-specific fucosylation of glycoproteins has attracted attention as it can be associated with several types of cancers including prostate cancer. However, individual glycoproteins, which might serve as potential cancer markers, are often present at very low concentrations in the complex serum matrix, and distinct glycan structures are hard to detect by immunoassays. Here, we present a mass spectrometry-based strategy for the simultaneous analysis of core-fucosylated and total prostate-specific antigen (PSA) in human serum in the low ng/ml concentration range. Sample preparation comprised an immunoaffinity capture step to enrich total PSA from human serum using anti-PSA antibody coated magnetic beads, followed by consecutive two-step on-bead partial deglycosylation with endoglycosidase F3 and tryptic digestion prior to LC-MS/MS analysis. The method was shown to be linear from 0.5 to 60 ng/ml total PSA concentrations and allows the simultaneous quantification of core-fucosylated PSA down to 1 ng/ml and total PSA below 0.5 ng/ml. The imprecision of the method over two days ranged from 9.7% to 23.2% for core-fucosylated PSA and 10.3% to 18.3% for total PSA, depending on the PSA level. The feasibility of the method in native sera was shown using three human specimens. To our knowledge, this is the first MS-based method for quantification of core-fucosylated PSA in the low ng/ml concentration range in human serum. This method could be used in large patient cohorts as core-fucosylated PSA may be a diagnostic biomarker for the differentiation of prostate cancer and other prostatic diseases, such as benign prostatic hyperplasia (BPH). Furthermore, the described strategy could be used to monitor potential changes in site-specific core-fucosylation of other glycoproteins present at low concentrations, which could serve as more specific markers ("marker refinement") in cancer research. Copyright © 2018 Elsevier B.V. All rights reserved.
Metrics for Success: Strategies for Enabling Core Facility Performance and Assessing Outcomes
Turpen, Paula B.; Hockberger, Philip E.; Meyn, Susan M.; Nicklin, Connie; Tabarini, Diane; Auger, Julie A.
2016-01-01
Core Facilities are key elements in the research portfolio of academic and private research institutions. Administrators overseeing core facilities (core administrators) require assessment tools for evaluating the need and effectiveness of these facilities at their institutions. This article discusses ways to promote best practices in core facilities as well as ways to evaluate their performance across 8 of the following categories: general management, research and technical staff, financial management, customer base and satisfaction, resource management, communications, institutional impact, and strategic planning. For each category, we provide lessons learned that we believe contribute to the effective and efficient overall management of core facilities. If done well, we believe that encouraging best practices and evaluating performance in core facilities will demonstrate and reinforce the importance of core facilities in the research and educational mission of institutions. It will also increase job satisfaction of those working in core facilities and improve the likelihood of sustainability of both facilities and personnel. PMID:26848284
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uyttenhove, W.; Baeten, P.; Kochetkov, A.
The GUINEVERE (Generation of Uninterrupted Intense Neutron pulses at the lead Venus Reactor) project was launched in 2006 within the framework of FP6 EUROTRANS in order to validate online reactivity monitoring and subcriticality level determination in accelerator driven systems (ADS). Therefore, the VENUS reactor at SCK.CEN in Mol, Belgium, was modified towards a fast core (VENUS-F) and coupled to the GENEPI-3C accelerator built by CNRS. The accelerator can operate in both continuous and pulsed mode. The VENUS-F core is loaded with enriched uranium and reflected with solid lead. A well-chosen critical reference state is indispensable for the validation of the online subcriticality monitoring methodology. Moreover, a benchmarking tool is required for nuclear data research and code validation. In this paper, the design and the importance of the critical reference state for the GUINEVERE project are motivated. The results of the first experimental phase on the critical core are presented. The control rod worths are determined by the positive period method, and application of the Modified Source Multiplication (MSM) method allows determination of the safety rod worths. The results are implemented in the VENUS-F core certificate for full exploitation of the critical core. (authors)
Wu, Ying; Xue, Yunzhen; Xue, Zhanling
2017-01-01
The medical university students in China, who face a relatively heavy workload and a long course of study, are a special professional group, and many of them have some degree of psychological problems. Understanding their personality characteristics will therefore provide a scientific basis for psychological health interventions. We selected the top 30 personality trait words according to their order of frequency. Additionally, methods such as social network analysis (SNA) and visualization technology for mapping knowledge domains were used in this study. Among these core personality trait words, “Family conscious” ranked highest on three centrality measures and possessed the largest core status and influence. From the analysis of core-periphery structure, we can see that a polarized core-periphery structure was quite obvious. From the K-plex analysis, there were in total 588 “K-2” K-plexes. From the principal components analysis, we selected 11 principal components. This study of personality not only can help prevent disease, but also provides a scientific basis for students’ psychological health education. In addition, we adopted SNA to focus on the relationships between personality trait words and the connections among personality dimensions. This study may provide new ideas and methods for research on personality structure. PMID:28906409
Wang, Zhenjun; Huang, Jiehao
2018-04-01
The phenomenon of water sensitivity often occurs in oil reservoir cores during crude oil production, and it seriously affects the efficiency of oil extraction. In recent years, near-well ultrasonic processing technology has attracted increasing attention due to its safety and energy efficiency. In this paper, the removal of core water sensitivity by ultrasonic wave, chemical injection, and a combined ultrasound-chemical technique is compared experimentally. Results show that: lower ultrasonic frequency and higher power can improve the efficiency of core water sensitivity removal; the removal of core water sensitivity under ultrasonic treatment improves with increasing initial core permeability; the effect of ultrasonic treatment does not keep improving over time, so ultrasonic treatment time should be controlled within a reasonable range; the effect of removing core water sensitivity using a chemical agent alone is slightly better than that of ultrasonic treatment, but chemical injection could be replaced by ultrasonic treatment from the viewpoint of oil reservoir protection and the sustainable development of the oil field; and the combined ultrasound-chemical technique removes water sensitivity more effectively than ultrasonic treatment or chemical injection alone. Copyright © 2017 Elsevier B.V. All rights reserved.
Rowland, Susan L; Smith, Christopher A; Gillam, Elizabeth M A; Wright, Tony
2011-07-01
A strong, recent movement in tertiary education is the development of conceptual, or "big idea" teaching. The emphasis in course design is now on promoting key understandings, core competencies, and an understanding of connections between different fields. In biochemistry teaching, this radical shift from the content-based tradition is being driven by the "omics" information explosion; we can no longer teach all the information we have available. Biochemistry is a core, enabling discipline for much of modern scientific research, and biochemistry teaching is in urgent need of a method for delivery of conceptual frameworks. In this project, we aimed to define the key concepts in biochemistry. We find that the key concepts we defined map well onto the core science concepts recommended by the Vision and Change project. We developed a new method to present biochemistry through the lenses of these concepts. This new method challenged the way we thought about biochemistry as teachers. It also stimulated the majority of the students to think more deeply about biochemistry and to make links between biochemistry and material in other courses. This method is applicable to the full spectrum of content usually taught in biochemistry. Copyright © 2011 Wiley Periodicals, Inc.
Integral Full Core Multi-Physics PWR Benchmark with Measured Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forget, Benoit; Smith, Kord; Kumar, Shikhar
In recent years, the importance of modeling and simulation has been highlighted extensively in the DOE research portfolio with concrete examples in nuclear engineering with the CASL and NEAMS programs. These research efforts and similar efforts worldwide aim at the development of high-fidelity multi-physics analysis tools for the simulation of current and next-generation nuclear power reactors. Like all analysis tools, verification and validation is essential to guarantee proper functioning of the software and methods employed. The current approach relies mainly on the validation of single-physics phenomena (e.g., critical experiments, flow loops) and there is a lack of relevant multiphysics benchmark measurements that are necessary to validate high-fidelity methods being developed today. This work introduces a new multi-cycle full-core Pressurized Water Reactor (PWR) depletion benchmark based on two operational cycles of a commercial nuclear power plant that provides a detailed description of fuel assemblies, burnable absorbers, in-core fission detectors, core loading and re-loading patterns. This benchmark enables analysts to develop extremely detailed reactor core models that can be used for testing and validation of coupled neutron transport, thermal-hydraulics, and fuel isotopic depletion. The benchmark also provides measured reactor data for Hot Zero Power (HZP) physics tests, boron letdown curves, and three-dimensional in-core flux maps from 58 instrumented assemblies. The benchmark description is now available online and has been used by many groups. However, much work remains to be done on the quantification of uncertainties and modeling sensitivities. This work aims to address these deficiencies and make this benchmark a true non-proprietary international benchmark for the validation of high-fidelity tools.
This report details the BEAVRS uncertainty quantification for the first two cycles of operation and serves as the final report of the project.
van der Stap, Djamilla K.D.; Rider, Lisa G.; Alexanderson, Helene; Huber, Adam M.; Gualano, Bruno; Gordon, Patrick; van der Net, Janjaap; Mathiesen, Pernille; Johnson, Liam G.; Ernste, Floranne C.; Feldman, Brian M.; Houghton, Kristin M.; Singh-Grewal, Davinder; Kutzbach, Abraham Garcia; Munters, Li Alemo; Takken, Tim
2015-01-01
OBJECTIVES Currently there are no evidence-based recommendations regarding which fitness and strength tests to use for patients with childhood or adult idiopathic inflammatory myopathies (IIM). This hinders clinicians and researchers in choosing the appropriate fitness- or muscle strength-related outcome measures for these patients. Through a Delphi survey, we aimed to identify a candidate core-set of fitness and strength tests for children and adults with IIM. METHODS Fifteen experts participated in a Delphi survey that consisted of five stages to achieve a consensus. Using an extensive search of published literature and through the expertise of the experts, a candidate core-set based on expert opinion and clinimetric properties was developed. Members of the International Myositis Assessment and Clinical Studies Group (IMACS) were invited to review this candidate core-set during the final stage, which led to a final candidate core-set. RESULTS A core-set of fitness- and strength-related outcome measures was identified for children and adults with IIM. For both children and adults, different tests were identified and selected for maximal aerobic fitness, submaximal aerobic fitness, anaerobic fitness, muscle strength tests and muscle function tests. CONCLUSIONS The core-set of fitness and strength-related outcome measures provided by this expert consensus process will assist practitioners and researchers in deciding which tests to use in IIM patients. This will improve the uniformity of fitness and strength tests across studies, thereby facilitating the comparison of study results and therapeutic exercise program outcomes among patients with IIM. PMID:26568594
The History of Early Polar Ice Cores
2008-01-01
...different aspects of the research discussed herein; they include J. Weertman, R. Rutford, G. Denton, H. Ueda, J. Brown, G. Frankenstein, B. Stauffer, H... ...continuously measure micro-particle concentrations and eruptive volcanic-acid horizons by the new electrical conductivity method (ECM) invented by...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gryzinski, M.A.; Maciak, M.
MARIA reactor is an open-pool research reactor, which gives the chance to install a uranium fission converter on the periphery of the core. It could be installed far enough away not to perturb the reactivity of the core, but close enough to produce a high flux of fast neutrons. A special design of the converter is now under construction. It is planned to set up the research stand based on such a uranium converter in the near future: in 2015 the MARIA reactor infrastructure should be ready (preparation started in 2013), in 2016 the neutron beam starts, and in 2017 the stand opens for material and biological research or for medical training concerning BNCT. Unused for many years, horizontal channel number H2 at the MARIA research reactor in Poland is going to be prepared as part of a unique stand. The characteristics of the neutron beam will be a significant advantage of the facility. A high flux of neutrons at the level of 2x10{sup 9} cm{sup -2}s{sup -1} will be obtainable by a uranium neutron converter located 90 cm from the reactor core fuel elements (still inside the reactor core basket, between the so-called core reflectors). Through the reaction of core neutrons with the converter's U{sub 3}Si{sub 2} material, it will produce a high flux of fast neutrons. After conversion, neutrons will be collimated and moderated in the channel by a special set of filters and moderators. At the end of the H2 channel, i.e., at the entrance to the research room, the neutron energy will be in the epithermal range with a neutron intensity at least at the level required for BNCT (2x10{sup 9} cm{sup -2}s{sup -1}). For other purposes, the density of the neutron flux could be smaller. The possibility of changing the type and number of installed filters/moderators, which enables different properties of the beam (neutron energy spectrum, neutron-gamma ratio, and beam profile and shape), is taken into account. The H2 channel is located in a separate room, which is adjacent to two other empty rooms under preparation as research laboratories (200 m2).
It is planned to create a fully equipped complex facility in which various experiments can be performed on the intensive neutron beam. The epithermal neutron beam enables development across the full spectrum of materials research, for example shielding concrete tests or improvements in the construction of electronic devices. Due to recent reports on the construction of accelerators for Boron Neutron Capture Therapy (BNCT), this therapy has the opportunity to become a useful and successful method in the fight against brain and other types of cancers not treatable with well-known medical methods. In Europe there is no such epithermal neutron source that could be used throughout the year for training and research by scientists working on BNCT, which makes the stand unique in Europe. Also, our research group, which specializes in mixed radiation dosimetry around nuclear and medical facilities, would be able to carry out research on new detectors and methods of measurement for radiological protection and in-beam (therapeutic) dosimetry. Another group of scientists from the National Centre for Nuclear Research, where the MARIA research reactor is located, is involved in research on gamma detector systems. There is an idea to develop Prompt-gamma Single Photon Emission Computed Tomography (Pg-SPECT). This method could be used as an imaging system for compounds emitting gamma rays after nuclear reactions with thermal neutrons, e.g., for boron concentration in BNCT. Inside the room where the H2 channel is located there is another horizontal channel, H1, which is also unused. Simultaneously with the construction of the H2 stand, it will be possible to create a special pneumatic horizontal mail system inside the H1 channel for irradiating material samples in the vicinity of the core, i.e., in the distal part of the H1 channel. This may expand the scope of research at the planned neutron station.
Secondly, it is planned to equip both stands with a moveable positioning system, a video system, and facilities to perform animal experiments (anaesthesia, vital signs control, imaging devices, positioning). All of the above make the constructed station unique in the world (a uranium fission converter-based beam) and the only neutron beam of such intensity in Europe. Moreover, implementation of the station would allow the development of research on a number of issues for researchers from all over Europe. One very important advantage of the station is undisturbed exploitation of the reactor and the other vertical and horizontal channels. The MARIA reactor operates 6000 hours per year, and that amount of time will be available for research on the neutron station. It has to be underlined that the new neutron station will work in parallel with all other ventures. (authors)
Chuter, V H; de Jonge, X A K Janse; Thompson, B M; Callister, R
2015-03-01
Poor core stability is linked to a range of musculoskeletal pathologies and core-strengthening programmes are widely used as treatment. Treatment outcomes, however, are highly variable, which may be related to the method of delivery of core strengthening programmes. We investigated the effect of identical 8 week core strengthening programmes delivered as either supervised or home-based on measures of core stability. Participants with poor core stability were randomised into three groups: supervised (n=26), home-based (n=26) or control (n=26). Primary outcomes were the Sahrmann test and the Star Excursion Balance Test (SEBT) for dynamic core stability and three endurance tests (side-bridge, flexor and Sorensen) for static core stability. The exercise programme was devised and supervised by an exercise physiologist. Analysis of covariance on the change from baseline over the 8 weeks showed that the supervised group performed significantly better on all core stability measures than both the home-based and control group. The home-based group produced significant improvements compared to the control group in all static core stability tests, but not in most of the dynamic core stability tests (Sahrmann test and two out of three directions of the SEBT). Our results support the use of a supervised core-strengthening programme over a home-based programme to maximise improvements in core stability, especially in its dynamic aspects. Based on our findings in healthy individuals with low core stability, further research is recommended on potential therapeutic benefits of supervised core-strengthening programmes for pathologies associated with low core stability. ACTRN12613000233729. Published by the BMJ Publishing Group Limited.
Global informetric perspective studies on translational medical research
2013-01-01
Background Translational medical research literature has increased rapidly in the last few decades and played a more and more important role during the development of medicine science. The main aim of this study is to evaluate the global performance of translational medical research during the past few decades. Methods Bibliometric, social network analysis, and visualization technologies were used for analyzing translational medical research performance from the aspects of subject categories, journals, countries, institutes, keywords, and MeSH terms. Meanwhile, the co-author, co-words and cluster analysis methods were also used to trace popular topics in translational medical research related work. Results Research output suggested a solid development in translational medical research, in terms of increasing scientific production and research collaboration. We identified the core journals, mainstream subject categories, leading countries, and institutions in translational medical research. There was an uneven distribution of publications at authorial, institutional, and national levels. The most commonly used keywords that appeared in the articles were “translational research”, “translational medicine”, “biomarkers”, “stroke”, “inflammation”, “cancer”, and “breast cancer”. Conclusions The subject categories of “Research & Experimental Medicine”, “Medical Laboratory Technology”, and “General & Internal Medicine” play a key role in translational medical research both in production and in its networks. Translational medical research, CTS, and similar journals are core journals of translational research. G7 countries are the leading nations for translational medical research. Some developing countries, such as P.R. China, also play an important role in the communication of translational research. The USA and its institutions play a dominant role in the production, collaboration, citations and high quality articles.
The research trends in translational medical research involve drug design and development, pathogenesis and treatment of disease, disease model research, evidence-based research, and stem and progenitor cells. PMID:23885955
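The co-word analysis behind bibliometric maps like the one above reduces to counting how often keyword pairs appear together across articles. The sketch below is a minimal, self-contained illustration; the keyword lists are invented (only a few terms echo the abstract), whereas the actual study mined thousands of database records.

```python
from collections import Counter
from itertools import combinations

# Invented keyword lists for a handful of hypothetical articles.
articles = [
    ["translational research", "biomarkers", "cancer"],
    ["translational medicine", "stroke", "inflammation"],
    ["translational research", "biomarkers", "breast cancer"],
    ["translational research", "inflammation", "cancer"],
]

# Co-word analysis: count co-occurrences of each unordered keyword pair.
# Sorting each article's keywords makes (a, b) and (b, a) the same key.
pair_counts = Counter()
for keywords in articles:
    for a, b in combinations(sorted(set(keywords)), 2):
        pair_counts[(a, b)] += 1

# Frequently co-occurring pairs become the strong edges of the co-word map.
print(pair_counts[("biomarkers", "translational research")])  # 2
```

In a full analysis these pair counts form the weighted adjacency matrix that clustering and visualization tools (e.g., the SNA software used in such studies) turn into research-topic maps.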
Skirton, Heather; Barnoy, Sivia; Ingvoldstad, Charlotta; van Kessel, Ingrid; Patch, Christine; O'Connor, Anita; Serra-Juhe, Clara; Stayner, Barbara; Voelckel, Marie-Antoinette
2013-10-01
Genetic counsellors have been working in some European countries for at least 30 years. Although there are great disparities between the numbers, education, practice and acceptance of these professionals across Europe, it is evident that genetic counsellors and genetic nurses in Europe are working autonomously within teams to deliver patient care. The aim of this study was to use the Delphi research method to develop a core curriculum to guide the educational preparation of these professionals in Europe. The Delphi method enables the researcher to utilise the views and opinions of a group of recognised experts in the field of study; this study consisted of four phases. Phases 1 and 4 consisted of expert workshops, whereas data were collected in phases 2 and 3 (n=35) via online surveys. All participants in the study were considered experts in the field of genetic counselling. The topics considered essential for genetic counsellor training have been organised under the following headings: (1) counselling; (2) psychological issues; (3) medical genetics; (4) human genetics; (5) ethics, law and sociology; (6) professional practice; and (7) education and research. Each topic includes the knowledge, skills and attitudes required to enable genetic counsellors to develop competence. In addition, it was considered by the experts that clinical practice should comprise 50% of the educational programme. The core Master programme curriculum will enable current courses to be assessed and inform the design of future educational programmes for European genetic counsellors.
Zhang, Tiejun; Bai, Gang; Han, Yanqi; Xu, Jun; Gong, Suxiao; Li, Yazhuo; Zhang, Hongbing; Liu, Changxiao
2018-05-15
The quality of traditional Chinese medicine (TCM) plays a critical role in the TCM industry. Rapid development of TCM pharmaceutical areas is, however, greatly limited, since many issues concerning the quality of TCM have not been resolved. The core concept of TCM quality, as well as the characteristics of TCM, was discussed in order to guide the quality research and evaluation of TCM and further improve the level of TCM quality control. In this review, on the basis of a systematic analysis of the fundamental properties and features of TCM in clinical application, the approaches and methods of quality marker (Q-marker) study were proposed through a combination of the transitivity and traceability of the essentials of quality, the correlation between chemical ingredients and drug property/efficacy, as well as analysis of the endemicity of ingredients sharing similar pharmacophylogenetic and biosynthetic pathways. The approaches and methods of Q-marker study were proposed, and a novel integrated pattern for quality assessment and control of TCM was established. The core concept of the Q-marker has helped to break through the bottleneck of the currently fragmented quality research of TCM and improved the scientificity, integrity, and systematic nature of quality control. Copyright © 2018 Elsevier GmbH. All rights reserved.
ERIC Educational Resources Information Center
Eaton, Judy; Long, Jennifer; Morris, David
2018-01-01
We developed a course, as part of our institution's core program, which provides students with a foundation in academic literacy in the social sciences: how to find, read, critically assess, and communicate about social science research. It is not a research methods course; rather, it is intended to introduce students to the social sciences and be…
Super-nodal methods for space-time kinetics
NASA Astrophysics Data System (ADS)
Mertyurek, Ugur
The purpose of this research has been to develop an advanced Super-Nodal method to reduce the run time of 3-D core neutronics models, such as in the NESTLE reactor core simulator and FORMOSA nuclear fuel management optimization codes. Computational performance of the neutronics model is increased by reducing the number of spatial nodes used in the core modeling. However, as the number of spatial nodes decreases, the error in the solution increases. The Super-Nodal method reduces the error associated with the use of coarse nodes in the analyses by providing a new set of cross sections and ADFs (Assembly Discontinuity Factors) for the new nodalization. These so-called homogenization parameters are obtained by employing a consistent collapsing technique. During this research a new type of singularity, namely the "fundamental mode singularity", is addressed in the ANM (Analytical Nodal Method) solution. The "Coordinate Shifting" approach is developed as a method to address this singularity. Also, the "Buckling Shifting" approach is developed as an alternative and more accurate method to address the zero buckling singularity, which is a more common and well-known singularity problem in the ANM solution. In the course of addressing the treatment of these singularities, an effort was made to provide better and more robust results from the Super-Nodal method by developing several new methods for determining the transverse leakage and collapsed diffusion coefficient, which generally are the two main approximations in the ANM methodology. Unfortunately, the proposed new transverse leakage and diffusion coefficient approximations failed to provide a consistent improvement to the current methodology. However, improvement in the Super-Nodal solution is achieved by updating the homogenization parameters at several time points during a transient. The update is achieved by employing a refinement technique similar to pin-power reconstruction.
A simple error analysis based on the relative residual in the 3-D few group diffusion equation at the fine mesh level is also introduced in this work.
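The relative-residual check described in the last sentence can be illustrated on a toy problem. The sketch below is an illustrative assumption, not the dissertation's code: it assembles a 1-D one-group finite-difference diffusion operator M and fission operator F, solves the k-eigenvalue problem M phi = (1/k) F phi by power iteration, and evaluates the fine-mesh relative residual ||M phi - F phi / k|| / ||F phi||.

```python
import numpy as np

def assemble_one_group(n, h, D, sig_a, nu_sig_f):
    # Tridiagonal finite-difference diffusion operator M (zero-flux
    # boundaries) and fission operator F for a 1-D one-group problem.
    M = np.zeros((n, n))
    for i in range(n):
        M[i, i] = 2.0 * D / h**2 + sig_a
        if i > 0:
            M[i, i - 1] = -D / h**2
        if i < n - 1:
            M[i, i + 1] = -D / h**2
    F = nu_sig_f * np.eye(n)
    return M, F

def power_iteration(M, F, tol=1e-12, max_it=1000):
    # Standard source (power) iteration for the k-eigenvalue problem
    # M phi = (1/k) F phi.
    phi = np.ones(M.shape[0])
    k = 1.0
    for _ in range(max_it):
        phi_new = np.linalg.solve(M, F @ phi / k)
        k_new = k * (F @ phi_new).sum() / (F @ phi).sum()
        converged = abs(k_new - k) < tol
        phi, k = phi_new, k_new
        if converged:
            break
    return k, phi / np.linalg.norm(phi)

def relative_residual(M, F, k, phi):
    # Relative residual of the discretized diffusion equation at the
    # fine-mesh level, used as a simple error indicator.
    r = M @ phi - (F @ phi) / k
    return np.linalg.norm(r) / np.linalg.norm(F @ phi)
```

With illustrative one-group constants, the converged residual is small on the fine mesh; applied to a coarse-node solution prolonged back to the fine mesh, the same quantity would indicate the homogenization error.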
2017-01-01
Background To be meaningful, a core outcome set (COS) should be relevant to all stakeholders including patients and carers. This review aimed to explore the methods by which patients and carers have been included as participants in COS development exercises and, in particular, the use and reporting of qualitative methods. Methods In August 2015, a search of the Core Outcomes Measures in Effectiveness Trials (COMET) database was undertaken to identify papers involving patients and carers in COS development. Data were extracted to identify the data collection methods used in COS development, the number of health professionals, patients and carers participating in these, and the reported details of qualitative research undertaken. Results Fifty-nine papers reporting patient and carer participation were included in the review, ten of which reported using qualitative methods. Although patients and carers participated in outcome elicitation for inclusion in COS processes, health professionals tended to dominate the prioritisation exercises. Of the ten qualitative papers, only three were reported as a clear pre-designed part of a COS process. Qualitative data were collected using interviews, focus groups or a combination of these. None of the qualitative papers reported an underpinning methodological framework, and details regarding data saturation, reflexivity and resource use associated with data collection were often poorly reported. Five papers reported difficulty in achieving a diverse sample of participants, and two reported that a large and varied range of outcomes was often identified by participants, making subsequent rating and ranking difficult. Conclusions Consideration of the best way to include patients and carers throughout the COS development process is needed.
Additionally, further work is required to assess the potential role of qualitative methods in COS, to explore the knowledge produced by different qualitative data collection methods, and to evaluate the time and resources required to incorporate qualitative methods into COS development. PMID:28301485
Progress on core outcome sets for critical care research.
Blackwood, Bronagh; Marshall, John; Rose, Louise
2015-10-01
Appropriate selection and definition of outcome measures are essential for clinical trials to be maximally informative. Core outcome sets (an agreed, standardized collection of outcomes measured and reported in all trials for a specific clinical area) were developed due to established inconsistencies in trial outcome selection. This review discusses the rationale for, and methods of, core outcome set development, as well as current initiatives in critical care. Recent systematic reviews of reported outcomes and measurement instruments relevant to the critically ill highlight inconsistencies in outcome selection, definition, and measurement, thus establishing the need for core outcome sets. Current critical care initiatives include development of core outcome sets for trials aimed at reducing mechanical ventilation duration; rehabilitation following critical illness; long-term outcomes in acute respiratory failure; and epidemic and pandemic studies of severe acute respiratory infection. Development and utilization of core outcome sets for studies relevant to the critically ill is in its infancy compared to other specialties. Notwithstanding, core outcome set development frameworks and guidelines are available, several sets are in various stages of development, and there is strong support from international investigator-led collaborations including the International Forum for Acute Care Trialists.
Colditz, Graham A.; Dobbins, Maureen; Emmons, Karen M.; Kerner, Jon F.; Padek, Margaret; Proctor, Enola K.; Stange, Kurt C.
2015-01-01
Abstract Background This paper reports core competencies for dissemination and implementation (D&I) grant application writing and provides tips for writing a successful proposal. Methods Two related phases were used to collect the data: a card sorting process among D&I researchers and an expert review among a smaller set of researchers. Card sorting was completed by 123 respondents. In the second phase, a series of grant application writing tips were developed based on the combined 170 years of grant review experience of the writing team. Results The card sorting resulted in 12 core competencies for D&I grant application writing that covered the main sections in a grant application to the US National Institutes of Health: (a) specific aims that provide clear rationale, objectives, and an overview of the research plan; (b) significance that frames and justifies the importance of a D&I question; (c) innovation that articulates novel products and new knowledge; and (d) approach that uses a relevant D&I model, addresses measurement and the D&I context, and includes an analysis plan well‐tied to the aims and measures. Conclusions Writing a successful D&I grant application is a skill that can be learned with experience and attention to the core competencies articulated in this paper. PMID:26577630
The quest for connection in interpersonal and therapeutic relationships.
Wiseman, Hadas
2017-07-01
This paper focuses on the need for connection as a common core theme at the heart of both close relationships and therapeutic relationships and explores ways to connect these two research domains that have evolved as separate fields of study. Bowlby's attachment theory provides a strong conceptual and empirical base for linking human bonds and bonds in psychotherapy. The growing body of research intersecting attachment and psychotherapy (1980-2014) is documented, and meta-analytic studies on attachment-outcome and attachment-alliance links are highlighted. Five ways of studying attachment as a variable in psychotherapy are underscored: as moderator, as mediator, as outcome, client-therapist attachment match, and as process. By integrating conceptualizations and methods in studying relational narratives of client-therapist dyads (Core Conflictual Relationship Theme), measures of alliance, and client attachment to therapist during psychotherapy, we may discover unique client-therapist relational dances. Future fine-grained studies on how to promote core authentic relational relearning are important to clinicians, supervisors and trainers, who all share the common quest to alleviate interpersonal distress and enhance wellbeing. Directions for advancing research on interpersonal and therapeutic relationships are suggested. Learning from each other, both researchers of close relationships and of psychotherapy relationships can gain a deeper and multidimensional understanding of complex relational processes and outcomes.
Core data elements tracking elder sexual abuse.
Hanrahan, Nancy P; Burgess, Ann W; Gerolamo, Angela M
2005-05-01
Sexual abuse in the older adult population is an understudied vector of violent crimes with significant physical and psychological consequences for victims and families. Research requires a theoretical framework that delineates core elements using a standardized instrument. To develop a conceptual framework and identify core data elements specific to the older adult population, clinical, administrative, and criminal experts were consulted using a nominal group method to revise an existing sexual assault instrument. The revised instrument could be used to establish a national database of elder sexual abuse. The database could become a standard reference to guide the detection, assessment, and prosecution of elder sexual abuse crimes as well as build a base from which policy makers could plan and evaluate interventions that targeted risk factors.
Hardware/software codesign for embedded RISC core
NASA Astrophysics Data System (ADS)
Liu, Peng
2001-12-01
This paper describes a hardware/software codesign method for the extensible embedded RISC core VIRGO, which is based on the MIPS-I instruction set architecture. VIRGO is described in the Verilog hardware description language, has a five-stage pipeline with a shared 32-bit cache/memory interface, and is controlled by a distributed control scheme. Every pipeline stage has one small controller, which manages the stage's status and the cooperation among pipeline phases. Because the description uses a high-level language and the control structure is distributed, the VIRGO core is highly extensible and can meet the requirements of the application. Taking the high-definition television MPEG2 MP@HL decoder chip as a case study, we constructed a hardware/software codesign virtual prototyping machine that supports research on the VIRGO core instruction set architecture, system-on-chip memory size requirements, system-on-chip software, and related issues. The system-on-chip design and the RISC instruction set can also be evaluated on the virtual prototyping platform.
Campbell, Malcolm; Gibson, Will; Hall, Andy; Richards, David; Callery, Peter
2008-05-01
Web-based technologies are increasingly being used to create modes of online learning for nurses, but their effect has not been assessed in nurse education. The aim was to assess whether participation in face-to-face discussion seminars or online asynchronous discussion groups had different effects on educational attainment in a web-based course. A non-randomised, quasi-experimental design was used with two groups: students choosing to have face-to-face discussion seminars and students choosing to have online discussions. The setting was the Core Methods module of a postgraduate research methods course; participants were all 114 students in the first 2 years during which the course teaching material was delivered online. The outcome was the assignment mark for the Core Methods module. Background details of the students, their choices of modules and assignment marks were collected as part of the routine course administration. Students' online activities were identified using the student tracking facility within WebCT. Regression models were fitted to explore the association between available explanatory variables and assignment mark. Students choosing online discussions had a higher Core Methods assignment mark (mean 60.8/100) than students choosing face-to-face discussions (54.4); the difference was statistically significant (t=3.13, df=102, p=0.002), although this ignores confounding variables. Among online discussion students, assignment mark was significantly correlated with the numbers of discussion messages read (Kendall's tau(b)=0.22, p=0.050) and posted (Kendall's tau(b)=0.27, p=0.017); among face-to-face discussion students, it was significantly associated with the number of non-discussion hits in WebCT (Kendall's tau(b)=0.19, p=0.036). In regression analysis, choice of discussion method, whether an M.Phil./Ph.D.
student, number of non-discussion hits in WebCT, number of online discussion messages read and number posted were associated with assignment mark at the 5% level of significance when taken singly; in combination, only whether an M.Phil./Ph.D. student (p=0.024) and number of non-discussion hits (p=0.045) retained significance. This study demonstrates that a research methods course can be delivered to postgraduate healthcare students at least as successfully by an entirely online method in which students participate in online discussion as by a blended method in which students accessing web-based teaching material attend face-to-face seminar discussions. Increased online activity was associated with higher assignment marks. The study highlights new opportunities for educational research that arise from the use of virtual learning environments that routinely record the activities of learners and tutors.
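Kendall's tau-b correlations of the kind reported above (e.g. between messages read and assignment mark) can be computed directly. A minimal pure-Python sketch with tie correction; the sample data below are hypothetical, not the study's data:

```python
from itertools import combinations
from collections import Counter
from math import sqrt

def kendall_tau_b(x, y):
    # Tau-b: (concordant - discordant) pairs, corrected for ties in
    # each variable: tau_b = (C - D) / sqrt((n0 - n1) * (n0 - n2)).
    assert len(x) == len(y)
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n = len(x)
    n0 = n * (n - 1) // 2                                   # all pairs
    n1 = sum(c * (c - 1) // 2 for c in Counter(x).values())  # ties in x
    n2 = sum(c * (c - 1) // 2 for c in Counter(y).values())  # ties in y
    denom = sqrt((n0 - n1) * (n0 - n2))
    return (concordant - discordant) / denom if denom else 0.0
```

In practice one would pass, say, per-student message counts and marks; the tie correction matters here because activity counts are discrete and often repeated.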
Crawley-Low, Jill
2006-01-01
Objective: Bibliometric techniques were used to analyze the citation patterns of researchers publishing in the American Journal of Veterinary Research (AJVR). Methods: The more than 25,000 bibliographic references appearing in the AJVR from 2001 to 2003 were examined for material type, date of publication, and frequency of journals cited. Journal titles were ranked in decreasing order of productivity to create a core list of journals most frequently used by veterinary medical researchers. Results: The majority of items cited were journals (88.8%), followed by books (9.8%) and gray literature (2.1%). Current sources of information were favored; 65% of the journals and 77% of the books were published in 1990 or later. Dividing the cited articles into 3 even zones revealed that 24 journals produced 7,361 cited articles in the first zone. One hundred thirty-nine journals were responsible for 7,414 cited articles in zone 2, and 1,409 journals produced 7,422 cited articles in zone 3. Conclusions: A core collection of veterinary medicine journals would include 49 veterinary medicine journals from zones 1 and 2. Libraries supporting a veterinary curriculum or veterinary research should also include veterinary medical journals from zone 3, as well as provide access to journals in non-veterinary subjects such as biochemistry, virology, orthopedics, and surgery and a selection of general science and medical journals. PMID:17082835
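The three-zone (Bradford-style) partition described above can be reproduced by ranking journals by productivity and cutting the cumulative citation count into equal thirds. A small sketch with hypothetical journal counts, not the AJVR data:

```python
def bradford_zones(journal_counts, n_zones=3):
    # journal_counts: dict mapping journal title -> cited-article count.
    # Rank journals by decreasing productivity, then close a zone each
    # time the running total passes another 1/n_zones of all citations.
    ranked = sorted(journal_counts.items(), key=lambda kv: -kv[1])
    total = sum(c for _, c in ranked)
    target = total / n_zones
    zones, current, cum = [], [], 0
    for journal, count in ranked:
        current.append(journal)
        cum += count
        if cum >= target * (len(zones) + 1) and len(zones) < n_zones - 1:
            zones.append(current)
            current = []
    zones.append(current)
    return zones
```

The characteristic Bradford pattern is a small first zone (few highly productive titles) and a long tail in the last zone, exactly as in the 24 / 139 / 1,409 journal split reported above.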
[Research progress on mechanical performance evaluation of artificial intervertebral disc].
Li, Rui; Wang, Song; Liao, Zhenhua; Liu, Weiqiang
2018-03-01
The mechanical properties of an artificial intervertebral disc (AID) are related to the long-term reliability of the prosthesis. Three testing methods, based on different tools, are involved in the mechanical performance evaluation of AIDs: testing with a mechanical simulator, in vitro specimen testing and finite element analysis. In this study, the testing standards, testing equipment and materials for AIDs are first introduced. Then the present status of AID static mechanical property tests (static axial compression, static axial compression-shear), dynamic mechanical property tests (dynamic axial compression, dynamic axial compression-shear), creep and stress relaxation tests, device pushout tests, core pushout tests, subsidence tests, etc. is reviewed. The experimental techniques used in in vitro specimen testing and the test results for available artificial discs are summarized, as are the experimental methods and research status of finite element analysis. Finally, research trends in AID mechanical performance evaluation are forecast: the simulator, load, dynamic cycle, motion mode, specimen and test standard will be important research fields in the future.
Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro
2012-10-15
There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing a large number of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using the maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. http://www.megasoftware.net/.
Torous, John
2017-01-01
Research studies that leverage emerging technologies, such as passive sensing devices and mobile apps, have demonstrated encouraging potential with respect to favorably influencing the human condition. As a result, the nascent fields of mHealth and digital medicine have gained traction over the past decade as demonstrated in the United States by increased federal funding for research that cuts across a broad spectrum of health conditions. The existence of mHealth and digital medicine also introduced new ethical and regulatory challenges that both institutional review boards (IRBs) and researchers are struggling to navigate. In response, the Connected and Open Research Ethics (CORE) initiative was launched. The CORE initiative has employed a participatory research approach, whereby researchers and IRB affiliates are involved in identifying the priorities and functionality of a shared resource. The overarching goal of CORE is to develop dynamic and relevant ethical practices to guide mHealth and digital medicine research. In this Viewpoint paper, we describe the CORE initiative and call for readers to join the CORE Network and contribute to the bigger conversation on ethics in the digital age. PMID:28179216
ACT-CCREC Core Research Program: Study Questions and Design. ACT Working Paper Series. WP-2015-01
ERIC Educational Resources Information Center
Cruce, Ty M.
2015-01-01
This report provides a non-technical overview of the guiding research questions and research design for the ACT-led core research program conducted on behalf of the GEAR UP College and Career Readiness Evaluation Consortium (CCREC). The core research program is a longitudinal study of the effectiveness of 14 GEAR UP state grants on the academic…
NASA Astrophysics Data System (ADS)
Ryang, Woo Hun; Han, Jooyoung
2017-04-01
Geoacoustic models provide submarine environmental data to predict sound transmission through submarine bottom layers of sedimentary strata and acoustic basement. This study reconstructed four geoacoustic models for sediments 50 m thick at the Jeongdongjin area on the western continental margin of the East Sea. The bottom models were based on about 1100 line-km of high-resolution air-gun seismic and subbottom profiler (SBP) data together with sediment cores. Four piston cores were analyzed for reconstruction of the bottom and geoacoustic models in the study area, together with two long cores from the adjacent area. P-wave speed in the core sediments was measured by the pulse transmission technique, with the resonance frequency of the piezoelectric transducers maintained at 1 MHz. Measurements of 42 P-wave speeds and 41 attenuations were made on three core sediments. For actual modeling, the P-wave speeds of the models were compensated to in situ depth below the sea floor using the Hamilton method. These geoacoustic models of coastal bottom strata will be used for geoacoustic and underwater acoustic experiments reflecting the vertical and lateral variability of geoacoustic properties in the Jeongdongjin area of the East Sea. Keywords: geoacoustic model, bottom model, P-wave speed, Jeongdongjin, East Sea. Acknowledgements: This research was supported by research grants from the Agency for Defense Development (UD140003DD and UE140033DD).
NASA Astrophysics Data System (ADS)
Damahuri, Abdul Hannan Bin; Mohamed, Hassan; Aziz Mohamed, Abdul; Idris, Faridah
2018-01-01
Thorium is one of the elements that need to be explored for nuclear fuel research and development. One of the popular core configurations for thorium fuel is the seed-blanket configuration, also known as the Radkowsky Thorium Fuel concept. The seed, placed at the centre of the core, acts as a supplier of neutrons. The blanket, located at the outermost region of the core, is the consumer of neutrons. In this work, a neutronic analysis of the seed-blanket configuration for the TRIGA PUSPATI Reactor (RTP) is carried out using the Monte Carlo method. The reactor, which has been operated since 1982, uses uranium zirconium hydride (U-ZrH1.6) fuel with uranium loadings of 8.5, 12 and 20 wt.%. The pool-type reactor is the only research reactor located in Malaysia. In the core design, uranium zirconium hydride located at the centre of the core acts as the seed to supply neutrons, while thorium oxide situated outside the seed region acts as the blanket, receiving neutrons to transmute 232Th to 233U. The neutron multiplication factor (criticality) of each configuration is estimated. Results show that the highest initial criticality achieved is 1.30153.
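The neutron multiplication factor estimated by Monte Carlo methods can be illustrated with a toy, one-generation analog simulation: k is the ratio of neutrons produced by fission to neutrons started. The probabilities and fission yield below are illustrative assumptions only, not RTP data:

```python
import random

def estimate_k(n_neutrons, p_not_leak, p_fission_given_abs, nu, seed=1):
    # One-generation analog Monte Carlo estimate of the neutron
    # multiplication factor k = neutrons produced / neutrons started.
    # Each source neutron either leaks or is absorbed; an absorption
    # causes fission with some probability, yielding nu new neutrons
    # on average.
    rng = random.Random(seed)
    produced = 0.0
    for _ in range(n_neutrons):
        if rng.random() < p_not_leak:               # stays in the core
            if rng.random() < p_fission_given_abs:  # absorption fissions
                produced += nu                      # mean fission yield
    return produced / n_neutrons
```

A production code such as those used for seed-blanket analyses tracks neutron positions, energies and region-dependent cross sections; this sketch only shows the generational bookkeeping behind the k estimate.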
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jochen, J.E.; Hopkins, C.W.
1993-12-31
Contents: Naturally fractured reservoir description; Geologic considerations; Shale-specific log model; Stress profiles; Berea research; Benefits analysis; Summary of technologies; Novel well test methods; Natural fracture identification; Reverse drilling; Production data analysis; Fracture treatment quality control; Novel core analysis methods; and Shale well cleanouts.
Logging-while-coring method and apparatus
Goldberg, David S.; Myers, Gregory J.
2007-11-13
A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.
Logging-while-coring method and apparatus
Goldberg, David S.; Myers, Gregory J.
2007-01-30
A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.
Jones, Janet E; Jones, Laura L; Keeley, Thomas J H; Calvert, Melanie J; Mathers, Jonathan
2017-01-01
To be meaningful, a core outcome set (COS) should be relevant to all stakeholders including patients and carers. This review aimed to explore the methods by which patients and carers have been included as participants in COS development exercises and, in particular, the use and reporting of qualitative methods. In August 2015, a search of the Core Outcomes Measures in Effectiveness Trials (COMET) database was undertaken to identify papers involving patients and carers in COS development. Data were extracted to identify the data collection methods used in COS development, the number of health professionals, patients and carers participating in these, and the reported details of qualitative research undertaken. Fifty-nine papers reporting patient and carer participation were included in the review, ten of which reported using qualitative methods. Although patients and carers participated in outcome elicitation for inclusion in COS processes, health professionals tended to dominate the prioritisation exercises. Of the ten qualitative papers, only three were reported as a clear pre-designed part of a COS process. Qualitative data were collected using interviews, focus groups or a combination of these. None of the qualitative papers reported an underpinning methodological framework, and details regarding data saturation, reflexivity and resource use associated with data collection were often poorly reported. Five papers reported difficulty in achieving a diverse sample of participants, and two reported that a large and varied range of outcomes was often identified by participants, making subsequent rating and ranking difficult. Consideration of the best way to include patients and carers throughout the COS development process is needed.
Additionally, further work is required to assess the potential role of qualitative methods in COS, to explore the knowledge produced by different qualitative data collection methods, and to evaluate the time and resources required to incorporate qualitative methods into COS development.
NASA Astrophysics Data System (ADS)
Popov, Evgeny; Popov, Yury; Spasennykh, Mikhail; Kozlova, Elena; Chekhonin, Evgeny; Zagranovskaya, Dzhuliya; Belenkaya, Irina; Alekseev, Aleksey
2016-04-01
A practical method for identifying organic-rich intervals within low-permeability dispersive rocks, based on thermal conductivity measurements along the core, is presented. Non-destructive, non-contact thermal core logging was performed with the optical scanning technique on 4,685 full-size core samples from 7 wells drilled in four low-permeability zones of the Bazhenov formation (B.fm.) in Western Siberia (Russia). The method employs continuous simultaneous measurements of rock thermal conductivity, volumetric heat capacity, thermal anisotropy coefficient and thermal heterogeneity factor along the cores, allowing high vertical resolution (up to 1-2 mm). The B.fm. rock matrix thermal conductivity was observed to be essentially stable within the range of 2.5-2.7 W/(m*K); this stable matrix thermal conductivity, along with a high thermal anisotropy coefficient, is characteristic of B.fm. sediments due to their low porosity. It is shown experimentally that the measured thermal parameters relate linearly to organic richness rather than to deviations in the porosity coefficient. Thus, a new technique was developed that transforms the thermal conductivity profiles into continuous profiles of total organic carbon (TOC) values along the core. Comparison of TOC values estimated from thermal conductivity with pyrolytic TOC estimations of 665 core samples, obtained using the Rock-Eval and HAWK instruments, demonstrated the high efficiency of the new technique for separating organic-rich intervals. The data obtained with the new technique are essential for assessing source rock (SR) hydrocarbon generation potential, for basin and petroleum system modeling, and for estimating hydrocarbon reserves. The method also allows TOC richness to be assessed from thermal well logs. This research was done with the financial support of the Russian Ministry of Education and Science (unique identification number RFMEFI58114X0008).
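The transformation of thermal conductivity profiles into TOC profiles described above amounts to a linear calibration against pyrolysis data. A hedged sketch with hypothetical calibration values (in reality the coefficients would be fitted to the Rock-Eval/HAWK measurements, and the sign and magnitude of the slope are assumptions):

```python
def fit_linear(x, y):
    # Ordinary least squares for y = a*x + b on paired calibration data.
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

def toc_profile(conductivity_log, a, b):
    # Apply the calibration point-by-point to turn a continuous
    # thermal-conductivity profile into a continuous TOC profile.
    return [a * lam + b for lam in conductivity_log]
```

Calibration pairs (conductivity, pyrolytic TOC) from a subset of core samples fix a and b; the fitted line is then applied to the full high-resolution conductivity log.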
E-Assessment within the Bologna Paradigm: Evidence from Portugal
ERIC Educational Resources Information Center
Ferrao, Maria
2010-01-01
The Bologna Declaration brought reforms into higher education that imply changes in teaching methods, didactic materials and textbooks, infrastructures and laboratories, etc. Statistics and mathematics are disciplines that traditionally have the worst success rates, particularly in non-mathematics core curricula courses. This research project,…
Mineralization and nitrification patterns at eight northeastern USA forested research sites
Ross, D.S.; Lawrence, G.B.; Fredriksen, G.
2004-01-01
Nitrogen transformation rates in eight northeastern US research sites were measured in soil samples taken in the early season of 2000 and the late season of 2001. Net mineralization and nitrification rates were determined on Oa or A horizon samples by two different sampling methods: intact cores and repeated measurements on composite samples taken from around the cores. Net rates in the composite samples (n=30) showed three different temporal patterns: high net nitrification with minimal NH4+ accumulation, high net nitrification and high NH4+ accumulation, and minimal net nitrification and moderate NH4+ accumulation. The 4-week net rates in intact cores were about half that of the rates from the composite samples but were well related (R2 > 0.70). Composite samples from sites that exhibited high net nitrification were incubated with acetylene and net nitrification was completely stopped, suggesting an autotrophic pathway. Gross mineralization and nitrification (2000 only) rates were estimated using the isotope dilution technique. Gross rates of nitrification and consumption in intact cores were relatively low. Gross rates of mineralization and net rates of nitrification were both related to the soil C/N ratio, with higher rates generally occurring in sites containing Acer saccharum as a dominant or co-dominant species. The comparison of methods suggests that all provide a similar hierarchy of potential rates but that the degree of net nitrification is strongly influenced by the degree of sample disturbance. Differences between sites appear to be related to an interaction of soil (C/N) and vegetation (A. saccharum contribution) characteristics. © 2003 Elsevier B.V. All rights reserved.
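Net mineralization and nitrification rates of the kind compared above are conventionally computed from initial and final extractable inorganic N pools over the incubation period. A minimal sketch with hypothetical pool values (mg N per kg soil); the pool definitions follow common practice and are an assumption, not this study's exact protocol:

```python
def net_rate(initial_mg_kg, final_mg_kg, days):
    # Net transformation rate in mg N per kg soil per day.
    return (final_mg_kg - initial_mg_kg) / days

def net_mineralization_and_nitrification(nh4_i, no3_i, nh4_f, no3_f, days):
    # Net N mineralization: change in total inorganic N (NH4+ + NO3-);
    # net nitrification: change in the NO3- pool alone.
    mineralization = net_rate(nh4_i + no3_i, nh4_f + no3_f, days)
    nitrification = net_rate(no3_i, no3_f, days)
    return mineralization, nitrification
```

The "high net nitrification with minimal NH4+ accumulation" pattern above corresponds to NO3- rising while NH4+ stays flat or falls, so the two computed rates are nearly equal.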
Torous, John; Nebeker, Camille
2017-02-08
Research studies that leverage emerging technologies, such as passive sensing devices and mobile apps, have demonstrated encouraging potential with respect to favorably influencing the human condition. As a result, the nascent fields of mHealth and digital medicine have gained traction over the past decade as demonstrated in the United States by increased federal funding for research that cuts across a broad spectrum of health conditions. The existence of mHealth and digital medicine also introduced new ethical and regulatory challenges that both institutional review boards (IRBs) and researchers are struggling to navigate. In response, the Connected and Open Research Ethics (CORE) initiative was launched. The CORE initiative has employed a participatory research approach, whereby researchers and IRB affiliates are involved in identifying the priorities and functionality of a shared resource. The overarching goal of CORE is to develop dynamic and relevant ethical practices to guide mHealth and digital medicine research. In this Viewpoint paper, we describe the CORE initiative and call for readers to join the CORE Network and contribute to the bigger conversation on ethics in the digital age. ©John Torous, Camille Nebeker. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 08.02.2017.
2014-04-01
The model solves the Euler equations in flux form based on a hydrostatic pressure vertical coordinate, the same as that used in the Weather Research and Forecasting (WRF) Model, but with a hybrid sigma-pressure coordinate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dehart, Mark; Mausolff, Zander; Goluoglu, Sedat
This report summarizes university research activities performed in support of TREAT modeling and simulation research. It is a compilation of annual research reports from four universities: University of Florida, Texas A&M University, Massachusetts Institute of Technology and Oregon State University. The general research topics are, respectively, (1) 3-D time-dependent transport with TDKENO/KENO-VI, (2) implementation of the Improved Quasi-Static method in Rattlesnake/MOOSE for time-dependent radiation transport approximations, (3) improved treatment of neutron physics representations within TREAT using OpenMC, and (4) steady state modeling of the minimum critical core of the Transient Reactor Test Facility (TREAT).
Essential methodological considerations when using grounded theory.
Achora, Susan; Matua, Gerald Amandu
2016-07-01
To suggest important methodological considerations when using grounded theory. A research method widely used in nursing research is grounded theory, at the centre of which is theory construction. However, researchers still struggle with some of its methodological issues. Although grounded theory is widely used to study and explain issues in nursing practice, many researchers are still failing to adhere to its rigorous standards. Researchers should articulate the focus of their investigations - the substantive area of interest as well as the focal population. This should be followed by a succinct explanation of the strategies used to collect and analyse data, supported by clear coding processes. Finally, the resolution of the core issues, including the core category and related categories, should be explained to advance readers' understanding. Researchers should endeavour to understand the tenets of grounded theory. This enables 'neophytes' in particular to make methodological decisions that will improve their studies' rigour and fit with grounded theory. This paper complements the current dialogue on improving the understanding of grounded theory methodology in nursing research. The paper also suggests important procedural decisions researchers need to make to preserve their studies' scientific merit and fit with grounded theory.
Daudelin, Denise H; Selker, Harry P; Leslie, Laurel K
2015-12-01
There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. © 2015 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc.
Theory and methods in cultural neuroscience
Hariri, Ahmad R.; Harada, Tokiko; Mano, Yoko; Sadato, Norihiro; Parrish, Todd B.; Iidaka, Tetsuya
2010-01-01
Cultural neuroscience is an emerging research discipline that investigates cultural variation in psychological, neural and genomic processes as a means of articulating the bidirectional relationship of these processes and their emergent properties. Research in cultural neuroscience integrates theory and methods from anthropology, cultural psychology, neuroscience and neurogenetics. Here, we review a set of core theoretical and methodological challenges facing researchers when planning and conducting cultural neuroscience studies, and provide suggestions for overcoming these challenges. In particular, we focus on the problems of defining culture and culturally appropriate experimental tasks, comparing neuroimaging data acquired from different populations and scanner sites and identifying functional genetic polymorphisms relevant to culture. Implications of cultural neuroscience research for addressing current issues in population health disparities are discussed. PMID:20592044
Determination of Porosity in Shale by Double Headspace Extraction GC Analysis.
Zhang, Chun-Yun; Li, Teng-Fei; Chai, Xin-Sheng; Xiao, Xian-Ming; Barnes, Donald
2015-11-03
This paper reports on a novel method for the rapid determination of shale porosity by double headspace extraction gas chromatography (DHE-GC). Ground core samples of shale were placed into headspace vials, and DHE-GC measurements of released methane gas were performed at a given time interval. A linear correlation between shale porosity and the ratio of consecutive GC signals was established both theoretically and experimentally by comparison with the results of the standard helium pycnometry method. The results showed that (a) the porosity of ground core samples of shale can be measured within 30 min; (b) the new method is not significantly affected by the particle size of the sample; (c) the uncertainties of the porosities of nine shale samples measured by the present method range from 0.31 to 0.46 p.u.; and (d) the results obtained by the DHE-GC method are in good agreement with those from the standard helium pycnometry method. In short, the new DHE-GC method is simple, rapid, and accurate, making it a valuable tool for shale gas-related research and applications.
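The core of the DHE-GC approach is a linear map from the ratio of consecutive headspace extraction signals to porosity. The abstract does not give the calibration, so the slope and intercept below are hypothetical placeholders for illustration only, not values from the paper:

```python
# Sketch of the DHE-GC porosity estimate: porosity is assumed to be linear in
# the ratio of two consecutive headspace extraction GC signals. SLOPE and
# INTERCEPT are hypothetical placeholders, not the paper's calibration.

SLOPE = -120.0      # hypothetical calibration slope (p.u. per unit ratio)
INTERCEPT = 110.0   # hypothetical calibration intercept (p.u.)

def porosity_from_signals(first_extraction: float, second_extraction: float) -> float:
    """Estimate porosity (p.u.) from two consecutive GC methane signals."""
    if first_extraction <= 0:
        raise ValueError("first extraction signal must be positive")
    ratio = second_extraction / first_extraction
    return SLOPE * ratio + INTERCEPT

# Example: a second signal at 85% of the first gives a signal ratio of 0.85.
print(round(porosity_from_signals(1000.0, 850.0), 2))
```

With a real calibration against helium pycnometry, the same two-signal ratio would yield the porosity directly, which is what makes the method rapid.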
Exploring the Role of Agriculture Teachers in Core Academic Integration
ERIC Educational Resources Information Center
McKim, Aaron J.; Sorenson, Tyson J.; Velez, Jonathan J.
2016-01-01
Core academic skills are essential for success in our society. However, an abundance of research has identified that a large proportion of secondary school students are underperforming in core academic areas such as literacy and math. Researchers have suggested integrating core academic content throughout all secondary coursework as a potential…
Research Perspectives on Core French: A Literature Review
ERIC Educational Resources Information Center
Lapkin, Sharon; Mady, Callie; Arnott, Stephanie
2009-01-01
This article reviews the research literature on core French in three main areas: student diversity, delivery models for the core French program, and instructional approaches. These topics are put into context through a discussion of studies on community attitudes to French as a second language (FSL), dissatisfaction with core French outcomes and…
ERIC Educational Resources Information Center
Beggrow, Elizabeth P.; Ha, Minsu; Nehm, Ross H.; Pearl, Dennis; Boone, William J.
2014-01-01
The landscape of science education is being transformed by the new "Framework for Science Education" (National Research Council, "A framework for K-12 science education: practices, crosscutting concepts, and core ideas." The National Academies Press, Washington, DC, 2012), which emphasizes the centrality of scientific…
Strategies for Teaching Fractions: Using Error Analysis for Intervention and Assessment
ERIC Educational Resources Information Center
Spangler, David B.
2011-01-01
Many students struggle with fractions and must understand them before learning higher-level math. Veteran educator David B. Spangler provides research-based tools that are aligned with NCTM and Common Core State Standards. He outlines powerful diagnostic methods for analyzing student work and providing timely, specific, and meaningful…
Disaster and Contingency Planning for Scientific Shared Resource Cores.
Mische, Sheenah; Wilkerson, Amy
2016-04-01
Progress in biomedical research is largely driven by improvements, innovations, and breakthroughs in technology, accelerating the research process, and an increasingly complex collaboration of both clinical and basic science. This increasing sophistication has driven the need for centralized shared resource cores ("cores") to serve the scientific community. From a biomedical research enterprise perspective, centralized resource cores are essential to increased scientific, operational, and cost effectiveness; however, the concentration of instrumentation and resources in the cores may render them highly vulnerable to damage from severe weather and other disasters. As such, protection of these assets and the ability to recover from a disaster is increasingly critical to the mission and success of the institution. Therefore, cores should develop and implement both disaster and business continuity plans and be an integral part of the institution's overall plans. Here we provide an overview of key elements required for core disaster and business continuity plans, guidance, and tools for developing these plans, and real-life lessons learned at a large research institution in the aftermath of Superstorm Sandy.
Doohan, Isabelle; Björnstig, Ulf; Östlund, Ulrika; Saveman, Britt-Inger
2017-04-01
The aim of this study was to explore physical and mental consequences and injury mechanisms among bus crash survivors to identify aspects that influence recovery. The study participants were the total population of survivors (N=56) from a bus crash in Sweden. The study had a mixed-methods design that provided quantitative and qualitative data on injuries, mental well-being, and experiences. Results from descriptive statistics and qualitative thematic analysis were interpreted and integrated in a mixed-methods analysis. Among the survivors, 11 passengers (20%) sustained moderate to severe injuries, and the remaining 45 (80%) had minor or no physical injuries. Two-thirds of the survivors screened for posttraumatic stress disorder (PTSD) risk were assessed, during the period of one to three months after the bus crash, as not being at-risk, and the remaining one-third were at-risk. The thematic analysis resulted in themes covering the consequences and varying aspects that affected the survivors' recoveries. The integrated findings are in the form of four "core cases" of survivors who represent a combination of characteristics: injury severity, mental well-being, social context, and other aspects hindering and facilitating recovery. Core case Avery represents a survivor who had minor or no injuries and who demonstrated a successful mental recovery. Core case Blair represents a survivor with moderate to severe injuries who experienced a successful mental recovery. Core case Casey represents a survivor who sustained minor injuries or no injuries in the crash but who was at-risk of developing PTSD. Core case Daryl represents a survivor who was at-risk of developing PTSD and who also sustained moderate to severe injuries in the crash. 
The present study provides a multi-faceted understanding of mass-casualty incident (MCI) survivors (ie, having minor injuries does not always correspond to minimal risk for PTSD and moderate to severe injuries do not always correspond to increased risk for PTSD). Injury mitigation measures (eg, safer roadside material and anti-lacerative windows) would reduce the consequences of bus crashes. A well-educated rescue team and a compassionate and competent social environment will facilitate recovery. Doohan I , Björnstig U , Östlund U , Saveman BI . Exploring injury panorama, consequences, and recovery among bus crash survivors: a mixed-methods research study. Prehosp Disaster Med. 2017;32(2):165-174.
Core principles of evolutionary medicine
Grunspan, Daniel Z; Nesse, Randolph M; Barnes, M Elizabeth; Brownell, Sara E
2018-01-01
Background and objectives: Evolutionary medicine is a rapidly growing field that uses the principles of evolutionary biology to better understand, prevent and treat disease, and that uses studies of disease to advance basic knowledge in evolutionary biology. Over-arching principles of evolutionary medicine have been described in publications, but our study is the first to systematically elicit core principles from a diverse panel of experts in evolutionary medicine. These principles should be useful to advance recent recommendations made by The Association of American Medical Colleges and the Howard Hughes Medical Institute to make evolutionary thinking a core competency for pre-medical education. Methodology: The Delphi method was used to elicit and validate a list of core principles for evolutionary medicine. The study included four surveys administered in sequence to 56 expert panelists. The initial open-ended survey created a list of possible core principles; the three subsequent surveys winnowed the list and assessed the accuracy and importance of each principle. Results: Fourteen core principles elicited at least 80% of the panelists to agree or strongly agree that they were important core principles for evolutionary medicine. These principles overlapped with concepts discussed in other articles on key concepts in evolutionary medicine. Conclusions and implications: This set of core principles will be helpful for researchers and instructors in evolutionary medicine. We recommend that evolutionary medicine instructors use the list of core principles to construct learning goals. Evolutionary medicine is a young field, so this list of core principles will likely change as the field develops further. PMID:29493660
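The Delphi winnowing rule described above (retain a principle when at least 80% of panelists agree or strongly agree it is important) can be sketched in a few lines. The principle names and vote counts below are invented placeholders, not the study's data:

```python
# Sketch of the Delphi winnowing step: a candidate principle is kept as "core"
# when >= 80% of the 56 panelists agree or strongly agree it is important.
# Principle names and vote counts are invented for illustration.

PANEL_SIZE = 56
THRESHOLD = 0.80

votes_agree_or_strongly = {
    "Hypothetical principle with broad consensus": 54,
    "Hypothetical principle with solid consensus": 50,
    "Hypothetical principle with weak consensus": 30,
}

core_principles = [name for name, agree in votes_agree_or_strongly.items()
                   if agree / PANEL_SIZE >= THRESHOLD]
print(core_principles)
```

With these placeholder counts, the first two candidates clear the 80% bar (54/56 and 50/56) and the third is winnowed out (30/56).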
NASA Astrophysics Data System (ADS)
Stockmann, Dustin
The purpose of this mixed-methods action research study was to examine to what extent entomological research can promote students' hands-on learning in a high-poverty, urban, secondary setting. In reviewing the literature, the researcher was unable to find a specific study investigating how entomological research could promote students' hands-on learning, but did find evidence that research on learning in a secondary setting is important to student growth and that support has been established for implementing hands-on science inquiry in the classroom. The study aimed to aid educators in their instruction by combining research-based strategies with hands-on science inquiry. Surveys asked 30 students to rate their understanding of three core ideas that provided the foundation for the study: entomological research, hands-on science inquiry, and urban studies. Questionnaires were based on follow-up ideas from the surveys, and two interview sessions were used to facilitate one-on-one focus. Because the study included only 30 student participants, its findings may not be broadly generalizable. Further study investigating the links between entomological research and hands-on science learning in an urban environment is needed.
Bornstein, Stephen; Heritage, Melissa; Chudak, Amanda; Tamblyn, Robyn; McMahon, Meghan; Brown, Adalsteinn
2018-03-11
To develop an enriched set of core competencies for health services and policy research (HSPR) doctoral training that will help graduates maximize their impact across a range of academic and nonacademic work environments and roles. Data were obtained from multiple sources, including literature reviews, key informant interviews, stakeholder consultations, and Expert Working Group (EWG) meetings between January 2015 and March 2016. The study setting is Canada. The study used qualitative methods and an iterative development process with significant stakeholder engagement throughout. The literature reviews, key informant interviews, existing data on graduate career trajectories, and EWG deliberations informed the identification of career profiles for HSPR graduates and the competencies required to succeed in these roles. Stakeholder consultations were held to vet, refine, and validate the competencies. The EWG reached consensus on six sectors and eight primary roles in which HSPR doctoral graduates can bring value to employers and the health system. Additionally, 10 core competencies were identified that should be included or further emphasized in the training of HSPR doctoral students to increase their preparedness and potential for impact in a variety of roles within and outside of traditional academic workplaces. The results offer an expanded view of potential career paths for HSPR doctoral graduates and provide recommendations for an expanded set of core competencies that will better equip graduates to maximize their impact on the health system. © Health Research and Educational Trust.
NASA Astrophysics Data System (ADS)
Judge, S. A.; Wilson, T. J.
2005-12-01
The International Polar Year (IPY) provides an excellent opportunity for highlighting polar research in education. The ultimate goal of our outreach and education program is to develop a series of modules that are focused on societally-relevant topics being investigated in Antarctic earth science, while teaching basic geologic concepts that are standard elements of school curricula. For example, we envision a university-level, undergraduate, introductory earth science class with the entire semester/quarter laboratory program focused on polar earth science research during the period of the International Polar Year. To attain this goal, a series of modules will be developed, including inquiry-based exercises founded on imagery (video, digital photos, digital core scans), GIS data layers, maps, and data sets available from OSU research groups. Modules that highlight polar research are also suitable for the K-12 audience. Scalable, grade-appropriate modules that use some of the same data sets as the undergraduate modules can be outlined for elementary through high school earth science classes. An initial module is being developed that focuses on paleoclimate data. The module provides a hands-on investigation of the climate history archived in both ice cores and sedimentary rock cores in order to understand time scales, drivers, and processes of global climate change. The paleoclimate module also demonstrates the types of polar research that are ongoing at OSU, allowing students to observe what research the faculty are undertaking in their respective fields. This will link faculty research with student education in the classroom, enhancing learning outcomes. Finally, this module will provide a direct link to U.S. Antarctic Program research related to the International Polar Year, when new ice and sedimentary rock cores will be obtained and analyzed. As a result of this laboratory exercise, the students will be able to: (1) Define an ice core and a sedimentary rock core.
(Knowledge) (2) Identify climate indicators in each type of core by using digital core images. These include layers of particulate material (such as volcanic tephra) in ice cores and layers of larger grains (such as ice-rafted debris) in sedimentary rock cores. (Knowledge) (3) Describe how cores are taken in extreme environments, such as Antarctica. (Comprehension) (4) Use actual data from proxies in the ice and sedimentary records to graph changes through time in the cores. (Application) (5) Recognize variances in data sets that might illustrate periods of climate change. (Analysis) (6) Integrate data results from several proxies in order to construct a climate record for both ice cores and sedimentary rock cores. (Synthesis) (7) Interpret both the ice core and sedimentary rock core records to ascertain the effectiveness of both of these tools in archiving climate records. (Evaluation)
Research on power market technical analysis index system employing high-low matching mechanism
NASA Astrophysics Data System (ADS)
Li, Tao; Wang, Shengyu
2018-06-01
Power market technical analysis takes the bidding behavior of power market members as its research object, applies mathematical and logical methods to summarize typical market rules and price trends, and can thereby effectively assist market members in making more reasonable trading decisions. This paper proposes four indicators that form the core of the index system: the bidding price difference scale, the extreme bidding price rate, the dispersion of bidding price, and the monthly transaction satisfaction of electricity trading.
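The abstract only names the four indicators; as one hedged illustration, the dispersion of bidding price might be computed as a coefficient of variation. This formula and the sample bids are assumptions for illustration, not the paper's definition or data:

```python
import statistics

def bid_price_dispersion(bids):
    """Dispersion of bidding price, sketched as the coefficient of variation
    (population std dev / mean). The paper's exact definition is not given in
    the abstract, so this formula is an illustrative assumption."""
    mean = statistics.fmean(bids)
    return statistics.pstdev(bids) / mean

bids = [310.0, 295.0, 330.0, 305.0]  # hypothetical clearing bids, e.g. yuan/MWh
print(round(bid_price_dispersion(bids), 4))
```

A dimensionless dispersion like this lets bids from months with different price levels be compared on the same scale, which is the kind of normalization a trading index system needs.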
Mallory, Melanie A; Lucic, Danijela; Ebbert, Mark T W; Cloherty, Gavin A; Toolsie, Dan; Hillyard, David R
2017-05-01
HCV genotyping remains a critical tool for guiding initiation of therapy and selecting the most appropriate treatment regimen. Current commercial genotyping assays may have difficulty identifying 1a, 1b and genotype 6. To evaluate the concordance for identifying 1a, 1b, and genotype 6 between two methods: the PLUS assay and core/NS5B sequencing. This study included 236 plasma and serum samples previously genotyped by core/NS5B sequencing. Of these, 25 samples were also previously tested by the Abbott RealTime HCV GT II Research Use Only (RUO) assay and yielded ambiguous results. The remaining 211 samples were routine genotype 1 (n=169) and genotype 6 (n=42). Genotypes obtained from sequence data were determined using a laboratory-developed HCV sequence analysis tool and the NCBI non-redundant database. Agreement between the PLUS assay and core/NS5B sequencing for genotype 1 samples was 95.8% (162/169), with 96% (127/132) and 95% (35/37) agreement for 1a and 1b samples respectively. PLUS results agreed with core/NS5B sequencing for 83% (35/42) of unselected genotype 6 samples, with the remaining seven "not detected" by the PLUS assay. Among the 25 samples with ambiguous GT II results, 15 were concordant by PLUS and core/NS5B sequencing, nine were not detected by PLUS, and one sample had an internal control failure. The PLUS assay is an automated method that identifies 1a, 1b and genotype 6 with good agreement with gold-standard core/NS5B sequencing and can aid in the resolution of certain genotype samples with ambiguous GT II results. Copyright © 2017 Elsevier B.V. All rights reserved.
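The agreement figures above are simple proportions of concordant calls; a minimal sketch recomputing them from the counts reported in the abstract:

```python
# Per-genotype concordance between the PLUS assay and core/NS5B sequencing,
# computed as the percentage of samples where the two calls match.
# Counts (concordant/total) are taken from the abstract.

def pct_agreement(concordant: int, total: int) -> float:
    return 100.0 * concordant / total

print(f"genotype 1 overall: {pct_agreement(162, 169):.2f}%")
print(f"subtype 1a:         {pct_agreement(127, 132):.2f}%")
print(f"subtype 1b:         {pct_agreement(35, 37):.2f}%")
print(f"genotype 6:         {pct_agreement(35, 42):.2f}%")
```

The abstract's figures are these ratios to coarser precision; the genotype 6 discordances were all "not detected" calls rather than mistyped samples.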
Tracking Blade Tip Vortices for Numerical Flow Simulations of Hovering Rotorcraft
NASA Technical Reports Server (NTRS)
Kao, David L.
2016-01-01
Blade tip vortices generated by a helicopter rotor blade are a major source of rotor noise and airframe vibration. This occurs when a vortex passes closely by, and interacts with, a rotor blade. The accurate prediction of Blade Vortex Interaction (BVI) continues to be a challenge for Computational Fluid Dynamics (CFD). Though considerable research has been devoted to BVI noise reduction and experimental techniques for measuring the blade tip vortices in a wind tunnel, there are only a handful of post-processing tools available for extracting vortex core lines from CFD simulation data. In order to calculate the vortex core radius, most of these tools require the user to manually select a vortex core to perform the calculation. Furthermore, none of them provide the capability to track the growth of a vortex core, which is a measure of how quickly the vortex diffuses over time. This paper introduces an automated approach for tracking the core growth of a blade tip vortex from CFD simulations of rotorcraft in hover. The proposed approach offers an effective method for the quantification and visualization of blade tip vortices in helicopter rotor wakes. Keywords: vortex core, feature extraction, CFD, numerical flow visualization
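The paper's extraction algorithm is not reproduced in the abstract, but one ingredient of core-radius estimation can be sketched: locating the radius where the tangential (swirl) velocity peaks. The snippet below does this on a synthetic Lamb-Oseen vortex rather than CFD data; everything else about the actual tool is an assumption:

```python
import math

# Minimal sketch of core-radius estimation: find the radius of peak swirl
# velocity. The velocity profile here is an analytic Lamb-Oseen vortex, not
# CFD output, and the tool's core-line tracking is not reproduced.

GAMMA = 1.0  # circulation, arbitrary units

def lamb_oseen_vtheta(r: float, rc: float) -> float:
    """Tangential velocity of a Lamb-Oseen vortex with viscous core radius rc."""
    return GAMMA / (2 * math.pi * r) * (1 - math.exp(-(r / rc) ** 2))

def core_radius(rc_true: float, n: int = 5000, r_max: float = 5.0) -> float:
    """Estimate the core radius as the swirl-velocity peak on a fine radial grid."""
    radii = [r_max * (i + 1) / n for i in range(n)]
    return max(radii, key=lambda r: lamb_oseen_vtheta(r, rc_true))

# For a Lamb-Oseen vortex the swirl peak sits near 1.12 times the viscous core
# radius, so the estimator should return roughly 1.12 for rc_true = 1:
print(round(core_radius(1.0), 2))
```

Repeating this estimate at successive time steps gives the core-growth history, the diffusion measure the paper tracks automatically.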
Soft template synthesis of yolk/silica shell particles.
Wu, Xue-Jun; Xu, Dongsheng
2010-04-06
Yolk/shell particles possess a unique structure that is composed of hollow shells that encapsulate other particles but with an interstitial space between them. These structures are different from core/shell particles in that the core particles are freely movable in the shell. Yolk/shell particles combine the properties of each component, and can find potential applications in catalysis, lithium ion batteries, and biosensors. In this Research News article, a soft-template-assisted method for the preparation of yolk/silica shell particles is presented. The demonstrated method is simple and general, and can produce hollow silica spheres incorporated with different particles independent of their diameters, geometry, and composition. Furthermore, yolk/mesoporous silica shell particles and multishelled particles are also prepared through optimization of the experimental conditions. Finally, potential applications of these particles are discussed.
Choi, Won San; Koo, Hye Young; Kim, Dong-Yu
2008-05-06
Core-in-shell particles with controllable core size have been fabricated from core-shell particles by means of a controlled core-dissolution method. These cores in inorganic shells were employed as scaffolds for the synthesis of metal nanoparticles. After dissolution of the cores, the metal nanoparticles embedded in them remained encapsulated within the interior of the shell, without any damage or change. This article describes a very simple method for deriving core-in-shell particles with controllable core size and for encapsulating nanoparticles within the interior of the shell.
Analysis of scientific collaboration in Chinese psychiatry research.
Wu, Ying; Jin, Xing
2016-05-26
In recent decades, China has changed profoundly, becoming the country with the world's second-largest economy. The proportion of the Chinese population suffering from mental disorder has grown in parallel with the rapid economic development, as social stresses have increased. The aim of this study is to shed light on the status of collaboration in the Chinese psychiatry field, on which there is currently limited research. We sampled 16,224 publications (2003-2012) from 10 core psychiatry journals indexed in the Chinese National Knowledge Infrastructure (CNKI) and the WanFang Database. We used social network analysis (SNA) methods such as centrality analysis and core-periphery analysis to study collaboration, together with hierarchical clustering analysis. From 2003 to 2012, collaboration increased at the level of authors, institutions, and regions in the Chinese psychiatry field, although geographically these collaborations were distributed unevenly. The 100 most prolific authors and institutions and 32 regions were used to construct the collaboration map, from which we detected the core author, institution, and region. Collaborative behavior was affected by economic development. Collaboration should be encouraged in the Chinese psychiatry field, as it facilitates knowledge distribution, resource sharing, and information acquisition. Collaboration has also helped the field narrow its current research focus, providing further evidence to inform policymakers' decisions to fund research to tackle the increase in mental disorder facing modern China.
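Degree centrality, one of the centrality measures named above, can be sketched on a toy co-authorship network. The author labels and edges below are placeholders, not data from the study:

```python
# Degree centrality on a small, invented co-authorship network: each node's
# degree (number of distinct collaborators) divided by (n - 1), the maximum
# possible degree. Labels A-E stand in for authors.

edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"), ("D", "E")]

nodes = sorted({n for edge in edges for n in edge})
degree = {n: 0 for n in nodes}
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

centrality = {n: degree[n] / (len(nodes) - 1) for n in nodes}
print(centrality)
```

In a co-authorship network like the study's, the highest-centrality nodes correspond to the core authors who collaborate most widely.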
Effect of cavitation on flow structure of a tip vortex
NASA Astrophysics Data System (ADS)
Dreyer, Matthieu; Reclari, Martino; Farhat, Mohamed
2013-11-01
Tip vortices, which may develop in axial turbines and marine propellers, are often associated with the occurrence of cavitation because of the low pressure in their core. Although this issue has received a great deal of attention, it is still unclear how the phase transition affects the flow structure of such a vortex. In the present work, we investigate the change in vortex structure due to cavitation inception. The measurement of the velocity field is performed in the case of a tip vortex generated by an elliptical hydrofoil placed in the test section of the EPFL high-speed cavitation tunnel. To this end, 3D stereo PIV is used with fluorescent seeding particles. A cost-effective method is developed to produce in-house fluorescent seeding material, based on polyamide particles and Rhodamine-B dye. The amount of cavitation in the vortex core is controlled by the inlet pressure in the test section, starting with the non-cavitating case. We present an extensive analysis of the vorticity distribution, the vortex intensity, and the core size for various degrees of cavitation development. This research is supported by CCEM and swisselectric research.
Methods and codes for neutronic calculations of the MARIA research reactor.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrzejewski, K.; Kulikowska, T.; Bretscher, M. M.
2002-02-18
The core of the MARIA high flux multipurpose research reactor is highly heterogeneous. It consists of beryllium blocks arranged in a 6 x 8 matrix, tubular fuel assemblies, control rods, and irradiation channels. The reflector is also heterogeneous and consists of graphite blocks clad with aluminum. Its structure is perturbed by the experimental beam tubes. This paper presents methods and codes used to calculate the MARIA reactor neutronics characteristics and experience gained thus far at IAE and ANL. At ANL the methods of MARIA calculations were developed in connection with the RERTR program. At IAE a package of programs was developed to help its operator in the optimization of fuel utilization.
Core Noise - Increasing Importance
NASA Technical Reports Server (NTRS)
Hultgren, Lennart S.
2011-01-01
This presentation is a technical summary of and outlook for NASA-internal and NASA-sponsored external research on core (combustor and turbine) noise funded by the Fundamental Aeronautics Program Subsonic Fixed Wing (SFW) Project. Sections of the presentation cover: the SFW system-level noise metrics for the 2015, 2020, and 2025 timeframes; turbofan design trends and their aeroacoustic implications; the emerging importance of core noise and its relevance to the SFW Reduced-Perceived-Noise Technical Challenge; and the current research activities in the core-noise area, with additional details given about the development of a high-fidelity combustor-noise prediction capability as well as activities supporting the development of improved reduced-order, physics-based models for combustor-noise prediction. The need for benchmark data for validation of high-fidelity and modeling work and the value of a potential future diagnostic facility for testing of core-noise-reduction concepts are indicated. The NASA Fundamental Aeronautics Program has the principal objective of overcoming today's national challenges in air transportation. The SFW Reduced-Perceived-Noise Technical Challenge aims to develop concepts and technologies to dramatically reduce the perceived aircraft noise outside of airport boundaries. This reduction of aircraft noise is critical to enabling the anticipated large increase in future air traffic. Noise generated in the jet engine core, by sources such as the compressor, combustor, and turbine, can be a significant contribution to the overall noise signature at low-power conditions, typical of approach flight. At high engine power during takeoff, jet and fan noise have traditionally dominated over core noise. However, current design trends and expected technological advances in engine-cycle design as well as noise-reduction methods are likely to reduce non-core noise even at engine-power points higher than approach. 
In addition, future low-emission combustor designs could increase the combustion-noise component. The trend towards high-power-density cores also means that the noise generated in the low-pressure turbine will likely increase. Consequently, the combined result from these emerging changes will be to elevate the overall importance of turbomachinery core noise, which will need to be addressed in order to meet future noise goals.
A core curriculum for clinical fellowship training in pathology informatics
McClintock, David S.; Levy, Bruce P.; Lane, William J.; Lee, Roy E.; Baron, Jason M.; Klepeis, Veronica E.; Onozato, Maristela L.; Kim, JiYeon; Dighe, Anand S.; Beckwith, Bruce A.; Kuo, Frank; Black-Schaffer, Stephen; Gilbertson, John R.
2012-01-01
Background: In 2007, our healthcare system established a clinical fellowship program in Pathology Informatics. In 2010 a core didactic course was implemented to supplement the fellowship research and operational rotations. In 2011, the course was enhanced by a formal, structured core curriculum and reading list. We present and discuss our rationale and development process for the Core Curriculum and the role it plays in our Pathology Informatics Fellowship Training Program. Materials and Methods: The Core Curriculum for Pathology Informatics was developed, and is maintained, through the combined efforts of our Pathology Informatics Fellows and Faculty. The curriculum was created with a three-tiered structure, consisting of divisions, topics, and subtopics. Primary (required) and suggested readings were selected for each subtopic in the curriculum and incorporated into a curated reading list, which is reviewed and maintained on a regular basis. Results: Our Core Curriculum is composed of four major divisions, 22 topics, and 92 subtopics that cover the wide breadth of Pathology Informatics. The four major divisions include: (1) Information Fundamentals, (2) Information Systems, (3) Workflow and Process, and (4) Governance and Management. A detailed, comprehensive reading list for the curriculum is presented in the Appendix to the manuscript and contains 570 total readings (current as of March 2012). Discussion: The adoption of a formal, core curriculum in a Pathology Informatics fellowship has significant impacts on both fellowship training and the general field of Pathology Informatics itself. For a fellowship, a core curriculum defines a basic, common scope of knowledge that the fellowship expects all of its graduates will know, while at the same time enhancing and broadening the traditional fellowship experience of research and operational rotations. 
For the field of Pathology Informatics itself, a core curriculum defines to the outside world, including departments, companies, and health systems considering hiring a pathology informatician, the core knowledge set expected of a person trained in the field and, more fundamentally, it helps to define the scope of the field within Pathology and healthcare in general. PMID:23024890
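The three-tiered division-topic-subtopic organization described above maps naturally onto nested data structures. A minimal sketch follows; the four division names are taken from the abstract, while the topic and subtopic names are hypothetical placeholders (the actual curriculum contains 22 topics and 92 subtopics):

```python
# Sketch of the three-tiered curriculum structure: division -> topic -> subtopic.
# Division names come from the abstract; topics/subtopics are hypothetical.
curriculum = {
    "Information Fundamentals": {
        "Data representation": ["Character encodings", "Controlled vocabularies"],
    },
    "Information Systems": {
        "Laboratory information systems": ["LIS architecture", "Instrument interfaces"],
    },
    "Workflow and Process": {
        "Specimen tracking": ["Barcoding", "Accessioning workflow"],
    },
    "Governance and Management": {
        "Project management": ["Requirements gathering", "Change control"],
    },
}

# Summary counts at each tier (the real curriculum: 4 / 22 / 92).
n_divisions = len(curriculum)
n_topics = sum(len(topics) for topics in curriculum.values())
n_subtopics = sum(len(subs) for topics in curriculum.values() for subs in topics.values())
```

A structure like this also makes it straightforward to attach the curated reading list, with primary and suggested readings keyed by subtopic.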
Leppert, Wojciech; Majkowicz, Mikolaj
2013-05-01
Limited data exist on the validation of the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire - Core 15 - Palliative Care (EORTC QLQ-C15-PAL) in advanced cancer patients. To adapt the EORTC QLQ-C15-PAL to the Polish clinical setting and to evaluate its psychometric properties in advanced cancer patients. Two quality-of-life measurements were performed, at baseline and after 7 days. The concurrent validity of the EORTC QLQ-C15-PAL was established by Pearson correlation coefficients with the modified Edmonton Symptom Assessment System, the Karnofsky Performance Status and the Brief Pain Inventory - Short Form. Reliability was assessed using Cronbach's alpha coefficients and the Spearman correlation coefficients between the baseline and second measurements of the EORTC QLQ-C15-PAL items. A total of 160 consecutive patients in one academic palliative medicine centre were included, of whom 129 completed the study. The concurrent validity revealed significant correlations of the EORTC QLQ-C15-PAL pain scale with the Brief Pain Inventory - Short Form, of the EORTC QLQ-C15-PAL symptom items with the modified Edmonton Symptom Assessment System, and of the EORTC QLQ-C15-PAL functional scales with the Karnofsky Performance Status scores.
High Cronbach's alpha and standardised Cronbach's alpha values were found for both the functional scales (range: 0.830-0.925 and 0.830-0.932, respectively) and the symptom scales (range: 0.784-0.940 and 0.794-0.941, respectively) of the EORTC QLQ-C15-PAL. The Spearman correlation coefficients between the first and second measurements were significant (p < 0.0001) for all EORTC QLQ-C15-PAL items. The Polish version of the EORTC QLQ-C15-PAL is a valid and reliable tool recommended for quality-of-life assessment and monitoring in advanced cancer patients.
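Internal-consistency reliability of the kind reported above is conventionally computed as Cronbach's alpha: k/(k-1) times one minus the ratio of summed item variances to the variance of the total scores. A minimal sketch using only the Python standard library; the respondent scores below are toy values, not data from the study:

```python
from statistics import variance

def cronbach_alpha(rows):
    """Cronbach's alpha for a list of per-respondent item scores.

    rows: one sequence of item scores per respondent, all the same length.
    """
    k = len(rows[0])                        # number of items in the scale
    items = list(zip(*rows))                # transpose: one tuple per item
    item_var_sum = sum(variance(col) for col in items)  # sample variances
    total_var = variance([sum(r) for r in rows])        # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_var_sum / total_var)

# Toy data: 6 respondents answering a 3-item scale on 1-4 (not study data).
scores = [
    (1, 1, 2),
    (2, 2, 2),
    (3, 3, 4),
    (4, 4, 3),
    (2, 3, 3),
    (1, 2, 1),
]
alpha = cronbach_alpha(scores)  # values nearer 1.0 indicate higher internal consistency
```

Values above roughly 0.7-0.8 are usually read as acceptable reliability, which is why the 0.78-0.94 ranges reported above support the questionnaire's scales.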
Tsichlaki, Aliki; O'Brien, Kevin; Johal, Ama; Marshman, Zoe; Benson, Philip; Colonio Salazar, Fiorella B; Fleming, Padhraig S
2017-08-04
Orthodontic treatment is commonly undertaken in young people, with over 40% of children in the UK needing treatment and currently one third having treatment, at a cost to the National Health Service in England and Wales of £273 million each year. Most current research about orthodontic care does not consider what patients truly feel about, or want, from treatment, and a diverse range of outcomes is being used with little consistency between studies. This study aims to address these problems, using established methodology to develop a core outcome set for use in future clinical trials of orthodontic interventions in children and young people. This is a mixed-methods study incorporating four distinct stages. The first stage will include a scoping review of the scientific literature to identify primary and secondary outcome measures that have been used in previous orthodontic clinical trials. The second stage will involve qualitative interviews and focus groups with orthodontic patients aged 10 to 16 years to determine what outcomes are important to them. The outcomes elicited from these two stages will inform the third stage of the study in which a long-list of outcomes will be ranked in terms of importance using electronic Delphi surveys involving clinicians and patients. The final stage of the study will involve face-to-face consensus meetings with all stakeholders to discuss and agree on the outcome measures that should be included in the final core outcome set. This research will help to inform patients, parents, clinicians and commissioners about outcomes that are important to young people undergoing orthodontic treatment. Adoption of the core outcome set in future clinical trials of orthodontic treatment will make it easier for results to be compared, contrasted and combined. This should translate into improved decision-making by all stakeholders involved. 
The project has been registered on the Core Outcome Measures in Effectiveness Trials (COMET) website, January 2016.
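Delphi surveys of the kind described in stage three commonly score each candidate outcome on a 1-9 importance scale and apply a consensus rule such as "at least 70% of panelists rate it 7-9 and at most 15% rate it 1-3". The thresholds, outcome names and ratings below are illustrative assumptions, not taken from this protocol:

```python
def consensus_in(ratings, high_frac=0.70, low_frac=0.15):
    """Illustrative 'consensus in' rule for a 1-9 Delphi scale:
    enough panelists rate the outcome critical (7-9) and
    few rate it of limited importance (1-3)."""
    n = len(ratings)
    high = sum(1 for r in ratings if 7 <= r <= 9) / n
    low = sum(1 for r in ratings if 1 <= r <= 3) / n
    return high >= high_frac and low <= low_frac

# Hypothetical outcomes and panel ratings (10 panelists each).
panel = {
    "dental aesthetics": [8, 9, 7, 8, 9, 7, 6, 8, 9, 7],
    "treatment duration": [5, 6, 3, 7, 4, 2, 6, 5, 3, 4],
}
retained = [outcome for outcome, ratings in panel.items() if consensus_in(ratings)]
```

Outcomes that fail the rule in one round are typically re-rated in a later round or settled at the face-to-face consensus meeting rather than dropped outright.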
Van den Bussche, Karen; De Meyer, Dorien; Van Damme, Nele; Kottner, Jan; Beeckman, Dimitri
2017-10-01
This study protocol describes the methodology for the development of a core set of outcomes and a core set of measurements for incontinence-associated dermatitis. Incontinence is a widespread disorder with an important impact on quality of life. One of the most common complications is incontinence-associated dermatitis, resulting from chemical and physical irritation of the skin barrier, triggering inflammation and skin damage. Managing incontinence-associated dermatitis is an important challenge for nurses. Several interventions have been assessed in clinical trials, but heterogeneity in study outcomes complicates comparability and standardization. To overcome this challenge, the development of a core outcome set, a minimum set of outcomes and measurements to be assessed in clinical research, is needed. A project team, an International Steering Committee and panelists will be involved to guide the development of the core outcome set. The framework of the Harmonizing Outcome Measures for Eczema roadmap, endorsed by the Cochrane Skin Group Core Outcomes Set Initiative, is used to inform the project design. A systematic literature review, interviews to integrate the patients' perspective and a consensus study with healthcare researchers and providers using the Delphi procedure will be performed. The project was approved by the Ethics Review Committee (April 2016). This is the first project to identify a core set of outcomes and measurements for incontinence-associated dermatitis research. A core outcome set will reduce possible reporting bias, allow results comparisons and statistical pooling across trials and strengthen evidence-based practice and decision-making. This project has been registered in the Core Outcome Measures in Effectiveness Trials (COMET) database and is part of the Cochrane Skin Group Core Outcomes Set Initiative (CSG-COUSIN). © 2016 John Wiley & Sons Ltd.
[caCORE: core architecture of bioinformation on cancer research in America].
Gao, Qin; Zhang, Yan-lei; Xie, Zhi-yun; Zhang, Qi-peng; Hu, Zhang-zhi
2006-04-18
A critical factor in the advancement of biomedical research is the ease with which data can be integrated, redistributed and analyzed both within and across domains. This paper summarizes the biomedical information core infrastructure built by the U.S. National Cancer Institute Center for Bioinformatics (NCICB). The main product of this infrastructure is caCORE (cancer Common Ontologic Reference Environment), the backbone supporting data management and application development at NCICB. The paper explains the structure and function of caCORE: (1) Enterprise Vocabulary Services (EVS), which provide controlled vocabulary, dictionary and thesaurus services and produce the NCI Thesaurus and the NCI Metathesaurus; (2) the Cancer Data Standards Repository (caDSR), which provides a metadata registry for common data elements; and (3) Cancer Bioinformatics Infrastructure Objects (caBIO), which provide Java, Simple Object Access Protocol and HTTP-XML application programming interfaces. The vision for caCORE is to provide a common data management framework that will support the consistency, clarity, and comparability of biomedical research data and information. In addition to providing facilities for data management and redistribution, caCORE helps solve problems of data integration. All NCICB-developed caCORE components are distributed under open-source licenses that support unrestricted usage by both non-profit and commercial entities, and caCORE has laid the foundation for a number of scientific and clinical applications. On this basis, the paper briefly describes caCORE-based applications in several NCI projects, including CMAP (Cancer Molecular Analysis Project) and caBIG (Cancer Biomedical Informatics Grid). Finally, the paper outlines prospects for caCORE: although it was born out of the needs of the cancer research community, it is intended to serve as a general resource.
Cancer research has historically contributed to many areas beyond tumor biology. The paper also offers some suggestions for current research on biomedical informatics in China.
Many-core computing for space-based stereoscopic imaging
NASA Astrophysics Data System (ADS)
McCall, Paul; Torres, Gildo; LeGrand, Keith; Adjouadi, Malek; Liu, Chen; Darling, Jacob; Pernicka, Henry
The potential benefits of using parallel computing in real-time visual-based satellite proximity operations missions are investigated. Improvements in performance and relative navigation solutions over single-thread systems can be achieved through multi- and many-core computing. Stochastic relative orbit determination methods benefit from the higher measurement frequencies, allowing them to more accurately determine the associated statistical properties of the relative orbital elements. More accurate orbit determination can lead to reduced fuel consumption and extended mission capabilities and duration. Inherent to the process of stereoscopic image processing is the difficulty of loading, managing, parsing, and evaluating large amounts of data efficiently, which may result in delays or highly time-consuming processes for single (or few) processor systems or platforms. In this research we utilize the Single-Chip Cloud Computer (SCC), a fully programmable 48-core experimental processor created by Intel Labs as a platform for many-core software research, which provides a high-speed on-chip network for sharing information, along with advanced power-management technologies and support for message passing. The results from utilizing the SCC platform for the stereoscopic image processing application are presented in the form of Performance, Power, Energy, and Energy-Delay-Product (EDP) metrics. Also, a comparison between the SCC results and those obtained from executing the same application on a commercial PC is presented, showing the potential benefits of utilizing the SCC in particular, and many-core platforms in general, for real-time processing of visual-based satellite proximity operations missions.
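The Energy-Delay-Product metric mentioned above combines energy consumption with execution time, penalizing platforms that save power only by running slowly: energy = average power x runtime, and EDP = energy x runtime. A small sketch with illustrative numbers (not the SCC measurements from the study):

```python
from dataclasses import dataclass

@dataclass
class RunMetrics:
    """Timing/power measurements for one run of an image-processing job
    (illustrative numbers, not results from the paper)."""
    runtime_s: float   # delay: wall-clock execution time, seconds
    power_w: float     # average power draw during the run, watts

    @property
    def energy_j(self) -> float:
        # Energy (joules) = average power x time
        return self.power_w * self.runtime_s

    @property
    def edp(self) -> float:
        # Energy-Delay Product = energy x time; lower is better
        return self.energy_j * self.runtime_s

single_core = RunMetrics(runtime_s=12.0, power_w=25.0)
many_core = RunMetrics(runtime_s=2.0, power_w=90.0)
# A platform can draw more power yet still win on EDP if it is fast enough.
```

Because runtime enters EDP quadratically, a large speedup can outweigh a higher power draw, which is why EDP is a common figure of merit for comparing many-core and single-core platforms.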
NASA Technical Reports Server (NTRS)
Spinks, Debra (Compiler)
1990-01-01
This report contains the 1989 annual progress reports of the Research Fellows of the Center for Turbulence Research. It is intended as a year end report to NASA, Ames Research Center which supports this group through core funding and by making available physical and intellectual resources. The Center for Turbulence Research is devoted to the fundamental study of turbulent flows; its objectives are to stimulate advances in the physical understanding of turbulence, in turbulence modeling and simulation, and in turbulence control. The reports appearing in the following pages are grouped in the general areas of modeling, experimental research, theory, simulation and numerical methods, and compressible and reacting flows.
Lai, Wyman W.; Richmond, Marc; Li, Jennifer S.; Saul, J. Philip; Mital, Seema; Colan, Steven D.; Newburger, Jane W.; Sleeper, Lynn A.; McCrindle, Brian W.; Minich, L. LuAnn; Goldmuntz, Elizabeth; Marino, Bradley S.; Williams, Ismee A.; Pearson, Gail D.; Evans, Frank; Scott, Jane D.; Cohen, Meryl S.
2013-01-01
Background Wyman W. Lai, MD, MPH, and Victoria L. Vetter, MD, MPH. The Pediatric Heart Network (PHN), funded under the U.S. National Institutes of Health-National Heart, Lung, and Blood Institute (NIH–NHLBI), includes two Clinical Research Skills Development (CRSD) Cores, which were awarded to The Children's Hospital of Philadelphia and to the Morgan Stanley Children's Hospital of New York–Presbyterian. To provide information on how to develop a clinical research career to a larger number of potential young investigators in pediatric cardiology, the directors of these two CRSD Cores jointly organized a one-day seminar for fellows and junior faculty from all of the PHN Core sites. The participants included faculty members from the PHN and the NHLBI. The day-long seminar was held on April 29, 2009, at the NHLBI site, immediately preceding the PHN Steering Committee meeting in Bethesda, MD. Methods The goals of the seminar were: (1) to provide fellows and early investigators with basic skills in clinical research; (2) to provide a forum for discussion of important research career choices; (3) to introduce attendees to each other and to established clinical researchers in pediatric cardiology; and (4) to publish a commentary on the future of clinical research in pediatric cardiology. Results The following chapters are compilations of the talks given at the 2009 PHN Clinical Research Skills Development Seminar, published to share the information provided with a broader audience of those interested in learning how to develop a clinical research career in pediatric cardiology. The discussions of types of clinical research, research skills, career development strategies, funding, and career management are applicable to research careers in other areas of clinical medicine as well. Conclusions The aim of this compilation is to stimulate those who might be interested in the research career options available to investigators. PMID:21167335
Institutional management of core facilities during challenging financial times.
Haley, Rand
2011-12-01
The economic downturn is likely to have lasting effects on institutions of higher education, making proactive institutional leadership and planning a priority. Although core research facilities are, by design, more efficient and effective than supporting individual pieces of research equipment, they can have significant underlying financial requirements and challenges. This paper explores several possible institutional approaches to managing core facilities during challenging financial times.
Closed-loop 15N measurement of N2O and its isotopomers for real-time greenhouse gas tracing
NASA Astrophysics Data System (ADS)
Slaets, Johanna; Mayr, Leopold; Heiling, Maria; Zaman, Mohammad; Resch, Christian; Weltin, Georg; Gruber, Roman; Dercon, Gerd
2016-04-01
Quantifying sources of nitrous oxide is essential to improve understanding of the global N cycle and to develop climate-smart agriculture, as N2O has a global warming potential 300 times higher than CO2. The isotopic signature and the intramolecular distribution (site preference) of 15N are powerful tools to trace N2O, but the application of these methods is limited as conventional methods cannot provide continuous and in situ data. Here we present a method for closed-loop, real-time monitoring of the N2O flux, the isotopic signature and the intramolecular distribution of 15N by using off-axis integrated cavity output spectroscopy (ICOS, Los Gatos Research). The developed method was applied to a fertilizer inhibitor experiment, in which N2O emissions were measured on undisturbed soil cores for three weeks. The treatments consisted of enriched urea-N (100 kg urea-N/ha), the same fertilizer combined with the nitrification inhibitor nitrapyrin (375 g/100 kg urea), and control cores. Monitoring the isotopic signature makes it possible to distinguish emissions from soil and fertilizer. Characterization of site preference could additionally provide a tool to identify different microbial processes leading to N2O emissions. Furthermore, the closed-loop approach enables direct measurement on site and does not require removal of CO2 and H2O. Results showed that 75% of total N2O emissions (total=11 346 μg N2O-N/m2) in the fertilized cores originated from fertilizer, while only 55% of total emissions (total=2 450 μg N2O-N/m2) stemmed from fertilizer for the cores treated with nitrapyrin. In the controls, N2O derived from soil was only 40% of the size of the corresponding pool from the fertilized cores, pointing towards a priming effect on the microbial community from the fertilizer and demonstrating the bias that could be introduced by relying on non-treated cores to estimate soil emission rates, rather than using the isotopic signature.
The site preference increased linearly over time for the cores with fertilizer and those with nitrapyrin, but the increase was stronger for the fertilized cores: during the first 10 days of the experiment, these cores showed a more negative site preference than the cores with inhibitor, while during the last 10 days, the site preference for the fertilized cores was more positive than that of the inhibitor cores. This change indicates that the site preference of 15N can be used to distinguish the processes of nitrification and denitrification, the former having been suppressed by nitrapyrin in the cores treated with the inhibitor. Low enrichment levels (5% atomic excess in this study) sufficed in order to separate emissions from soil and fertilizer, making the proposed closed-loop approach a cost-effective and practical tool to obtain a continuous, in situ characterization of N2O sources.
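Partitioning emissions between soil and labelled fertilizer, as reported above, follows a standard two-pool isotope mixing model: the fertilizer-derived fraction equals the 15N atom% excess measured in the emitted N2O divided by that of the applied fertilizer (soil N sits at natural abundance, i.e. zero excess). The measured excess below is a hypothetical value, chosen so the sketch reproduces the 75% figure from the abstract:

```python
def fertilizer_fraction(ae_sample, ae_fert, ae_soil=0.0):
    """Two-pool isotope mixing: fraction of the N2O flux derived from the
    labelled fertilizer pool, based on 15N atom% excess (AE).
    ae_soil defaults to 0 (soil N at natural abundance)."""
    return (ae_sample - ae_soil) / (ae_fert - ae_soil)

ae_fertilizer = 5.0   # atom% excess of the applied urea-N (from the abstract)
ae_emitted = 3.75     # hypothetical measured AE of emitted N2O

f_fert = fertilizer_fraction(ae_emitted, ae_fertilizer)

# Split the reported total flux for the fertilized cores (ug N2O-N/m2).
total_flux = 11346.0
from_fertilizer = f_fert * total_flux
from_soil = (1.0 - f_fert) * total_flux
```

The same arithmetic applied per sampling interval is what allows a closed-loop analyzer to report soil- and fertilizer-derived fluxes continuously rather than only at destructive sampling points.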
Bristol, R. Sky; Euliss, Ned H.; Booth, Nathaniel L.; Burkardt, Nina; Diffendorfer, Jay E.; Gesch, Dean B.; McCallum, Brian E.; Miller, David M.; Morman, Suzette A.; Poore, Barbara S.; Signell, Richard P.; Viger, Roland J.
2013-01-01
Core Science Systems is a new mission of the U.S. Geological Survey (USGS) that resulted from the 2007 Science Strategy, "Facing Tomorrow's Challenges: U.S. Geological Survey Science in the Decade 2007-2017." This report describes the Core Science Systems vision and outlines a strategy to facilitate integrated characterization and understanding of the complex Earth system. The vision and suggested actions are bold and far-reaching, describing a conceptual model and framework to enhance the ability of the USGS to bring its core strengths to bear on pressing societal problems through data integration and scientific synthesis across the breadth of science. The context of this report is inspired by a direction set forth in the 2007 Science Strategy. Specifically, ecosystem-based approaches provide the underpinnings for essentially all science themes that define the USGS. Every point on Earth falls within a specific ecosystem where data, other information assets, and the expertise of USGS and its many partners can be employed to quantitatively understand how that ecosystem functions and how it responds to natural and anthropogenic disturbances. Every benefit society obtains from the planet, including food, water, raw materials to build infrastructure, homes and automobiles, and fuel to heat homes and cities, is derived from or affects ecosystems. The vision for Core Science Systems builds on core strengths of the USGS in characterizing and understanding complex Earth and biological systems through research, modeling, mapping, and the production of high quality data on the Nation's natural resource infrastructure. Together, these research activities provide a foundation for ecosystem-based approaches through geologic mapping, topographic mapping, and biodiversity mapping.
The vision describes a framework founded on these core mapping strengths that makes it easier for USGS scientists to discover critical information, share and publish results, and identify potential collaborations that transcend all USGS missions. The framework is designed to improve the efficiency of scientific work within USGS by establishing a means to preserve and recall data for future applications, organizing existing scientific knowledge and data to facilitate new use of older information, and establishing a future workflow that naturally integrates new data, applications, and other science products to make interdisciplinary research easier and more efficient. Given the increasing need for integrated data and interdisciplinary approaches to solve modern problems, leadership by the Core Science Systems mission will facilitate problem solving by all USGS missions in ways not formerly possible. The report lays out a strategy to achieve this vision through three goals with accompanying objectives and actions. The first goal builds on and enhances the strengths of the Core Science Systems mission in characterizing and understanding the Earth system from the geologic framework to the topographic characteristics of the land surface and biodiversity across the Nation. The second goal enhances and develops new strengths in computer and information science to make it easier for USGS scientists to discover data and models, share and publish results, and discover connections between scientific information and knowledge. The third goal brings additional focus to research and development methods to address complex issues affecting society that require integration of knowledge and new methods for synthesizing scientific information. 
Collectively, the report lays out a strategy to create a seamless connection between all USGS activities to accelerate and make USGS science more efficient by fully integrating disciplinary expertise within a new and evolving science paradigm for a changing world in the 21st century.
Science strategy for Core Science Systems in the U.S. Geological Survey, 2013-2023
Bristol, R. Sky; Euliss, Ned H.; Booth, Nathaniel L.; Burkardt, Nina; Diffendorfer, Jay E.; Gesch, Dean B.; McCallum, Brian E.; Miller, David M.; Morman, Suzette A.; Poore, Barbara S.; Signell, Richard P.; Viger, Roland J.
2012-01-01
Core Science Systems is a new mission of the U.S. Geological Survey (USGS) that grew out of the 2007 Science Strategy, “Facing Tomorrow’s Challenges: U.S. Geological Survey Science in the Decade 2007–2017.” This report describes the vision for this USGS mission and outlines a strategy for Core Science Systems to facilitate integrated characterization and understanding of the complex earth system. The vision and suggested actions are bold and far-reaching, describing a conceptual model and framework to enhance the ability of USGS to bring its core strengths to bear on pressing societal problems through data integration and scientific synthesis across the breadth of science. The context of this report is inspired by a direction set forth in the 2007 Science Strategy. Specifically, ecosystem-based approaches provide the underpinnings for essentially all science themes that define the USGS. Every point on earth falls within a specific ecosystem where data, other information assets, and the expertise of USGS and its many partners can be employed to quantitatively understand how that ecosystem functions and how it responds to natural and anthropogenic disturbances. Every benefit society obtains from the planet, including food, water, raw materials to build infrastructure, homes and automobiles, and fuel to heat homes and cities, is derived from or affects ecosystems. The vision for Core Science Systems builds on core strengths of the USGS in characterizing and understanding complex earth and biological systems through research, modeling, mapping, and the production of high quality data on the nation’s natural resource infrastructure. Together, these research activities provide a foundation for ecosystem-based approaches through geologic mapping, topographic mapping, and biodiversity mapping.
The vision describes a framework founded on these core mapping strengths that makes it easier for USGS scientists to discover critical information, share and publish results, and identify potential collaborations that transcend all USGS missions. The framework is designed to improve the efficiency of scientific work within USGS by establishing a means to preserve and recall data for future applications, organizing existing scientific knowledge and data to facilitate new use of older information, and establishing a future workflow that naturally integrates new data, applications, and other science products to make it easier and more efficient to conduct interdisciplinary research over time. Given the increasing need for integrated data and interdisciplinary approaches to solve modern problems, leadership by the Core Science Systems mission will facilitate problem solving by all USGS missions in ways not formerly possible. The report lays out a strategy to achieve this vision through three goals with accompanying objectives and actions. The first goal builds on and enhances the strengths of the Core Science Systems mission in characterizing and understanding the earth system from the geologic framework to the topographic characteristics of the land surface and biodiversity across the nation. The second goal enhances and develops new strengths in computer and information science to make it easier for USGS scientists to discover data and models, share and publish results, and discover connections between scientific information and knowledge. The third goal brings additional focus to research and development methods to address complex issues affecting society that require integration of knowledge and new methods for synthesizing scientific information.
Collectively, the report lays out a strategy to create a seamless connection between all USGS activities to accelerate and make USGS science more efficient by fully integrating disciplinary expertise within a new and evolving science paradigm for a changing world in the 21st century.
Historical ecology of the northern Adriatic Sea: Field methods and coring device
NASA Astrophysics Data System (ADS)
Haselmair, Alexandra; Gallmetzer, Ivo; Tomasovych, Adam; Stachowitsch, Michael; Zuschin, Martin
2014-05-01
For an ongoing study on the historical ecology of the northern Adriatic Sea, the objective was to retrieve a high number of sediment cores at seven sampling stations spread across the entire basin. One set of cores is intended for sediment analyses including radiometric Pb-sediment-dating, grain size, TOC, TAC and heavy metal analyses. The other set of cores delivered enough shelly remains of endo- or epibenthic hard part producers (e.g. molluscs, crustaceans, echinoderms) to enable the reconstruction of death assemblages in core layers from top to bottom. The down-core changes of such assemblages record ecological shifts in a marine environment that has endured strong human impacts over several centuries. A 1.5 m-long core could, according to the available sedimentation data for the area, cover up to 2000 or even more years of ecological history. The coring method had to meet the following requirements: a) deliver 1.5-m-long cores from different sediment settings (mud to sand, reflecting a wide range of benthic habitats in the northern Adriatic); b) enable quick and easy deployment to ensure that multiple cores can be taken at the individual sampling stations within a short time; c) be relatively affordable and allow handling by the researchers themselves, potentially using a small vessel in order to further contain the operating costs. Two types of UWITEC™ piston corers were used to meet these requirements. A model with a 90 mm diameter (for the sediment-analysis samples) and another with a 160 mm diameter, specifically designed to obtain the large amount of material needed for shell analysis, successfully delivered a total of 54 cores. The device consists of a stabilizing tripod and the interchangeable coring cylinders. It is equipped with a so-called hammer action that makes it possible, at least for the smaller cylinder, to penetrate even harder sediments.
A closing mechanism of the corer retains the sediment in the cylinder upon extraction; it works either automatically through hydraulic pressure once the final core-length is reached, or can be triggered manually anytime from the surface using a connected hose and water pump. The whole coring device weighs less than 300 kg and can readily be transported in a van. It can easily be assembled, disassembled and operated by two to three persons after a brief training. With a newly designed, very simple and effective slicing device, the cores can be sliced in an upright position directly on board after extraction. This type of corer can be highly recommended for any smaller coring operations on lakes, streams, or at sea.
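The estimate that a 1.5 m core can span roughly 2000 years follows from simple arithmetic on the sedimentation rate; the ~0.75 mm/yr rate below is an illustrative assumption consistent with that estimate (the study's actual chronologies rely on radiometric Pb dating):

```python
def years_covered(core_length_m, sedimentation_rate_mm_per_yr):
    """Back-of-the-envelope span of a sediment core, assuming a constant
    sedimentation rate (real chronologies come from radiometric dating)."""
    return (core_length_m * 1000.0) / sedimentation_rate_mm_per_yr

# A 1.5 m core at ~0.75 mm/yr spans ~2000 years, matching the abstract's
# estimate; the rate here is an illustrative assumption, not measured data.
span = years_covered(1.5, 0.75)
```

The same relation, inverted, gives the approximate age at any slicing depth, which is how down-core layers are mapped onto centuries of ecological history before a dated chronology is available.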
Loustau, Marie-Therese; Verhoog, Roelof; Precigout, Claude
1996-09-24
A method of bonding a metal connection to an electrode including a core having a fiber or foam-type structure for an electrochemical cell, in which method at least one metal strip is pressed against one edge of the core and is welded thereto under compression, wherein, at least in line with the region in which said strip is welded to the core, which is referred to as the "main core", a retaining core of a type analogous to that of the main core is disposed prior to the welding.
caGrid 1.0: an enterprise Grid infrastructure for biomedical research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oster, S.; Langella, S.; Hastings, S.
To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. Design: An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG™) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including (1) discovery, (2) integrated and large-scale data analysis, and (3) coordinated study. Measurements: The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community-provided services, and application programming interfaces for building client applications. Results: The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components, and the caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL:
A Research Agenda for the Common Core State Standards: What Information Do Policymakers Need?
ERIC Educational Resources Information Center
Rentner, Diane Stark; Ferguson, Maria
2014-01-01
This report looks specifically at the information and data needs of policymakers related to the Common Core State Standards (CCSS) and the types of research that could provide this information. The ideas in this report were informed by a series of meetings and discussions about a possible research agenda for the Common Core, sponsored by the…
NASA Astrophysics Data System (ADS)
Olson, Richard F.
2013-05-01
Rendering of point scatterer based radar scenes for millimeter wave (mmW) seeker tests in real-time hardware-in-the-loop (HWIL) scene generation requires efficient algorithms and vector-friendly computer architectures for complex signal synthesis. New processor technology from Intel implements an extended 256-bit vector SIMD instruction set (AVX, AVX2) in a multi-core CPU design providing peak execution rates of hundreds of GigaFLOPS (GFLOPS) on one chip. Real-world mmW scene generation code can approach peak SIMD execution rates only after careful algorithm and source code design. An effective software design will maintain high computing intensity, emphasizing register-to-register SIMD arithmetic operations over data movement between CPU caches or off-chip memories. Engineers at the U.S. Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) applied two basic parallel coding methods to assess new 256-bit SIMD multi-core architectures for mmW scene generation in HWIL. These include the use of POSIX threads built on vector library functions and more portable, high-level parallel code based on compiler technology (e.g. OpenMP pragmas and SIMD autovectorization). Since CPU technology is rapidly advancing toward high processor core counts and TeraFLOPS peak SIMD execution rates, it is imperative to identify coding methods that produce efficient and maintainable parallel code. This paper describes the algorithms used in point scatterer target model rendering, the parallelization of those algorithms, and the execution performance achieved on an AVX multi-core machine using the two basic parallel coding methods. The paper concludes with estimates of scale-up performance on upcoming multi-core technology.
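The register-arithmetic-over-data-movement principle described above is language-agnostic. As a hedged illustration (this is not AMRDEC's code; the wavelength and scatterer data are invented), a NumPy sketch of coherent point-scatterer summation shows a scalar loop and a vectorized form computing the same complex signal, with the vectorized expression being the style that maps naturally onto SIMD hardware:

```python
import numpy as np

# Hypothetical parameters (not from the paper): a mmW-band wavelength
# and a randomly generated set of point scatterers.
WAVELENGTH = 0.0039  # metres, roughly a 77 GHz carrier

def render_looped(ranges, amps):
    """Scalar loop: coherent sum of point-scatterer returns."""
    total = 0j
    for r, a in zip(ranges, amps):
        total += a * np.exp(-1j * 4 * np.pi * r / WAVELENGTH)
    return total

def render_vectorized(ranges, amps):
    """One fused array expression: analogous in spirit to keeping SIMD
    registers busy with arithmetic rather than moving data."""
    return np.sum(amps * np.exp(-1j * 4 * np.pi * ranges / WAVELENGTH))

rng = np.random.default_rng(0)
ranges = rng.uniform(100.0, 200.0, size=10_000)  # metres
amps = rng.uniform(0.1, 1.0, size=10_000)
assert np.isclose(render_looped(ranges, amps), render_vectorized(ranges, amps))
```

The two forms agree to floating-point tolerance; only the vectorized one gives the array library (and, in compiled languages, the autovectorizer) a chance to use wide SIMD lanes.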
NASA Astrophysics Data System (ADS)
Ilham, Muhammad; Su'ud, Zaki
2017-01-01
Growing energy demand due to the world's increasing population encourages the development of nuclear power plant technology and science with regard to safety and security. This research presents a design study of a modular helium gas-cooled fast reactor (GCFR): a small, long-life reactor that can be operated for over 20 years. A neutronic design study of a GCFR with mixed oxide (UO2-PuO2) fuel was conducted over a power range of 100-200 MWth and a fuel fraction variation of 50-60%, with cylindrical pin cell and cylindrical reactor core geometries. Calculations used the SRAC-CITATION code. The results obtained are the effective multiplication factor and the core power density (with geometry optimization), used to determine the optimum core design; the optimum core power is 200 MWth with a 55% fuel fraction and percentages of 9-13%.
Case Study: The Chemistry of Cocaine
ERIC Educational Resources Information Center
Dewprashad, Brahmadeo
2011-01-01
This column provides original articles on innovations in case study teaching, assessment of the method, as well as case studies with teaching notes. This month's case study focuses on the chemistry of cocaine to teach a number of core concepts in organic chemistry. It also requires that students read and analyze an original research paper on…
ERIC Educational Resources Information Center
Bedford, Denise A. D.
2015-01-01
The knowledge life cycle is applied to two core capabilities of library and information science (LIS) education--teaching, and research and development. The knowledge claim validation, invalidation and integration steps of the knowledge life cycle are translated to learning, unlearning and relearning processes. Mixed methods are used to determine…
The Post-Human I: Encountering "Data" in New Materialism
ERIC Educational Resources Information Center
Somerville, Margaret
2016-01-01
The editors of a recent special edition of "Qualitative Studies in Education" map a new field of post-qualitative research and raise fundamental questions about core concepts such as "method" and "data." They ask whether qualitative inquiry as we know it is any longer possible if we understand language, the human and…
ERIC Educational Resources Information Center
Letwinsky, Karim Medico; Cavender, Monica
2018-01-01
Many preservice teacher (PST) programs throughout the world are preparing students to implement the Core Standards, which require deeper conceptual understandings of mathematics and an informed approach for teaching. In this qualitative multi-case study, researchers explored the teaching methods for two university instructors and changes in PSTs…
The US EPA is pursuing a variety of research efforts to assess the susceptibility of the aged to neurotoxicants. The BN strain is a popular animal model for aging studies but there is a need for improved methods of monitoring their physiological responses to neurotoxicants over t...
Traversing Theory and Transgressing Academic Discourses: Arts-Based Research in Teacher Education
ERIC Educational Resources Information Center
Dixon, Mary; Senior, Kim
2009-01-01
Pre-service teacher education is marked by linear and sequential programming which offers a plethora of strategies and methods (Cochran-Smith & Zeichner, 2005; Darling Hammond & Bransford, 2005; Grant & Zeichner, 1997). This paper emerges from a three year study within a core education subject in pre-service teacher education in…
The Effect of Appreciative Inquiry on Student Engagement and Attendance in the Community College
ERIC Educational Resources Information Center
Robbins, Frances Virginia Turner
2012-01-01
This mixed-methods research study investigated the effects of Appreciative Inquiry on student-course engagement and attendance in core academic classes at a community college in central Mississippi. In an increasingly competitive global economy, most individuals need education or technical skills beyond high school to secure employment offering…
Revisiting a Meta-Analysis of Helpful Aspects of Therapy in a Community Counselling Service
ERIC Educational Resources Information Center
Quick, Emma L; Dowd, Claire; Spong, Sheila
2018-01-01
This small scale mixed methods study examines helpful events in a community counselling setting, categorising impacts of events according to Timulak's [(2007). Identifying core categories of client-identified impact of helpful events in psychotherapy: A qualitative meta-analysis. "Psychotherapy Research," 17, 305-314] meta-synthesis of…
Perspectives and methods of scaling
Jianguo Wu; Harbin Li
2006-01-01
Transferring information between or across scales or organizational levels is inevitable in both basic research and its applications, a process generally known as "scaling" (Wu and Li, Chapter 1). Scaling is the essence of prediction and understanding both of which require cross-scale translation of information, and is at the core of ecological theory and...
Canfield, Caitlin; Angove, Rebekah; Boselovic, Joseph; Brown, Lisanne F.; Gauthe, Sharon; Bui, Tap; Gauthe, David; Bogen, Donald; Denham, Stacey; Nguyen, Tuan; Lichtveld, Maureen Y.
2017-01-01
Background The Transdisciplinary Research Consortium for Gulf Resilience on Women’s Health (GROWH) addresses reproductive health disparities in the Gulf Coast by linking communities and scientists through community-engaged research. Funded by the National Institute of Environmental Health Sciences, GROWH’s Community Outreach and Dissemination Core (CODC) seeks to utilize community-based participatory research (CBPR) and other community-centered outreach strategies to strengthen resilience in vulnerable Gulf Coast populations. The CODC is an academic-community partnership comprising Tulane University, Mary Queen of Vietnam Community Development Corporation, Bayou Interfaith Shared Community Organizing, and the Louisiana Public Health Institute (LPHI). Methods Alongside its CODC partners, LPHI collaboratively developed, piloted, and evaluated an innovative CBPR curriculum. In addition to helping with curriculum design, the CODC’s community and academic partners participated in the pilot. The curriculum was designed to impart applied, practical knowledge to community-based organizations and academic researchers on successfully formulating, executing, and sustaining CBPR projects and partnerships within the context of environmental health research. Results The curriculum resulted in increased knowledge about CBPR methods among both community and academic partners, as well as improved relationships within the GROWH CODC partnership. Conclusion The efforts of the GROWH partnership and curriculum were successful. This curriculum may serve as an anchor for future GROWH efforts, including competency development, translation of the curriculum into education and training products, community development of a CBPR curriculum for academic partners, community practice of CBPR, and future environmental health work. PMID:28890934
[JSPS-NRCT Core university program on natural medicine in pharmaceutical sciences].
Saiki, Ikuo; Yamazaki, Mikako; Matsumoto, Kinzo
2009-04-01
The Core University Program provides a framework for international cooperative research in specifically designated fields and topics, centering around a core university in Japan and its counterpart university in another country. In this program, individual scientists in the affiliated countries carry out cooperative research projects with sharply focused topics and explicitly delineated goals under the leadership of the core universities. The Core University Program introduced here has been renewed since 2001 with the support of both the Japan Society for the Promotion of Science (JSPS) and the National Research Council of Thailand (NRCT). Our program aims to conduct cooperative research focusing particularly on natural medicine in the field of pharmaceutical sciences. The Institute of Natural Medicine at the University of Toyama (Japan), the Faculty of Pharmaceutical Sciences at Chulalongkorn University (Thailand), and the Chulabhorn Research Institute (Thailand) have been taking part in this JSPS-NRCT Core University Program as core universities. The Program is also supported by 20 member institutions in both countries. The program runs five research subjects under the keyword of natural medicine, related to: i) age-related diseases; ii) allergy and cancer; iii) hepatitis and infectious diseases; iv) the structure, synthesis, and bioactivity of natural medicines; and v) the molecular biology of Thai medicinal plant components and the assembly of a database of Thai medicinal plants. The program also encourages university members to strengthen related research activities and to share advanced academic and scientific knowledge on natural medicines.
Taylor, Johanna; Böhnke, Jan R; Wright, Judy; Kellar, Ian; Alderson, Sarah L; Hughes, Tom; Holt, Richard I G; Siddiqi, Najma
2017-02-14
People with diabetes and comorbid severe mental illness (SMI) form a growing population at risk of increased mortality and morbidity compared to those with diabetes or SMI alone. There is increasing interest in interventions that target diabetes in SMI in order to help to improve physical health and reduce the associated health inequalities. However, there is a lack of consensus about which outcomes are important for this comorbid population, with trials differing in their focus on physical and mental health. A core outcome set, which includes outcomes across both conditions that are relevant to patients and other key stakeholders, is needed. This study protocol describes methods to develop a core outcome set for use in effectiveness trials of self-management interventions for adults with comorbid type-2 diabetes and SMI. We will use a modified Delphi method to identify, rank, and agree core outcomes. This will comprise a two-round online survey and multistakeholder workshops involving patients and carers, health and social care professionals, health care commissioners, and other experts (e.g. academic researchers and third sector organisations). We will also select appropriate measurement tools for each outcome in the proposed core set and identify gaps in measures, where these exist. The proposed core outcome set will provide clear guidance about what outcomes should be measured, as a minimum, in trials of interventions for people with coexisting type-2 diabetes and SMI, and improve future synthesis of trial evidence in this area. We will also explore the challenges of using online Delphi methods for this hard-to-reach population, and examine differences in opinion about which outcomes matter to diverse stakeholder groups. COMET registration: http://www.comet-initiative.org/studies/details/911 . Registered on 1 July 2016.
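To make the Delphi ranking step concrete, the sketch below aggregates panel ratings with one commonly used consensus rule (at least 70% of respondents rating an outcome 7-9 on a 1-9 scale, and no more than 15% rating it 1-3). The thresholds, outcome names, and scores are illustrative assumptions, not taken from this protocol:

```python
def consensus_in(scores, hi_frac=0.70, lo_frac=0.15):
    """Illustrative consensus rule (thresholds are assumptions, not from
    the protocol): 'consensus in' if >= hi_frac of panellists rate the
    outcome 7-9 on a 1-9 scale and <= lo_frac rate it 1-3."""
    n = len(scores)
    hi = sum(1 for s in scores if s >= 7) / n
    lo = sum(1 for s in scores if s <= 3) / n
    return hi >= hi_frac and lo <= lo_frac

# Hypothetical round-two ratings from a ten-person mixed stakeholder panel.
ratings = {
    "HbA1c": [9, 8, 7, 9, 8, 7, 9, 6, 8, 7],
    "quality_of_life": [9, 9, 8, 7, 2, 3, 9, 8, 7, 2],
    "medication_count": [4, 5, 3, 6, 2, 5, 4, 3, 5, 4],
}
core_set = [name for name, s in ratings.items() if consensus_in(s)]
print(core_set)  # -> ['HbA1c']
```

In practice, outcomes that miss consensus in round one are fed back with group summaries for re-rating, and disputed items go to the stakeholder workshops.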
The HTA core model: a novel method for producing and reporting health technology assessments.
Lampe, Kristian; Mäkelä, Marjukka; Garrido, Marcial Velasco; Anttila, Heidi; Autti-Rämö, Ilona; Hicks, Nicholas J; Hofmann, Björn; Koivisto, Juha; Kunz, Regina; Kärki, Pia; Malmivaara, Antti; Meiesaar, Kersti; Reiman-Möttönen, Päivi; Norderhaug, Inger; Pasternack, Iris; Ruano-Ravina, Alberto; Räsänen, Pirjo; Saalasti-Koskinen, Ulla; Saarni, Samuli I; Walin, Laura; Kristensen, Finn Børlum
2009-12-01
The aim of this study was to develop and test a generic framework to enable international collaboration for producing and sharing results of health technology assessments (HTAs). Ten international teams constructed the HTA Core Model, dividing information contained in a comprehensive HTA into standardized pieces, the assessment elements. Each element contains a generic issue that is translated into practical research questions while performing an assessment. Elements were described in detail in element cards. Two pilot assessments, designated as Core HTAs were also produced. The Model and Core HTAs were both validated. Guidance on the use of the HTA Core Model was compiled into a Handbook. The HTA Core Model considers health technologies through nine domains. Two applications of the Model were developed, one for medical and surgical interventions and another for diagnostic technologies. Two Core HTAs were produced in parallel with developing the model, providing the first real-life testing of the Model and input for further development. The results of formal validation and public feedback were primarily positive. Development needs were also identified and considered. An online Handbook is available. The HTA Core Model is a novel approach to HTA. It enables effective international production and sharing of HTA results in a structured format. The face validity of the Model was confirmed during the project, but further testing and refining are needed to ensure optimal usefulness and user-friendliness. Core HTAs are intended to serve as a basis for local HTA reports. Core HTAs do not contain recommendations on technology use.
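The notion of an assessment element, a generic issue translated into a practical research question during an assessment, can be sketched as a simple data structure. The field names and example below are illustrative only and do not reproduce the Model's actual element-card format:

```python
from dataclasses import dataclass

@dataclass
class AssessmentElement:
    """Illustrative sketch of an HTA Core Model 'element card';
    the real cards carry more fields than shown here."""
    domain: str          # one of the Model's nine domains, e.g. "Safety"
    generic_issue: str   # technology-independent question template
    is_core: bool = True # whether the element belongs to the core set

    def to_research_question(self, technology: str) -> str:
        # Translate the generic issue into a practical research question
        # for the technology under assessment.
        return self.generic_issue.format(tech=technology)

elem = AssessmentElement(
    domain="Safety",
    generic_issue="What harms are associated with {tech}?",
)
print(elem.to_research_question("drug-eluting stents"))
# -> What harms are associated with drug-eluting stents?
```

Structuring elements this way is what lets Core HTAs be shared in standardized pieces and reassembled into local HTA reports.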
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soner Yorgun, M.; Rood, Richard B.
2016-11-11
An object-based evaluation method using a pattern recognition algorithm (i.e., classification trees) is applied to the simulated orographic precipitation for idealized experimental setups using the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM) with the finite volume (FV) and the Eulerian spectral transform dynamical cores at varying resolutions. Daily simulations were analyzed, and three different types of precipitation features were identified by the classification tree algorithm. The statistical characteristics of these features (i.e., maximum value, mean value, and variance) were calculated to quantify the differences between the dynamical cores and changing resolutions. Even with the simple and smooth topography in the idealized setups, complexity in the precipitation fields simulated by the models develops quickly. The classification tree algorithm using objective thresholding successfully detected different types of precipitation features even as the complexity of the precipitation field increased. The results show that the complexity and the bias introduced in small-scale phenomena due to the spectral transform method of the CAM Eulerian spectral dynamical core are prominent, and are an important reason for its dissimilarity from the FV dynamical core. The resolvable scales, in both horizontal and vertical dimensions, have a significant effect on the simulation of precipitation. The results of this study also suggest that an efficient and informative study of the biases produced by GCMs should involve daily (or even hourly) output (rather than monthly means) analyzed over local scales.
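A minimal sketch of the classification-tree idea, assuming synthetic per-feature statistics (maximum, mean, variance) in place of the model output used in the study; the class separations and labels are invented for illustration:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-ins for precipitation-feature statistics
# (columns: maximum, mean, variance per detected feature);
# labels 0-2 mimic three qualitatively different feature types.
rng = np.random.default_rng(42)
X = np.vstack([
    rng.normal([2.0, 0.5, 0.1], 0.2, size=(50, 3)),   # weak, smooth features
    rng.normal([10.0, 3.0, 2.0], 0.5, size=(50, 3)),  # moderate features
    rng.normal([40.0, 8.0, 9.0], 1.0, size=(50, 3)),  # intense, noisy features
])
y = np.repeat([0, 1, 2], 50)

# A shallow tree with objective thresholds on the feature statistics.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(tree.score(X, y))  # well-separated classes give a near-perfect fit
```

The learned split thresholds play the role of the "objective thresholding" in the abstract: once fitted, the same thresholds can be applied uniformly across dynamical cores and resolutions.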
Compression After Impact on Honeycomb Core Sandwich Panels with Thin Facesheets, Part 2: Analysis
NASA Technical Reports Server (NTRS)
Mcquigg, Thomas D.; Kapania, Rakesh K.; Scotti, Stephen J.; Walker, Sandra P.
2012-01-01
A two-part research study has been completed on the topic of compression after impact (CAI) of thin-facesheet honeycomb core sandwich panels. The research has focused on both experiments and analysis in an effort to establish and validate a new understanding of the damage tolerance of these materials. Part 2, the subject of the current paper, is focused on the analysis, which corresponds to the CAI testing described in Part 1. Of interest are sandwich panels, with aerospace applications, which consist of very thin, woven S2-fiberglass (with MTM45-1 epoxy) facesheets adhered to a Nomex honeycomb core. Two sets of materials, identical with the exception of the density of the honeycomb core, were tested in Part 1. The results highlighted the need for analysis methods that take into account multiple failure modes. A finite element model (FEM) is developed here, in Part 2. A commercial implementation of the Multicontinuum Failure Theory (MCT) for progressive failure analysis (PFA) in composite laminates, Helius:MCT, is included in this model. The inclusion of PFA in the present model provided a new, unique ability to account for multiple failure modes. In addition, significant impact damage detail is included in the model. A sensitivity study, used to assess the effect of each damage parameter on overall analysis results, is included in an appendix. Analysis results are compared to the experimental results for each of the 32 CAI sandwich panel specimens tested to failure. The failure of each specimen is predicted using the high-fidelity, physics-based analysis model developed here, and the results highlight key improvements in the understanding of honeycomb core sandwich panel CAI failure. Finally, a parametric study highlights the strength benefits relative to the mass penalty for various core densities.
Michelson, Kelly N; Frader, Joel; Sorce, Lauren; Clayman, Marla L; Persell, Stephen D; Fragen, Patricia; Ciolino, Jody D; Campbell, Laura C; Arenson, Melanie; Aniciete, Danica Y; Brown, Melanie L; Ali, Farah N; White, Douglas
2016-12-01
Stakeholder-developed interventions are needed to support pediatric intensive care unit (PICU) communication and decision-making. Few publications delineate methods and outcomes of stakeholder engagement in research. We describe the process and impact of stakeholder engagement on developing a PICU communication and decision-making support intervention. We also describe the resultant intervention. Stakeholders included parents of PICU patients, healthcare team members (HTMs), and research experts. Through a year-long iterative process, we involved 96 stakeholders in 25 meetings and 26 focus groups or interviews. Stakeholders adapted an adult navigator model by identifying core intervention elements and then determining how to operationalize those core elements in pediatrics. The stakeholder input led to PICU-specific refinements, such as supporting transitions after PICU discharge and including ancillary tools. The resultant intervention includes navigator involvement with parents and HTMs and navigator-guided use of ancillary tools. Subsequent research will test the feasibility and efficacy of our intervention.
A New Curriculum for Physics Graduate Students
NASA Astrophysics Data System (ADS)
Griesshammer, Harald W.
2012-03-01
Effective Fall 2008, GW Physics implemented a new graduate curriculum, addressing nation-wide problems: (1) the wide gap between 50-year-old curricula and the proficiencies expected to start research; (2) high attrition rates and long times to degree; (3) limited resources in small departments to cover all topics deemed essential. The new curriculum: (1) extends each course to 4 hours weekly for better in-depth coverage and cautious additions; (2) decreases the number of core courses per semester to 2, with less "parallel-processing" of only loosely correlated lectures; (3) increases synergies by stricter logical ordering and synchronisation of courses; (4) frees faculty to regularly offer advanced courses; (5) integrates examples tied to ongoing research in our department; (6) integrates computational methods into core lectures; (7) encourages focusing on concepts and "meta-cognitive skills" in studio-like settings. The new curriculum and qualifying exam, its rationale, and assessment criteria will be discussed. This concept is tailored to the needs of small departments with only a few research fields and a close student-teacher relationship.
Hodges, Mary K.V.; Davis, Linda C.; Bartholomay, Roy C.
2018-01-30
In 1990, the U.S. Geological Survey, in cooperation with the U.S. Department of Energy Idaho Operations Office, established the Lithologic Core Storage Library at the Idaho National Laboratory (INL). The facility was established to consolidate, catalog, and permanently store nonradioactive drill cores and cuttings from subsurface investigations conducted at the INL, and to provide a location for researchers to examine, sample, and test these materials.The facility is open by appointment to researchers for examination, sampling, and testing of cores and cuttings. This report describes the facility and cores and cuttings stored at the facility. Descriptions of cores and cuttings include the corehole names, corehole locations, and depth intervals available.Most cores and cuttings stored at the facility were drilled at or near the INL, on the eastern Snake River Plain; however, two cores drilled on the western Snake River Plain are stored for comparative studies. Basalt, rhyolite, sedimentary interbeds, and surficial sediments compose most cores and cuttings, most of which are continuous from land surface to their total depth. The deepest continuously drilled core stored at the facility was drilled to 5,000 feet below land surface. This report describes procedures and researchers' responsibilities for access to the facility and for examination, sampling, and return of materials.
Munoz-Plaza, Corrine E; Parry, Carla; Hahn, Erin E; Tang, Tania; Nguyen, Huong Q; Gould, Michael K; Kanter, Michael H; Sharp, Adam L
2016-08-15
Despite reports advocating for integration of research into healthcare delivery, scant literature exists describing how this can be accomplished. Examples highlighting application of qualitative research methods embedded into a healthcare system are particularly needed. This article describes the process and value of embedding qualitative research as the second phase of an explanatory, sequential, mixed methods study to improve antibiotic stewardship for acute sinusitis. Purposive sampling of providers for in-depth interviews improved understanding of unwarranted antibiotic prescribing and elicited stakeholder recommendations for improvement. Qualitative data collection, transcription and constant comparative analyses occurred iteratively. Emerging themes and sub-themes identified primary drivers of unwarranted antibiotic prescribing patterns and recommendations for improving practice. These findings informed the design of a health system intervention to improve antibiotic stewardship for acute sinusitis. Core components of the intervention are also described. Qualitative research can be effectively applied in learning healthcare systems to elucidate quantitative results and inform improvement efforts.
Disciplinary perspectives on later-life migration in the core journals of social gerontology.
Walters, William H; Wilder, Esther I
2003-10-01
The authors examine the bibliographic structure of recent research on later-life migration, highlighting the contributions of particular journals and disciplines. The authors identify the primary journals publishing research in this area, including a set of four core journals within the field of social gerontology. They evaluate the disciplinary affiliations of authors publishing in the core journals and the extent to which those journals cite relevant research published elsewhere. Geographical and economic perspectives on later-life migration are underrepresented within the core journals of social gerontology. In particular, major articles published outside the core journals are seldom cited within those journals. Although the core journals of social gerontology account for over a third of the recent literature on later-life migration, they present only a partial (chiefly sociological) perspective on the subject.
Defining Tobacco Regulatory Science Competencies.
Wipfli, Heather L; Berman, Micah; Hanson, Kacey; Kelder, Steven; Solis, Amy; Villanti, Andrea C; Ribeiro, Carla M P; Meissner, Helen I; Anderson, Roger
2017-02-01
In 2013, the National Institutes of Health and the Food and Drug Administration funded a network of 14 Tobacco Centers of Regulatory Science (TCORS) with a mission that included research and training. A cross-TCORS Panel was established to define tobacco regulatory science (TRS) competencies to help harmonize and guide their emerging educational programs. The purpose of this paper is to describe the Panel's work to develop core TRS domains and competencies. The Panel developed the list of domains and competencies using a semistructured Delphi method divided into four phases occurring between November 2013 and August 2015. The final proposed list included a total of 51 competencies across six core domains and 28 competencies across five specialized domains. There is a need for continued discussion to establish the utility of the proposed set of competencies for emerging TRS curricula and to identify the best strategies for incorporating these competencies into TRS training programs. Given the field's broad multidisciplinary nature, further experience is needed to refine the core domains that should be covered in TRS training programs versus knowledge obtained in more specialized programs. Regulatory science to inform the regulation of tobacco products is an emerging field. The paper provides an initial list of core and specialized domains and competencies to be used in developing curricula for new and emerging training programs aimed at preparing a new cohort of scientists to conduct critical TRS research. © The Author 2016. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Huang, W. J.; Hsu, C. H.; Chang, L. C.; Chiang, C. J.; Wang, Y. S.; Lu, W. C.
2017-12-01
Hydrogeological framework is the most important basis for groundwater analysis and simulation. Conventionally, core drilling is the most commonly adopted technique for acquiring core data, with other research methods used to help interpret the results manually. Now, with the established groundwater station network, a large amount of groundwater level information is available. Groundwater level is an integrated expression of the hydrogeological framework and the external pumping and recharge system. Therefore, how to identify the hydrogeological framework from a large number of groundwater level records is an important subject. In this study, the frequency analysis method and the rainfall recharge mechanism were used to identify aquifers from the frequency and amplitude with which the groundwater level responds to the earth tide. Because the earth tide originates from the gravitational pull along the paths of the sun and moon, it induces soil stress and strain changes, which in turn affect the groundwater level. The scale of the groundwater level change varies with the aquifer pressure system, such as confined or unconfined aquifers. This method has been applied to the identification of aquifers in the Cho-Shui River Alluvial Fan. The identification results are quite consistent with the core drilling records. This shows that the identification method developed in this study can contribute considerably to the identification of hydrogeological frameworks.
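The frequency-analysis step can be sketched with a synthetic record. The sketch below assumes a semidiurnal component near the M2 earth-tide constituent (about 1.93 cycles/day) superposed on a linear trend; the amplitudes, noise level, and record length are illustrative, not values from the study:

```python
import numpy as np

# Synthetic hourly groundwater-level record (metres): a slow linear trend
# (pumping/recharge) plus a semidiurnal component near the M2 earth-tide
# frequency. All amplitudes here are illustrative assumptions.
hours = np.arange(60 * 24)                  # 60 days of hourly samples
f_m2 = 1.9323 / 24.0                        # M2 frequency in cycles per hour
rng = np.random.default_rng(1)
level = (0.002 * hours
         + 0.05 * np.sin(2 * np.pi * f_m2 * hours)
         + 0.01 * rng.normal(size=hours.size))

# Remove the linear trend, then locate the dominant spectral peak.
detrended = level - np.polyval(np.polyfit(hours, level, 1), hours)
spectrum = np.abs(np.fft.rfft(detrended))
freqs = np.fft.rfftfreq(hours.size, d=1.0)  # cycles per hour
peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the zero-frequency bin
print(round(peak * 24, 2))                  # dominant frequency in cycles/day
```

Recovering a peak near 1.93 cycles/day flags a tidal response in the well; in the study, the relative amplitude of that response is what distinguishes confined from unconfined aquifer behaviour.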
Takeda, Itaru; Umemura, Myco; Koike, Hideaki; Asai, Kiyoshi; Machida, Masayuki
2014-08-01
Despite their biological importance, a significant number of genes for secondary metabolite biosynthesis (SMB) remain undetected due largely to the fact that they are highly diverse and are not expressed under a variety of cultivation conditions. Several software tools including SMURF and antiSMASH have been developed to predict fungal SMB gene clusters by finding core genes encoding polyketide synthase, nonribosomal peptide synthetase and dimethylallyltryptophan synthase as well as several others typically present in the cluster. In this work, we have devised a novel comparative genomics method to identify SMB gene clusters that is independent of motif information of the known SMB genes. The method detects SMB gene clusters by searching for a similar order of genes and their presence in nonsyntenic blocks. With this method, we were able to identify many known SMB gene clusters with the core genes in the genomic sequences of 10 filamentous fungi. Furthermore, we have also detected SMB gene clusters without core genes, including the kojic acid biosynthesis gene cluster of Aspergillus oryzae. By varying the detection parameters of the method, a significant difference in the sequence characteristics was detected between the genes residing inside the clusters and those outside the clusters. © The Author 2014. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
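A toy version of the gene-order comparison idea above (not the authors' implementation; the gene-family labels are invented) asks whether a window of adjacent gene families in one genome reappears, possibly shuffled, as adjacent genes in another genome:

```python
def shared_clusters(genome_a, genome_b, window=4):
    """Windows of `window` adjacent gene families in genome_a whose members
    also occur as `window` adjacent genes somewhere in genome_b."""
    b_windows = {frozenset(genome_b[i:i + window])
                 for i in range(len(genome_b) - window + 1)}
    hits = []
    for i in range(len(genome_a) - window + 1):
        fams = genome_a[i:i + window]
        # require distinct families so set comparison is meaningful
        if len(set(fams)) == window and frozenset(fams) in b_windows:
            hits.append(tuple(fams))
    return hits

# Hypothetical ortholog-family labels; pks1..p450a is the conserved cluster,
# appearing in reversed order in the second genome.
genome_a = ["a1", "pks1", "tf1", "mfs1", "p450a", "a2", "a3"]
genome_b = ["b1", "b2", "p450a", "mfs1", "tf1", "pks1", "b3"]

clusters = shared_clusters(genome_a, genome_b)  # [('pks1', 'tf1', 'mfs1', 'p450a')]
```

The real method additionally requires the shared blocks to be nonsyntenic (i.e., not part of genome-wide conserved order), which this sketch omits.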
NASA Astrophysics Data System (ADS)
Liu, Yu; Shi, Zhanjie; Wang, Bangbing; Yu, Tianxiang
2018-01-01
As a method with high resolution, GPR has been extensively used in archaeological surveys. However, a conventional GPR profile can only provide limited geometric information, such as the shape or location of an interface, but cannot give the distribution of physical properties that could help identify historical remains more directly. A common way for GPR to map parameter distributions is common-midpoint velocity analysis, but it provides limited resolution. Another research hotspot, full-waveform inversion, is unstable and strongly dependent on the initial model. Coring provides direct information at the drilling site, but accurate results are limited to a few boreholes. In this paper, we propose a new scheme to enhance imaging and characterization of archaeological targets by fusing GPR and coring data. The scheme mainly involves impedance inversion of conventional common-offset GPR data, which uses well logs to compensate the GPR data and finally obtains a high-resolution estimate of permittivity. The core analysis results also contribute to interpretation of the inversion results. To test this method, we conducted a case study at the Mudu city site in Suzhou, China. The results provide clear images of the subsurface of the ancient city's moat and wall and improve the characterization of archaeological targets. It is shown that this method is effective and feasible for archaeological exploration.
Evidence-based librarianship: an overview
Eldredge, Jonathan D.
2000-01-01
Objective: To demonstrate how the core characteristics of both evidence-based medicine (EBM) and evidence-based health care (EBHC) can be adapted to health sciences librarianship. Method: Narrative review essay involving development of a conceptual framework. The author describes the central features of EBM and EBHC. Following each description of a central feature, the author then suggests ways that this feature applies to health sciences librarianship. Results: First, the decision-making processes of EBM and EBHC are compatible with health sciences librarianship. Second, the EBM and EBHC values of favoring rigorously produced scientific evidence in decision making are congruent with the core values of librarianship. Third, the hierarchical levels of evidence can be applied to librarianship with some modifications. Library researchers currently favor descriptive-survey and case-study methods over systematic reviews, randomized controlled trials, or other higher levels of evidence. The library literature nevertheless contains diverse examples of randomized controlled trials, controlled-comparison studies, and cohort studies conducted by health sciences librarians. Conclusions: Health sciences librarians are confronted with making many practical decisions. Evidence-based librarianship offers a decision-making framework, which integrates the best available research evidence. By employing this framework and the higher levels of research evidence it promotes, health sciences librarians can lay the foundation for more collaborative and scientific endeavors. PMID:11055296
Forbes, Miriam K; Kotov, Roman; Ruggero, Camilo J; Watson, David; Zimmerman, Mark; Krueger, Robert F
2017-11-01
A large body of research has focused on identifying the optimal number of dimensions - or spectra - to model individual differences in psychopathology. Recently, it has become increasingly clear that ostensibly competing models with varying numbers of spectra can be synthesized in empirically derived hierarchical structures. We examined the convergence between top-down (bass-ackwards or sequential principal components analysis) and bottom-up (hierarchical agglomerative cluster analysis) statistical methods for elucidating hierarchies to explicate the joint hierarchical structure of clinical and personality disorders. Analyses examined 24 clinical and personality disorders based on semi-structured clinical interviews in an outpatient psychiatric sample (n=2900). The two methods of hierarchical analysis converged on a three-tier joint hierarchy of psychopathology. At the lowest tier, there were seven spectra - disinhibition, antagonism, core thought disorder, detachment, core internalizing, somatoform, and compulsivity - that emerged in both methods. These spectra were nested under the same three higher-order superspectra in both methods: externalizing, broad thought dysfunction, and broad internalizing. In turn, these three superspectra were nested under a single general psychopathology spectrum, which represented the top tier of the hierarchical structure. The hierarchical structure mirrors and extends upon past research, with the inclusion of a novel compulsivity spectrum, and the finding that psychopathology is organized in three superordinate domains. This hierarchy can thus be used as a flexible and integrative framework to facilitate psychopathology research with varying levels of specificity (i.e., focusing on the optimal level of detailed information, rather than the optimal number of factors). Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Date, Kumi; Ishigure, Takaaki
2017-02-01
Polymer optical waveguides with graded-index (GI) circular cores are fabricated using the Mosquito method, in which the positions of parallel cores are accurately controlled. Such an accurate arrangement is of great importance for high optical coupling efficiency with other optical components such as fiber ribbons. In the Mosquito method that we developed, a core monomer in a viscous liquid state is dispensed into another liquid-state monomer for the cladding via a syringe needle. Hence, the core positions are likely to shift during or after the dispensing process due to several factors. We investigate the factors specifically affecting the core height. When the core and cladding monomers are selected appropriately, the effect of gravity can be negligible, so the core height remains uniform, resulting in accurate core heights. The height variance is controlled within +/-2 micrometers for the 12 cores. Meanwhile, a larger shift in the core height is observed when the needle tip is positioned farther from the substrate surface. One possible reason for this needle-tip height dependence is asymmetric volume contraction during monomer curing. We find a linear relationship between the original needle-tip height and the observed core height. This relationship is implemented in the needle-scan program to stabilize the core height in different layers. Finally, the core heights are accurately controlled even when the cores are aligned at various heights. These results indicate that the Mosquito method enables fabrication of waveguides in which the cores are 3-dimensionally aligned with high positional accuracy.
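The linear needle-tip/core-height relationship reported above lends itself to a simple least-squares calibration that a needle-scan program could invert. The calibration pairs below are invented for illustration, not measured values from the paper.

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = slope*x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def needle_height_for(target_core_h, slope, intercept):
    """Invert the calibration: needle-tip height that yields a target core height."""
    return (target_core_h - intercept) / slope

# Hypothetical calibration pairs: (needle-tip height, observed core height), um.
tips = [20.0, 60.0, 100.0, 140.0]
cores = [23.0, 59.0, 95.0, 131.0]   # lies exactly on core = 0.9*tip + 5

slope, intercept = fit_line(tips, cores)
target_tip = needle_height_for(50.0, slope, intercept)  # -> 50.0 for this data
```

With the fitted line in hand, the dispensing program can pre-compensate each layer's needle-scan height rather than correcting cores after the fact.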
Qualitative research in CKD: an overview of methods and applications.
Tong, Allison; Winkelmayer, Wolfgang C; Craig, Jonathan C
2014-09-01
There recently has been a paradigm shift in health care policies and research toward greater patient centeredness. A core tenet of patient-centered care is that patients' needs, values, and preferences are respected in clinical decision making. Qualitative research methods are designed to generate insights about patients' priorities, values, and beliefs. However, in the past 5 years (2008-2013), only 23 (0.4%) of the 6,043 original articles published in the top 5 nephrology journals (assessed by impact factor) were qualitative studies. Given this observation, it seems important to promote awareness and better understanding within the nephrology community about qualitative research and how the findings can contribute to improving the quality and outcomes of care for patients with chronic kidney disease. This article outlines examples of how qualitative research can generate insight into the values and preferences of patients with chronic kidney disease, provides an overview of qualitative health research methods, and discusses practical applications for research, practice, and policy. Copyright © 2014 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kiaalhosseini, Saeed
In modern contaminant hydrology, management of contaminated sites requires a holistic characterization of subsurface conditions. Delineation of contaminant distribution in all phases (i.e., aqueous, non-aqueous liquid, sorbed, and gas), as well as associated biogeochemical processes in a complex heterogeneous subsurface, is central to selecting effective remedies. Arguably, a factor contributing to the lack of success in managing contaminated sites effectively has been the limitations of site characterization methods that rely on monitoring wells and grab sediment samples. The overarching objective of this research is to advance a set of third-generation (3G) site characterization methods to overcome shortcomings of current site characterization techniques. 3G methods include 1) cryogenic core collection (C3) from the unconsolidated geological subsurface to improve recovery of sediments while preserving key attributes, 2) high-throughput analysis (HTA) of frozen core in the laboratory to provide high-resolution, depth-discrete data on subsurface conditions and processes, 3) resolution of non-aqueous phase liquid (NAPL) distribution within the porous media using a nuclear magnetic resonance (NMR) method, and 4) application of a complex resistivity method to track NAPL depletion in shallow geological formations over time. A series of controlled experiments were conducted to develop the C3 tools and methods. The critical aspects of C3 are downhole circulation of liquid nitrogen via a cooling system, the strategic use of thermal insulation to focus cooling into the core, and the use of back pressure to optimize cooling. The C3 methods were applied at two contaminated sites: 1) F.E. Warren (FEW) Air Force Base near Cheyenne, WY and 2) a former refinery in the western U.S. The results indicated that the rate of core collection using the C3 methods is on the order of 30 feet/day.
The C3 methods also improve core recovery and limit potential biases associated with flowing sands. HTA of frozen core was employed at the former refinery and FEW. Porosity and fluid saturations (i.e., aqueous, non-aqueous liquid, and gas) from the former refinery indicate that, given in situ freezing, the results are not biased by drainage of pore fluids from the core during sample collection. At FEW, a comparison between the results of HTA of the frozen core collected in 2014 and the results of site characterization using unfrozen core (second-generation (2G) methods) at the same locations (performed in 2010) indicates consistently higher contaminant concentrations using C3. Many factors contribute to the higher quantification of contaminant concentrations using C3. The most significant factor is the preservation of sediment attributes, in particular pore fluids and volatile organic compounds (VOCs), in comparison to conventional unfrozen sediment core. The NMR study was performed on laboratory-fabricated sediment cores to resolve NAPL distribution within the porous media qualitatively and quantitatively. The fabricated cores consisted of Colorado silica sand saturated with deionized water and trichloroethylene (TCE). The cores were scanned with a BRUKER small-animal scanner (2.3 Tesla, 100 MHz) at 20 °C and while the core was frozen at -25 °C. The acquired images indicated that freezing the water within the core suppressed the NMR signals of water-bound hydrogen. The hydrogen associated with TCE was still detectable since the TCE remained in its liquid state (the melting point of TCE is -73 °C). Therefore, qualitative detection of TCE within the sediment core was achieved via NMR scanning by freezing the water. A one-dimensional NMR scanning method was used for quantification of TCE mass distribution within the frozen core. However, the results indicated inconsistency in estimating the total TCE mass within the porous media.
Downhole NMR logging was performed at the former refinery in the western U.S. to detect NAPL and to discriminate NAPL from water in the formation. The results indicated that detection of NMR signals to discriminate NAPL from water is compromised by noise stemming from the active facilities and/or power lines passing over the site. A laboratory experiment was performed to evaluate the electrical response of unconsolidated porous media through time (30 days) while NAPL was being depleted. Sand columns (Colorado silica sand) contaminated with methyl tert-butyl ether (MTBE, a light non-aqueous phase liquid (LNAPL)) were studied. A multilevel electrode system was used to measure the electrical resistivity of the impacted sand by imposing an alternating current. The trend of reduction in resistivity through the depth of the columns over time followed depletion of LNAPL by volatilization. Finally, a field experiment was performed at the former refinery in the western U.S. to track natural losses of LNAPL over time. Multilevel systems consisting of water samplers, thermocouples, and electrodes were installed at a clean (background) zone and an LNAPL-impacted zone. In situ measurements of complex resistivity and temperature were taken and water sampling was performed at each depth (from 3 to 14 feet below the ground surface at one-foot spacing) over almost a year. At both locations, the results indicated decreases in apparent resistivity below the water table over time. This trend was supported by the geochemistry of the pore fluids. Overall, the results indicate that application of the electrical resistivity method to track LNAPL depletion at field sites is difficult due to multiple conflicting factors affecting the geoelectrical response of LNAPL-impacted zones over time.
Context-Aware Adaptive Hybrid Semantic Relatedness in Biomedical Science
NASA Astrophysics Data System (ADS)
Emadzadeh, Ehsan
Text mining of biomedical literature and clinical notes is a very active field of research in biomedical science. Semantic analysis is one of the core modules of many Natural Language Processing (NLP) solutions. Methods for calculating the semantic relatedness of two concepts can be very useful in solving different problems such as relationship extraction, ontology creation, and question answering [1-6]. Several techniques exist for calculating the semantic relatedness of two concepts, utilizing different knowledge sources and corpora. So far, researchers have attempted to find the best hybrid method for each domain by combining semantic relatedness techniques and data sources manually. In this work, attempts were made to eliminate the need to manually combine semantic relatedness methods for new contexts or resources by proposing an automated method that attempts to find the combination of semantic relatedness techniques and resources achieving the best semantic relatedness score in every context. This may help the research community find the best hybrid method for each context given the available algorithms and resources.
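One minimal way to picture the automated-combination idea is a grid search for the mixing weight between two relatedness measures that best correlates with gold-standard human ratings. This is a sketch on invented scores, not the dissertation's algorithm; the measure names and numbers are assumptions.

```python
def pearson(xs, ys):
    """Pearson correlation of two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def best_weight(scores_a, scores_b, gold, steps=100):
    """Weight w maximizing correlation of w*a + (1-w)*b with gold ratings."""
    best = max(
        (pearson([w * a + (1 - w) * b for a, b in zip(scores_a, scores_b)], gold), w)
        for w in (i / steps for i in range(steps + 1))
    )
    return best[1]

# Invented scores for five concept pairs from two hypothetical measures.
path_based = [0.9, 0.1, 0.5, 0.7, 0.2]
corpus_based = [0.2, 0.3, 0.9, 0.1, 0.8]
gold = [0.725, 0.15, 0.6, 0.55, 0.35]   # here exactly 0.75*path + 0.25*corpus

w = best_weight(path_based, corpus_based, gold)  # recovers w = 0.75
```

A context-aware system would rerun this selection per domain or corpus, rather than fixing one hybrid globally, which is the core point of the abstract.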
ERIC Educational Resources Information Center
Ghanem, Eman; Long, S. Reid; Rodenbusch, Stacia E.; Shear, Ruth I.; Beckham, Josh T.; Procko, Kristen; DePue, Lauren; Stevenson, Keith J.; Robertus, Jon D.; Martin, Stephen; Holliday, Bradley; Jones, Richard A.; Anslyn, Eric V.; Simmons, Sarah L.
2018-01-01
Innovative models of teaching through research have broken the long-held paradigm that core chemistry competencies must be taught with predictable, scripted experiments. We describe here five fundamentally different, course-based undergraduate research experiences that integrate faculty research projects, accomplish ACS accreditation objectives,…
The algorithm of central axis in surface reconstruction
NASA Astrophysics Data System (ADS)
Zhao, Bao Ping; Zhang, Zheng Mei; Cai Li, Ji; Sun, Da Ming; Cao, Hui Ying; Xing, Bao Liang
2017-09-01
Reverse engineering is an important technical means of product imitation and new product development. Its core technology, surface reconstruction, is an active topic of current research. Among the various surface reconstruction algorithms, axis-based reconstruction is an important class of methods. This paper summarizes medial axis algorithms used for reconstruction, points out the problems of the various methods and where they need improvement, and discusses subsequent surface reconstruction and the development of axis-based approaches.
Pharmacotherapy for the Core Symptoms in Autistic Disorder: Current Status of the Research
Farmer, Cristan; Thurm, Audrey; Grant, Paul
2013-01-01
The current review covers extant literature on pharmacotherapy for core symptoms of autism. The core symptoms of autism include impairments in social interaction and communication, as well as the presence of restricted and repetitive behaviors. There are no known efficacious treatments for the core social symptoms, although effects on repetitive behaviors are indicated with some data. While studies of fenfluramine, secretin, opiates, and mood stabilizers generally find no effect, mixed results suggest more research is needed on antidepressants and atypical antipsychotics. Newer lines of research, including cholinergic and glutamatergic agents and oxytocin, will be of considerable interest in the future. However, research on the treatment of core symptoms is plagued by limitations in study design, statistical power and other issues inherent to the study of treatments for autism (e.g., heterogeneity of the disorder) that continue to prevent the elucidation of efficacious treatments. PMID:23504356
U.S. National Institutes of Health core consolidation-investing in greater efficiency.
Chang, Michael C; Birken, Steven; Grieder, Franziska; Anderson, James
2015-04-01
The U.S. National Institutes of Health (NIH) invests substantial resources in core research facilities (cores) that support research by providing advanced technologies and scientific and technical expertise as a shared resource. In 2010, the NIH issued an initiative to consolidate multiple core facilities into a single, more efficient core. Twenty-six institutions were awarded supplements to consolidate a number of similar core facilities. Although this approach may not work for all core settings, this effort resulted in consolidated cores that were more efficient and of greater benefit to investigators. The improvements in core operations resulted in both increased services and more core users through installation of advanced instrumentation; access to higher levels of management expertise; integration of information management and data systems; and consolidation of billing, purchasing, scheduling, and tracking services. Cost recovery to support core operations also benefited from the consolidation effort, in some cases severalfold. In conclusion, this program of core consolidation resulted in improvements in the effective operation of core facilities, benefiting both investigators and their supporting institutions.
A PILOT STUDY OF CORE STABILITY AND ATHLETIC PERFORMANCE: IS THERE A RELATIONSHIP?
Sharrock, Chris; Cropper, Jarrod; Mostad, Joel; Johnson, Matt
2011-01-01
Study Design: Correlation study. Objectives: To objectively evaluate the relationship between core stability and athletic performance measures in male and female collegiate athletes. Background: The relationship between core stability and athletic performance has yet to be quantified in the available literature. The current literature does not demonstrate whether or not core strength relates to functional performance. Questions remain regarding the most important components of core stability, the role of sport specificity, and the measurement of core stability in relation to athletic performance. Methods: A sample of 35 volunteer student athletes from Asbury College (NAIA Division II) provided informed consent. Participants performed a series of five tests: double leg lowering (core stability test), the forty yard dash, the T-test, vertical jump, and a medicine ball throw. Participants performed three trials of each test in a randomized order. Results: Correlations between the core stability test and each of the other four performance tests were determined using a General Linear Model. The medicine ball throw correlated negatively with the core stability test (r = -0.389, p = 0.023). Participants who performed better on the core stability test showed a stronger negative correlation with the medicine ball throw (r = -0.527). Gender was the variable most strongly correlated with core strength, with males having a mean double leg lowering measurement of 47.43 degrees compared to a mean of 54.75 degrees for females. Conclusions: There appears to be a link between a core stability test and athletic performance tests; however, more research is needed to provide a definitive answer on the nature of this relationship. Ideally, specific performance tests will be able to better define and examine relationships to core stability.
Future studies should also seek to determine if there are specific sub-categories of core stability which are most important to allow for optimal training and performance for individual sports. PMID:21713228
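The kind of correlation reported above, a negative r between the double-leg-lowering angle and throw distance, can be reproduced on toy numbers. The data here are invented for illustration, not the study's measurements.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

# Invented example: a lower double-leg-lowering angle (better core stability)
# paired with a longer medicine ball throw yields a negative correlation.
angles = [40.0, 45.0, 50.0, 55.0, 60.0]   # degrees
throws = [6.1, 5.8, 5.5, 5.2, 4.9]        # metres

r = pearson_r(angles, throws)   # -> -1.0 on this perfectly linear toy data
```

Because a smaller lowering angle indicates better core stability, a negative r between angle and throw distance is what a stability-performance link would look like in this scoring convention.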
Methodological and Pedagogical Potential of Reflection in Development of Contemporary Didactics
ERIC Educational Resources Information Center
Chupina, Valentina A.; Pleshakova, Anastasiia Yu.; Konovalova, Maria E.
2016-01-01
Applicability of the issue under research is preconditioned by the need of practical pedagogics to expand methodological and methodical tools of contemporary didactics. The purpose of the article is to detect the methodological core of reflection as a form of thinking and to provide insight thereunto on the basis of systematic attributes of the…
ERIC Educational Resources Information Center
Anderson, Stephen E.; Macri, Joelle Rodway
2009-01-01
Our analysis explores the agenda for student learning communicated in interviews with school district officials from four Ontario districts. Using research methods drawn from collective action framing theory, we identified six core frames and one broader frame in the discourse on student learning: (a) measurable academic achievement, (b)…
Using Technology Supported Strategies to Improve Pre-Service Teacher Preparation in Social Studies
ERIC Educational Resources Information Center
Bafumo, Mary Ellen; Noel, Andrea M.
2014-01-01
The National Assessment of Educational Progress shows that many US students are deficient in core knowledge in geography, civics and current events. In this paper, a professor of social studies methods describes an action research project developed to assess and improve teacher candidates' knowledge in these areas. The article explains how data…
ERIC Educational Resources Information Center
Guseva, Liudmila G.; Solomonovich, Mark
2017-01-01
This article overviews the theoretical and applied works of the psychologist and pedagogue Leonid Zankov. Zankov's model of teaching is based on Vygotsky's theory that appropriate teaching methods stimulate cognitive development, whose core notion is the Zone of Proximal Development. This educational psychology research was verified by large scale…
ERIC Educational Resources Information Center
Al-Azawei, Ahmed; Lundqvist, Karsten
2015-01-01
Online learning constitutes the most popular distance-learning method, with flexibility, accessibility, visibility, manageability and availability as its core features. However, current research indicates that its efficacy is not consistent across all learners. This study aimed to modify and extend the factors of the Technology Acceptance Model…
ERIC Educational Resources Information Center
Williamson, Kathryn E.; Jakobson, Lorna S.
2014-01-01
Background: Research has shown that children born very prematurely are at substantially elevated risk for social and behavioral difficulties similar to those seen in full-term children with autism spectrum disorders (ASDs). Methods: To gain insight into core deficits that may underlie these difficulties, in this study, we assessed the social…
User Policies | Center for Cancer Research
User Policies 1. Authorship and Acknowledgement: The SAXS Core facility is a CCR resource dedicated to CCR researchers, but we also make this resource accessible to non-CCR users free of charge. There are three ways to make use of the SAXS Core resource. Asking the SAXS Core staff to collect, process and analyze data, and jointly interpret data with your teams. Asking the core staff to collect data and send it to you.
DART Core/Combustor-Noise Initial Test Results
NASA Technical Reports Server (NTRS)
Boyle, Devin K.; Henderson, Brenda S.; Hultgren, Lennart S.
2017-01-01
Contributions from the combustor to the overall propulsion noise of civilian transport aircraft are starting to become important due to turbofan design trends and advances in mitigation of other noise sources. Future propulsion systems for ultra-efficient commercial air vehicles are projected to be of increasingly higher bypass ratio, with larger fans combined with much smaller cores and ultra-clean-burning, fuel-flexible combustors. Unless effective noise-reduction strategies are developed, combustor noise is likely to become a prominent contributor to overall airport community noise in the future. The new NASA DGEN Aeropropulsion Research Turbofan (DART) is a cost-efficient testbed for the study of core-noise physics and mitigation. This presentation gives a brief description of the recently completed DART core/combustor-noise baseline test in the NASA GRC Aero-Acoustic Propulsion Laboratory (AAPL). Acoustic data were simultaneously acquired using the AAPL overhead microphone array in the engine aft-quadrant far field, a single midfield microphone, and two semi-infinite-tube unsteady pressure sensors at the core-nozzle exit. An initial assessment shows that the data are of high quality and compare well with results from a quick 2014 feasibility test. Combustor-noise components of the measured total-noise signatures were educed using a two-signal source-separation method and are found to occur in the expected frequency range. The research described herein is aligned with the NASA Ultra-Efficient Commercial Transport strategic thrust and is supported by the NASA Advanced Air Vehicle Program, Advanced Air Transport Technology Project, under the Aircraft Noise Reduction Subproject.
NASA Astrophysics Data System (ADS)
Launhardt, R.; Stutz, A. M.; Schmiedeke, A.; Henning, Th.; Krause, O.; Balog, Z.; Beuther, H.; Birkmann, S.; Hennemann, M.; Kainulainen, J.; Khanzadyan, T.; Linz, H.; Lippok, N.; Nielbock, M.; Pitann, J.; Ragan, S.; Risacher, C.; Schmalzl, M.; Shirley, Y. L.; Stecklum, B.; Steinacker, J.; Tackenberg, J.
2013-03-01
Context. The temperature and density structure of molecular cloud cores are the most important physical quantities that determine the course of the protostellar collapse and the properties of the stars they form. Nevertheless, density profiles often rely either on the simplifying assumption of isothermality or on observationally poorly constrained model temperature profiles. The instruments of the Herschel satellite provide us for the first time with both the spectral coverage and the spatial resolution that is needed to directly measure the dust temperature structure of nearby molecular cloud cores. Aims: With the aim of better constraining the initial physical conditions in molecular cloud cores at the onset of protostellar collapse, in particular of measuring their temperature structure, we initiated the guaranteed time key project (GTKP) "The Earliest Phases of Star Formation" (EPoS) with the Herschel satellite. This paper gives an overview of the low-mass sources in the EPoS project, the Herschel and complementary ground-based observations, our analysis method, and the initial results of the survey. Methods: We study the thermal dust emission of 12 previously well-characterized, isolated, nearby globules using FIR and submm continuum maps at up to eight wavelengths between 100 μm and 1.2 mm. Our sample contains both globules with starless cores and embedded protostars at different early evolutionary stages. The dust emission maps are used to extract spatially resolved SEDs, which are then fit independently with modified blackbody curves to obtain line-of-sight-averaged dust temperature and column density maps. Results: We find that the thermal structure of all globules (mean mass 7 M⊙) is dominated by external heating from the interstellar radiation field and moderate shielding by thin extended halos. All globules have warm outer envelopes (14-20 K) and colder dense interiors (8-12 K) with column densities of a few 1022 cm-2. 
The protostars embedded in some of the globules raise the local temperature of the dense cores only within radii out to about 5000 AU, but do not significantly affect the overall thermal balance of the globules. Five out of the six starless cores in the sample are gravitationally bound and approximately thermally stabilized. The starless core in CB 244 is found to be supercritical and is speculated to be on the verge of collapse. For the first time, we can now also include externally heated starless cores in the Lsmm/Lbol vs. Tbol diagram and find that Tbol < 25 K seems to be a robust criterion to distinguish starless from protostellar cores, including those that only have an embedded very low-luminosity object.
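The modified-blackbody fitting step can be sketched as a grid search for the dust temperature that best reproduces a set of far-infrared fluxes. The fluxes below are synthetic (generated at 12 K), the emissivity index is fixed at an assumed beta = 1.8, and this is a toy sketch, not the EPoS pipeline.

```python
import math

H, K, C = 6.626e-34, 1.381e-23, 2.998e8   # SI constants: h, k_B, c

def planck(nu, temp):
    """Planck function B_nu(T) in SI units."""
    return 2.0 * H * nu**3 / C**2 / (math.exp(H * nu / (K * temp)) - 1.0)

def model(nu, temp, beta=1.8):
    """Optically thin modified blackbody shape, nu^beta * B_nu(T)."""
    return nu**beta * planck(nu, temp)

def fit_temperature(nus, fluxes, t_grid):
    """Grid search over T; the amplitude is solved analytically at each T."""
    best_t, best_chi2 = None, float("inf")
    for t in t_grid:
        m = [model(nu, t) for nu in nus]
        amp = sum(f * mi for f, mi in zip(fluxes, m)) / sum(mi * mi for mi in m)
        chi2 = sum((f - amp * mi) ** 2 for f, mi in zip(fluxes, m))
        if chi2 < best_chi2:
            best_t, best_chi2 = t, chi2
    return best_t

# Synthetic SED sampled at Herschel-like wavelengths (100-500 um), T = 12 K.
wavelengths_um = [100.0, 160.0, 250.0, 350.0, 500.0]
nus = [C / (w * 1e-6) for w in wavelengths_um]
fluxes = [model(nu, 12.0) for nu in nus]

t_fit = fit_temperature(nus, fluxes, [5.0 + 0.1 * i for i in range(251)])
```

Fitting each map pixel's SED this way is what yields the line-of-sight-averaged dust temperature and column density maps described in the abstract; real fits must also handle calibration uncertainties and the temperature-beta degeneracy.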
Standardized Methods for Enhanced Quality and Comparability of Tuberculous Meningitis Studies.
Marais, Ben J; Heemskerk, Anna D; Marais, Suzaan S; van Crevel, Reinout; Rohlwink, Ursula; Caws, Maxine; Meintjes, Graeme; Misra, Usha K; Mai, Nguyen T H; Ruslami, Rovina; Seddon, James A; Solomons, Regan; van Toorn, Ronald; Figaji, Anthony; McIlleron, Helen; Aarnoutse, Robert; Schoeman, Johan F; Wilkinson, Robert J; Thwaites, Guy E
2017-02-15
Tuberculous meningitis (TBM) remains a major cause of death and disability in tuberculosis-endemic areas, especially in young children and immunocompromised adults. Research aimed at improving outcomes is hampered by poor standardization, which limits study comparison and the generalizability of results. We propose standardized methods for the conduct of TBM clinical research that were drafted at an international tuberculous meningitis research meeting organized by the Oxford University Clinical Research Unit in Vietnam. We propose a core dataset including demographic and clinical information to be collected at study enrollment, important aspects related to patient management and monitoring, and standardized reporting of patient outcomes. The criteria proposed for the conduct of observational and intervention TBM studies should improve the quality of future research outputs, can facilitate multicenter studies and meta-analyses of pooled data, and could provide the foundation for a global TBM data repository.
Mills, Jane; Yates, Karen; Harrison, Helena; Woods, Cindy; Chamberlain-Salaun, Jennifer; Trueman, Scott; Hitchins, Marnie
2016-08-01
Postgraduate nursing students' negative perceptions about a core research subject at an Australian university led to a revision and restructure of the subject using a Communities of Inquiry framework. Negative views are often expressed by nursing and midwifery students about the research process. The success of evidence-based practice is dependent on changing these views. A Community of Inquiry is an online teaching, learning, thinking, and sharing space created through the combination of three domains: teacher presence (related largely to pedagogy), social presence, and cognitive presence (critical thinking). Evaluate student satisfaction with a postgraduate core nursing and midwifery subject in research design, theory, and methodology, which was delivered using a Communities of Inquiry framework. This evaluative study incorporated a validated Communities of Inquiry survey (n=29) and interviews (n=10) and was conducted at an Australian university. Study participants were a convenience sample drawn from 56 postgraduate students enrolled in a core research subject. Survey data were analysed descriptively and interviews were coded thematically. Five main themes were identified: subject design and delivery; cultivating community through social interaction; application: knowledge, practice, research; student recommendations; and technology and technicalities. Student satisfaction was generally high, particularly in the areas of cognitive presence (critical thinking) and teacher presence (largely pedagogy related). Students' views about the creation of a "social presence" were varied but overall, the framework was effective in stimulating both inquiry and a sense of community. The process of research is, in itself, the creation of a "community of inquiry."
This framework showed strong potential for use in the teaching of nurse research subjects; satisfaction was high as students reported learning, not simply the theory and the methods of research, but also how to engage in "doing" research by forging professional and intellectual communities. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ja'fari, Ahmad; Hamidzadeh Moghadam, Rasoul
2012-10-01
Routine core analysis provides useful information for the petrophysical study of hydrocarbon reservoirs. Effective porosity and fluid conductivity (permeability) can be obtained from core analysis in the laboratory, but coring hydrocarbon-bearing intervals and analyzing the cores is expensive and time consuming. In this study, an improved method is proposed for quantitatively correlating porosity and permeability obtained from core with conventional well log data by integrating different artificial intelligence systems. The proposed method combines the results of adaptive neuro-fuzzy inference system (ANFIS) and neural network (NN) algorithms for overall estimation of core data from conventional well log data, multiplying the output of each algorithm by a weight factor. Simple averaging and weighted averaging were used to determine the weight factors; in the weighted averaging method, a genetic algorithm (GA) determines the weights. The overall algorithm was applied in one of SW Iran's oil fields with two cored wells. One-third of the data were used as the test dataset and the rest for training the networks. Results show that the GA-weighted averaging method provided the lowest mean square error and the highest correlation coefficient with real core data.
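The combination scheme described above can be sketched as follows. The data values are purely illustrative, and a coarse grid search stands in for the paper's genetic algorithm, which is what actually selects the weight factors:

```python
import numpy as np

# Hypothetical validation data: true core porosity and two model estimates
# (illustrative numbers only, not values from the study).
y_true  = np.array([0.12, 0.18, 0.22, 0.15, 0.20])
y_anfis = np.array([0.13, 0.17, 0.24, 0.14, 0.19])  # ANFIS output
y_nn    = np.array([0.11, 0.20, 0.21, 0.17, 0.22])  # NN output

def mse(y, yhat):
    """Mean square error between target and estimate."""
    return float(np.mean((y - yhat) ** 2))

# Simple averaging: equal weights for both models.
simple = 0.5 * y_anfis + 0.5 * y_nn

# Weighted averaging: search w in [0, 1] minimizing validation MSE.
# A grid search stands in here for the paper's genetic algorithm.
ws = np.linspace(0.0, 1.0, 101)
best_w = min(ws, key=lambda w: mse(y_true, w * y_anfis + (1 - w) * y_nn))
weighted = best_w * y_anfis + (1 - best_w) * y_nn
# The optimized weights can do no worse than simple averaging on the
# validation set, since w = 0.5 is in the search space.
```

Because the equal-weight combination is itself a candidate in the search, the weighted average is guaranteed to match or beat simple averaging on the data used to fit the weights; the study's finding is that this advantage carries over to held-out core data.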
Toye, Francine; Seers, Kate; Allcock, Nick; Briggs, Michelle; Carr, Eloise; Barker, Karen
2014-06-21
Studies that systematically search for and synthesise qualitative research are becoming more evident in health care, and they can make an important contribution to patient care. Our team was funded to complete a meta-ethnography of patients' experience of chronic musculoskeletal pain. It has been 25 years since Noblit and Hare published their core text on meta-ethnography, and the current health research environment brings additional challenges to researchers aiming to synthesise qualitative research. Noblit and Hare propose seven stages of meta-ethnography which take the researcher from formulating a research idea to expressing the findings. These stages are not discrete but form part of an iterative research process. We aimed to build on the methods of Noblit and Hare and explore the challenges of including a large number of qualitative studies into a qualitative systematic review. These challenges hinge upon epistemological and practical issues to be considered alongside expectations about what determines high quality research. This paper describes our method and explores these challenges. Central to our method was the process of collaborative interpretation of concepts and the decision to exclude original material where we could not decipher a concept. We use excerpts from our research team's reflexive statements to illustrate the development of our methods.
Reference point detection for camera-based fingerprint image based on wavelet transformation.
Khalil, Mohammed S
2015-04-30
Fingerprint recognition systems essentially require core-point detection prior to fingerprint matching. The core point is used as a reference point to align the fingerprint with a template database. When processing a larger fingerprint database, it is necessary to consider the core point during feature extraction. Numerous core-point detection methods are available and have been reported in the literature; however, these methods are generally applied to scanner-based images. Hence, this paper explores the feasibility of applying a core-point detection method to a fingerprint image obtained using a camera phone. The proposed method utilizes a discrete wavelet transform to extract the ridge information from a color image. Its performance is evaluated in terms of accuracy and consistency, two indicators calculated automatically by comparing the method's output with the defined core points. The method is tested on two data sets, collected in controlled and uncontrolled environments from 13 different subjects. In the controlled environment, the proposed method achieved a detection rate of 82.98%; in the uncontrolled environment, it yielded a detection rate of 78.21%. The proposed method yields promising results on the collected image database and outperforms an existing method.
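A one-level 2-D discrete wavelet transform of the kind the method relies on can be sketched with the Haar basis (the abstract does not name its wavelet; Haar is used here only for illustration). The high-frequency subbands carry the ridge detail from which a reference point can be sought:

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar wavelet transform (rows, then columns).

    Returns the approximation subband LL and the three detail subbands
    LH, HL, HH; image sides are assumed even for simplicity.
    """
    img = img.astype(float)
    # Row transform: pairwise averages (lo) and differences (hi).
    lo = (img[:, 0::2] + img[:, 1::2]) / np.sqrt(2)
    hi = (img[:, 0::2] - img[:, 1::2]) / np.sqrt(2)
    # Column transform on each row subband.
    ll = (lo[0::2, :] + lo[1::2, :]) / np.sqrt(2)
    lh = (lo[0::2, :] - lo[1::2, :]) / np.sqrt(2)
    hl = (hi[0::2, :] + hi[1::2, :]) / np.sqrt(2)
    hh = (hi[0::2, :] - hi[1::2, :]) / np.sqrt(2)
    return ll, lh, hl, hh
```

For a flat, ridge-free region all detail subbands vanish, and the normalization conserves energy, so local detail-subband energy is a usable measure of ridge content when searching for a reference point.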
A first step to compare geodynamical models and seismic observations of the inner core
NASA Astrophysics Data System (ADS)
Lasbleis, M.; Waszek, L.; Day, E. A.
2016-12-01
Seismic observations have revealed a complex inner core, with lateral and radial heterogeneities at all observable scales. The dominant feature is the east-west hemispherical dichotomy in seismic velocity and attenuation. Several geodynamical models have been proposed to explain the observed structure: convective instabilities, external forces, crystallisation processes or the influence of outer core convection. However, interpreting such geodynamical models in terms of the seismic observations is difficult, and has been performed only for very specific models (Geballe 2013, Lincot 2014, 2016). Here, we propose a common framework to make such comparisons. We have developed a Python code that propagates seismic ray paths through kinematic geodynamical models for the inner core, computing a synthetic seismic data set that can be compared to seismic observations. Following the method of Geballe 2013, we start with the simple model of translation. For this, the seismic velocity is taken to be a function of the age or initial growth rate of the material (since there is no deformation included in our models); the assumption is reasonable when considering translation, growth and super rotation of the inner core. Using both artificial (random) seismic ray data sets and a real inner core data set (from Waszek et al. 2011), we compare these different models. Our goal is to determine the model which best matches the seismic observations. Preliminary results show that super rotation successfully creates an eastward shift in properties with depth, as has been observed seismically. Neither the growth rate of inner core material nor the relationship between crystal size and seismic velocity is well constrained; consequently, our method does not directly compute seismic travel times.
Instead, here we use age, growth rate and other parameters as proxies for the seismic properties, which represents a good first step in comparing geodynamical models and seismic observations. Ultimately we aim to release our codes to the broader scientific community, allowing researchers from all disciplines to test their models of inner core growth against seismic observations, or to create a kinematic model for the evolution of the inner core that matches new geophysical observations.
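The age-as-proxy idea can be sketched for the simplest kinematic model, rigid translation. This is a minimal illustration only: the axis orientation, the translation rate, and the assignment of crystallisation to one side are assumptions for the sketch, not values or conventions from the study:

```python
import numpy as np

R_IC = 1221.0  # inner-core radius, km

def age_proxy_translation(points, rate=1.0e-9):
    """Age of inner-core material under rigid translation.

    points : (N, 3) Cartesian positions in km; the x axis is taken as the
             translation direction, with crystallisation at x = -R_IC and
             melting at x = +R_IC (an assumed geometry for illustration).
    rate   : translation speed in km per year (illustrative value).

    Under pure translation, material age grows linearly with distance
    travelled from the crystallisation front, so age serves as a proxy
    for any property that varies with age.
    """
    x = np.asarray(points, dtype=float)[:, 0]
    return (x + R_IC) / rate  # years since crystallisation
```

Sampling ray-path points through such a field produces a synthetic data set in which a property mapped from age shifts monotonically from the crystallising to the melting side, which is the kind of hemispheric signature the comparison framework tests against observations.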
Fan, Jing; Yang, Haowen; Liu, Ming; Wu, Dan; Jiang, Hongrong; Zeng, Xin; Elingarami, Sauli; Li, Zhiyang; Li, Song; Liu, Hongna; He, Nongyue
2015-02-01
In this research, a novel method for relative fluorescent quantification of DNA based on Fe3O4@SiO2@Au gold-coated magnetic nanocomposites (GMNPs) and multiplex ligation-dependent probe amplification (MLPA) has been developed. Using self-assembly, seed-mediated growth, and chemical reduction methods, core-shell Fe3O4@SiO2@Au GMNPs were synthesized. By modifying streptavidin on the GMNP surface, we obtained a bead chip that can capture biotinylated probes. We then designed MLPA probes tagged with biotin or Cy3, and target DNA based on the human APP gene sequence. The products of the thermostable DNA ligase-induced ligation reactions and PCR amplifications were incubated with SA-GMNPs. After washing, magnetic separation, and spotting, fluorescent scanning showed that our method can be used for relative quantitative analysis of the target DNA in the concentration range of 03004~0.5 µM.
Spectral Element Method for the Simulation of Unsteady Compressible Flows
NASA Technical Reports Server (NTRS)
Diosady, Laslo Tibor; Murman, Scott M.
2013-01-01
This work uses a discontinuous-Galerkin spectral-element method (DGSEM) to solve the compressible Navier-Stokes equations [1-3]. The inviscid flux is computed using the approximate Riemann solver of Roe [4]. The viscous fluxes are computed using the second form of Bassi and Rebay (BR2) [5] in a manner consistent with the spectral-element approximation. The method of lines with the classical 4th-order explicit Runge-Kutta scheme is used for time integration. Results for polynomial orders up to p = 15 (16th order) are presented. The code is parallelized using the Message Passing Interface (MPI). The computations presented in this work were performed on the Sandy Bridge nodes of the NASA Pleiades supercomputer at NASA Ames Research Center. Each Sandy Bridge node consists of two eight-core Intel Xeon E5-2670 processors with a clock speed of 2.6 GHz and 2 GB of memory per core. On a Sandy Bridge node the Tau Benchmark [6] runs in 7.6 s.
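The time integrator named above is the standard scheme. A minimal method-of-lines sketch follows, where the right-hand side f stands in for the semi-discrete DGSEM residual:

```python
import math

def rk4_step(f, u, t, dt):
    """One step of the classical 4th-order explicit Runge-Kutta scheme
    for the semi-discrete system du/dt = f(t, u)."""
    k1 = f(t, u)
    k2 = f(t + 0.5 * dt, u + 0.5 * dt * k1)
    k3 = f(t + 0.5 * dt, u + 0.5 * dt * k2)
    k4 = f(t + dt, u + dt * k3)
    return u + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

# Sanity check on du/dt = -u, whose exact solution is exp(-t).
u, t, dt = 1.0, 0.0, 0.01
for _ in range(100):
    u = rk4_step(lambda t, u: -u, u, t, dt)
    t += dt
```

In the actual solver, u is the vector of all polynomial coefficients and f evaluates the Roe and BR2 fluxes; the stepping logic is unchanged.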
Scaling of Multimillion-Atom Biological Molecular Dynamics Simulation on a Petascale Supercomputer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schulz, Roland; Lindner, Benjamin; Petridis, Loukas
2009-01-01
A strategy is described for a fast all-atom molecular dynamics simulation of multimillion-atom biological systems on massively parallel supercomputers. The strategy is developed using benchmark systems of particular interest to bioenergy research, comprising models of cellulose and lignocellulosic biomass in an aqueous solution. The approach involves using the reaction field (RF) method for the computation of long-range electrostatic interactions, which permits efficient scaling on many thousands of cores. Although the range of applicability of the RF method for biomolecular systems remains to be demonstrated, for the benchmark systems the use of the RF produces molecular dipole moments, Kirkwood G factors, other structural properties, and mean-square fluctuations in excellent agreement with those obtained with the commonly used Particle Mesh Ewald method. With RF, three million- and five million-atom biological systems scale well up to ∼30k cores, producing ∼30 ns/day. Atomistic simulations of very large systems for time scales approaching the microsecond would, therefore, appear now to be within reach.
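The reaction-field treatment replaces the global Ewald sum with a modified short-range pair potential, which is what removes the communication bottleneck at scale. A minimal sketch of the standard RF pair energy (in reduced units with 1/(4πε0) = 1; the cutoff and dielectric constant are illustrative values, not the paper's settings):

```python
def reaction_field_energy(qi, qj, r, r_c=1.2, eps_rf=78.0):
    """Pairwise electrostatic energy with the reaction-field correction.

    V(r) = qi*qj*(1/r + k_rf*r^2 - c_rf) for r < r_c, else 0, where the
    constants are chosen so the energy goes continuously to zero at the
    cutoff r_c for a surrounding dielectric eps_rf.
    """
    if r >= r_c:
        return 0.0
    k_rf = (eps_rf - 1.0) / ((2.0 * eps_rf + 1.0) * r_c ** 3)
    c_rf = 1.0 / r_c + k_rf * r_c ** 2
    return qi * qj * (1.0 / r + k_rf * r ** 2 - c_rf)
```

Because every interaction vanishes beyond r_c, each core needs only its spatial neighbourhood, in contrast to mesh-Ewald methods whose global FFT couples all ranks.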
Density-based cluster algorithms for the identification of core sets
NASA Astrophysics Data System (ADS)
Lemke, Oliver; Keller, Bettina G.
2016-10-01
The core-set approach is a discretization method for Markov state models of complex molecular dynamics. Core sets are disjoint metastable regions in the conformational space, which need to be known prior to the construction of the core-set model. We propose to use density-based cluster algorithms to identify the cores. We compare three different density-based cluster algorithms: the CNN, the DBSCAN, and the Jarvis-Patrick algorithm. While the core-set models based on the CNN and DBSCAN clustering are well-converged, constructing core-set models based on the Jarvis-Patrick clustering cannot be recommended. In a well-converged core-set model, the number of core sets is up to an order of magnitude smaller than the number of states in a conventional Markov state model with comparable approximation error. Moreover, using the density-based clustering one can extend the core-set method to systems which are not strongly metastable. This is important for the practical application of the core-set method because most biologically interesting systems are only marginally metastable. The key point is to perform a hierarchical density-based clustering while monitoring the structure of the metric matrix which appears in the core-set method. We test this approach on a molecular-dynamics simulation of a highly flexible 14-residue peptide. The resulting core-set models have a high spatial resolution and can distinguish between conformationally similar yet chemically different structures, such as register-shifted hairpin structures.
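The density-based idea can be illustrated with a minimal DBSCAN-style clustering, a from-scratch sketch for small point sets rather than the implementation evaluated in the paper: points with at least min_pts neighbours within eps are density "cores", and connected cores together with their border points form the clusters.

```python
import numpy as np

def dbscan(X, eps=0.5, min_pts=4):
    """Minimal DBSCAN sketch: one label per point, -1 marking noise."""
    n = len(X)
    labels = np.full(n, -1)
    # Pairwise distances (O(n^2) memory; fine for a demonstration).
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    neighbors = [np.nonzero(d[i] <= eps)[0] for i in range(n)]
    core = [len(nb) >= min_pts for nb in neighbors]
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or not core[i]:
            continue
        # Grow a new cluster outward from this unvisited core point.
        labels[i] = cluster
        stack = list(neighbors[i])
        while stack:
            j = stack.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if core[j]:
                    stack.extend(neighbors[j])
        cluster += 1
    return labels
```

Sparse regions between the dense clusters are left unassigned (label -1), which is exactly the property that makes density-based clustering suitable for carving out disjoint metastable core sets from simulation data.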
Core Practices for Teaching History: The Results of a Delphi Panel Survey
ERIC Educational Resources Information Center
Fogo, Bradley
2014-01-01
Recent education literature and research has focused on identifying effective core teaching practices to inform and help shape teacher education and professional development. Although a rich literature on the teaching and learning of history has continued to develop over the past decade, core practice research has largely overlooked…
ERIC Educational Resources Information Center
Bissett, Rachel L.; Cheng, Michael S. H.; Brannan, Robert G.
2010-01-01
Professional organizations have linked core competency to professional success and competitive strategy. The Research Chefs Assn. (RCA) recently released 43 core competencies for practicing culinologists. Culinology[R] is a profession that links skills of culinary arts and food science and technology in the development of food products. An online…
Disaster and Contingency Planning for Scientific Shared Resource Cores
Wilkerson, Amy
2016-01-01
Progress in biomedical research is largely driven by improvements, innovations, and breakthroughs in technology, accelerating the research process, and an increasingly complex collaboration of both clinical and basic science. This increasing sophistication has driven the need for centralized shared resource cores (“cores”) to serve the scientific community. From a biomedical research enterprise perspective, centralized resource cores are essential to increased scientific, operational, and cost effectiveness; however, the concentration of instrumentation and resources in the cores may render them highly vulnerable to damage from severe weather and other disasters. As such, protection of these assets and the ability to recover from a disaster is increasingly critical to the mission and success of the institution. Therefore, cores should develop and implement both disaster and business continuity plans and be an integral part of the institution’s overall plans. Here we provide an overview of key elements required for core disaster and business continuity plans, guidance, and tools for developing these plans, and real-life lessons learned at a large research institution in the aftermath of Superstorm Sandy. PMID:26848285
A vortex-filament and core model for wings with edge vortex separation
NASA Technical Reports Server (NTRS)
Pao, J. L.; Lan, C. E.
1982-01-01
A vortex filament-vortex core method for predicting aerodynamic characteristics of slender wings with edge vortex separation was developed. Semi-empirical but simple methods were used to determine the initial positions of the free sheet and vortex core. Comparison with available data indicates that: (1) the present method is generally accurate in predicting the lift and induced drag coefficients but the predicted pitching moment is too positive; (2) the spanwise lifting pressure distributions estimated by the one vortex core solution of the present method are significantly better than the results of Mehrotra's method relative to the pressure peak values for the flat delta; (3) the two vortex core system applied to the double delta and strake wings produce overall aerodynamic characteristics which have good agreement with data except for the pitching moment; and (4) the computer time for the present method is about two thirds of that of Mehrotra's method.
Badenhorst, Anna; Mansoori, Parisa; Chan, Kit Yee
2016-01-01
Background The past two decades have seen a large increase in investment in global public health research. There is a need for increased coordination and accountability, particularly in understanding where funding is being allocated and who has capacity to perform research. In this paper, we aim to assess global, regional, national and sub-national capacity for public health research and how it is changing over time in different parts of the world. Methods To allow comparisons of regions, countries and universities/research institutes over time, we relied on the Web of Science™ database and used the Hirsch (h) index based on 5-year periods (h5). We defined articles relevant to public health research with 98% specificity using the combination of search terms relevant to public health, epidemiology or meta-analysis. Based on those selected papers, we computed h5 for each country of the world and their main universities/research institutes for these 5-year time periods: 1996-2000, 2001-2005 and 2006-2010. We computed h5 with a 3-year window after each time period, to allow citations from more recent years to accumulate. Among the papers contributing to the h5 core, we explored the topic/disease under investigation, the "instrument" of health research used (e.g., descriptive, discovery, development or delivery research), and the universities/research institutes contributing to the h5 core. Results Globally, the majority of public health research has been conducted in North America and Europe, but other regions (particularly the Eastern Mediterranean and South-East Asia) are showing greater improvement rates and are rapidly gaining capacity. Moreover, several African nations performed particularly well when their research output is adjusted by their gross domestic product (GDP). In the regions gaining capacity, universities are contributing more substantially to the h-core publications than other research institutions.
In all regions of the world, the topics of articles in the h-core are shifting from communicable to non-communicable diseases (NCDs). There is also a trend of reduction in "discovery" research and an increase in "delivery" research. Conclusion Funding agencies and research policy makers should recognise nations where public health research capacity is increasing. These countries are worthy of increased investment in order to further increase the production of high-quality local research and continue to develop their research capacity. Similarly, universities that contribute substantially to national research capacity should be recognised and supported. Biomedical journals should also take notice to ensure equity in the peer-review process, provide researchers from all countries an equal opportunity to publish high-quality research, and reduce financial barriers to accessing these journals. PMID:27350875
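The h5 measure used above is the ordinary Hirsch index restricted to papers from a 5-year publication window; computing it from a list of citation counts is straightforward:

```python
def h_index(citations):
    """Hirsch index: the largest h such that h papers each have
    at least h citations. Applied to the citation counts of a 5-year
    publication window, this gives the h5 measure used in the study."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank  # this paper still supports an index of `rank`
        else:
            break
    return h
```

For example, h_index([10, 8, 5, 4, 3]) evaluates to 4, since four papers have at least four citations each but not five papers with at least five.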
Advanced Materials and Solids Analysis Research Core (AMSARC)
The Advanced Materials and Solids Analysis Research Core (AMSARC), centered at the U.S. Environmental Protection Agency's (EPA) Andrew W. Breidenbach Environmental Research Center in Cincinnati, Ohio, is the foundation for the Agency's solids and surfaces analysis capabilities. ...
Mitchell, Diane C; Castro, Javier; Armitage, Tracey L; Vega-Arroyo, Alondra J; Moyce, Sally C; Tancredi, Daniel J; Bennett, Deborah H; Jones, James H; Kjellstrom, Tord; Schenker, Marc B
2017-07-01
The California Heat Illness Prevention Study (CHIPS) devised methodology and collected physiological data to assess heat-related illness (HRI) risk in Latino farmworkers. Bilingual researchers monitored HRI across a work shift, recording core temperature, work rate (metabolic equivalents [METs]), and heart rate at one-minute intervals. Hydration status was assessed by changes in weight and blood osmolality. Personal data loggers and a weather station measured exposure to heat. Interviewer-administered questionnaires were used to collect demographic and occupational information. California farmworkers (n = 588) were assessed. Acceptable-quality data were obtained from 80% (core temperature) to 100% (weight change) of participants. Of the workers, 8.3% experienced a core body temperature of 38.5 °C or higher, and 11.8% experienced dehydration (lost more than 1.5% of body weight). This methodology is presented as the first comprehensive physiological assessment of HRI risk in California farmworkers.
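The dehydration criterion reported above (more than 1.5% of body weight lost across the shift) is simple arithmetic on the pre- and post-shift weights; a sketch, with illustrative weights:

```python
def pct_weight_loss(pre_kg, post_kg):
    """Percent of body weight lost between pre- and post-shift weighings."""
    return 100.0 * (pre_kg - post_kg) / pre_kg

def dehydrated(pre_kg, post_kg, threshold=1.5):
    """Dehydration flag per the study's criterion: more than 1.5%
    of body weight lost across the work shift."""
    return pct_weight_loss(pre_kg, post_kg) > threshold
```

A worker weighing in at 80.0 kg and out at 78.6 kg has lost 1.75% of body weight and is flagged; a loss from 80.0 kg to 79.5 kg (0.625%) is not.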
Cook, Sandra; Fillion, Lise; Fitch, Margaret; Veillette, Anne-Marie; Matheson, Tanya; Aubin, Michèle; de Serres, Marie; Doll, Richard; Rainville, François
2013-01-01
Fillion et al. (2012) recently designed a conceptual framework for professional cancer navigators describing key functions of professional cancer navigation. Building on this framework, this study defines the core areas of practice and associated competencies for professional cancer navigators. The methods used in this study included: literature review, mapping of navigation functions against practice standards and competencies, and validation of this mapping process with professional navigators, their managers and nursing experts and comparison of roles in similar navigation programs. Associated competencies were linked to the three identified core areas of practice, which are: 1) providing information and education, 2) providing emotional and supportive care, and 3) facilitating coordination and continuity of care. Cancer navigators are in a key position to improve patient and family empowerment and continuity of care. This is an important step for advancing the role of oncology nurses in navigator positions and identifying areas for further research.
NASA Technical Reports Server (NTRS)
Hultgren, Lennart S.
2012-01-01
This presentation is a technical summary of and outlook for NASA-internal and NASA-sponsored external research on core noise funded by the Fundamental Aeronautics Program Subsonic Fixed Wing (SFW) Project. Sections of the presentation cover: the SFW system-level noise metrics for the 2015 (N+1), 2020 (N+2), and 2025 (N+3) timeframes; SFW strategic thrusts and technical challenges; SFW advanced subsystems that are broadly applicable to N+3 vehicle concepts, with an indication where further noise research is needed; the components of core noise (compressor, combustor and turbine noise) and a rationale for NASA's current emphasis on the combustor-noise component; the increase in the relative importance of core noise due to turbofan design trends; the need to understand and mitigate core-noise sources for high-efficiency small gas generators; and the current research activities in the core-noise area, with additional details given about forthcoming updates to NASA's Aircraft Noise Prediction Program (ANOPP) core-noise prediction capabilities, two NRA efforts (Honeywell International, Phoenix, AZ and University of Illinois at Urbana-Champaign, respectively) to improve the understanding of core-noise sources and noise propagation through the engine core, and an effort to develop oxide/oxide ceramic-matrix-composite (CMC) liners for broadband noise attenuation suitable for turbofan-core application. Core noise must be addressed to ensure that the N+3 noise goals are met. Focused, but long-term, core-noise research is carried out to enable the advanced high-efficiency small gas-generator subsystem, common to several N+3 conceptual designs, needed to meet NASA's technical challenges. Intermediate updates to prediction tools are implemented as the understanding of the source structure and engine-internal propagation effects is improved. The NASA Fundamental Aeronautics Program has the principal objective of overcoming today's national challenges in air transportation. 
The SFW Quiet-Aircraft Subproject aims to develop concepts and technologies to reduce perceived community noise attributable to aircraft with minimal impact on weight and performance. This reduction of aircraft noise is critical to enabling the anticipated large increase in future air traffic.
Szymanski, Jacek; Wilson, David L; Zhang, Guo-Qiang
2009-10-01
The rapid expansion of biomedical research has brought substantial scientific and administrative data management challenges to modern core facilities. Scientifically, a core facility must be able to manage experimental workflow and the corresponding set of large and complex scientific data. It must also disseminate experimental data to relevant researchers in a secure and expedient manner that facilitates collaboration and provides support for data interpretation and analysis. Administratively, a core facility must be able to manage the scheduling of its equipment and to maintain a flexible and effective billing system to track material, resource, and personnel costs and charge for services to sustain its operation. It must also have the ability to regularly monitor the usage and performance of its equipment and to provide summary statistics on resources spent on different categories of research. To address these informatics challenges, we introduce a comprehensive system called MIMI (multimodality, multiresource, information integration environment) that integrates the administrative and scientific support of a core facility into a single web-based environment. We report the design, development, and deployment experience of a baseline MIMI system at an imaging core facility and discuss the general applicability of such a system in other types of core facilities. These initial results suggest that MIMI will be a unique, cost-effective approach to addressing the informatics infrastructure needs of core facilities and similar research laboratories.
Name It! Store It! Protect It!: A Systems Approach to Managing Data in Research Core Facilities.
DeVries, Matthew; Fenchel, Matthew; Fogarty, R E; Kim, Byong-Do; Timmons, Daniel; White, A Nicole
2017-12-01
As the capabilities of technology increase, so do the production of data and the need for data management. The need for data storage at many academic institutions is increasing exponentially. Technology is expanding rapidly, and institutions are recognizing the need to incorporate data management that can support future data sharing as a critical component of institutional services. Establishing a process to manage the surge in data storage is complex and often hindered by the lack of a plan. Simple file naming (nomenclature) is also becoming ever more important for conveying an established understanding of a file's contents, especially as research laboratories experience turnover in projects and personnel. Consistent indexing of files also helps to identify past work. Finally, protecting data contents is becoming increasingly challenging. As the genomic field expands and medicine becomes more personalized, methods to protect the contents of data in both short- and long-term storage need to be established so as not to risk revealing identifiable information. This is often something we do not consider in a nonclinical research environment. Establishing basic guidelines is critical for institutions, as individual research laboratories are unable to handle the scope of data storage required for their own research. In addition to the immediate needs for guidelines on data storage, file naming, and information protection, specialized support for data management in research cores and laboratories is becoming a critical component of institutional services. Here, we outline some case studies and methods that you may be able to adopt at your own institution.
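A nomenclature like the one argued for above can be enforced in code rather than by convention alone. The date-first pattern below (YYYYMMDD_project_instrument_sample.ext) is a hypothetical convention for illustration, not a standard from the text:

```python
from datetime import date
import re

def core_filename(project, instrument, sample, run_date=None, ext="csv"):
    """Build a standardized, date-sortable file name.

    Hypothetical convention (YYYYMMDD_project_instrument_sample.ext)
    used only to illustrate the idea; each institution should fix
    and document its own.
    """
    run_date = run_date or date.today()
    parts = [run_date.strftime("%Y%m%d"), project, instrument, sample]
    # Replace unsafe characters so names stay portable and searchable.
    safe = [re.sub(r"[^A-Za-z0-9-]+", "-", str(p)).strip("-") for p in parts]
    return "_".join(safe) + "." + ext
```

Putting the date first makes a plain alphabetical directory listing chronological, and sanitizing the fields keeps names consistent across operating systems and shell tools.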
Simultaneous Neutron and X-ray Tomography for Quantitative analysis of Geological Samples
NASA Astrophysics Data System (ADS)
LaManna, J.; Hussey, D. S.; Baltic, E.; Jacobson, D. L.
2016-12-01
Multiphase flow is a critical area of research for shale gas, oil recovery, underground CO2 sequestration, geothermal power, and aquifer management. It is critical to understand the porous structure of the geological formations in addition to the fluid/pore and fluid/fluid interactions. The difficulties in analyzing the flow characteristics of rock cores lie in obtaining 3D information on the fluid distribution while maintaining the cores in a state suitable for other analysis methods. Two powerful non-destructive methods for obtaining 3D structural and compositional information are X-ray and neutron tomography. X-ray tomography provides information on density and structure, while neutrons excel at imaging the liquid phase and provide compositional information. The two methods offer strongly complementary information but are typically applied at separate times and often at different facilities, which poses problems for obtaining dynamic and stochastic information, as the sample changes between analysis modes. To address this, NIST has developed a system that allows multimodal, simultaneous tomography with thermal neutrons and X-rays by placing a 90 keVp micro-focus X-ray tube at 90° to the neutron beam. High-pressure core holders that simulate underground conditions have been developed to facilitate simultaneous tomography; these cells allow control of confining pressure, axial load, temperature, and fluid flow through the core. This talk will give an overview of the simultaneous neutron and X-ray tomography capabilities at NIST, the benefits of multimodal imaging, environmental equipment for geology studies, and several case studies conducted at NIST.
Efficiency of static core turn-off in a system-on-a-chip with variation
Cher, Chen-Yong; Coteus, Paul W; Gara, Alan; Kursun, Eren; Paulsen, David P; Schuelke, Brian A; Sheets, II, John E; Tian, Shurong
2013-10-29
A processor-implemented method for improving efficiency of a static core turn-off in a multi-core processor with variation, the method comprising: conducting via a simulation a turn-off analysis of the multi-core processor at the multi-core processor's design stage, wherein the turn-off analysis of the multi-core processor at the multi-core processor's design stage includes a first output corresponding to a first multi-core processor core to turn off; conducting a turn-off analysis of the multi-core processor at the multi-core processor's testing stage, wherein the turn-off analysis of the multi-core processor at the multi-core processor's testing stage includes a second output corresponding to a second multi-core processor core to turn off; comparing the first output and the second output to determine if the first output is referring to the same core to turn off as the second output; outputting a third output corresponding to the first multi-core processor core if the first output and the second output are both referring to the same core to turn off.
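The comparison step of the claim can be sketched as follows. The fallback behavior when the two analyses disagree is an assumption of this sketch, since the claim text only specifies the matching case:

```python
def select_core_to_turn_off(design_stage_core: int, test_stage_core: int):
    """Compare the design-stage and test-stage turn-off analyses.

    If both analyses point at the same core, that core is confirmed for
    static turn-off. The claim does not specify what happens on a
    disagreement, so returning None here is an assumption.
    """
    if design_stage_core == test_stage_core:
        return design_stage_core
    return None
```

The point of the two-stage check is that per-die process variation, observed only at test time, can overturn a turn-off choice that looked optimal at design time.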
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Yang-Ki; Haskew, Timothy; Myryasov, Oleg
2014-06-05
The research we conducted focuses on rare-earth (RE)-free permanent magnets, achieved by modeling, simulating, and synthesizing exchange-coupled two-phase (hard/soft) RE-free core-shell nano-structured magnets. The RE-free magnets consist of magnetically hard core materials (high-anisotropy materials including Mn-Bi-X and M-type hexaferrite) coated with soft shell materials (high-magnetization materials including Fe-Co or Co). Our research therefore helps in understanding the exchange coupling conditions of core/shell magnets, the interface exchange behavior between core and shell materials, the formation mechanism of core/shell structures, the stability conditions of core and shell materials, etc.
The core role of the nurse practitioner: practice, professionalism and clinical leadership.
Carryer, Jenny; Gardner, Glenn; Dunn, Sandra; Gardner, Anne
2007-10-01
To draw on empirical evidence to illustrate the core role of nurse practitioners in Australia and New Zealand. Enacted legislation provides for mutual recognition of qualifications, including nursing, between New Zealand and Australia. As the nurse practitioner role is relatively new in both countries, there is no consistency in role expectation and hence mutual recognition has not yet been applied to nurse practitioners. A study jointly commissioned by both countries' Regulatory Boards developed information on the core role of the nurse practitioner, to develop shared competency and educational standards. Reporting on this study's process and outcomes provides insights that are relevant both locally and internationally. This interpretive study used multiple data sources, including published and grey literature, policy documents, nurse practitioner program curricula and interviews with 15 nurse practitioners from the two countries. Data were analysed according to the appropriate standard for each data type and included both deductive and inductive methods. The data were aggregated thematically according to patterns within and across the interview and material data. The core role of the nurse practitioner was identified as having three components: dynamic practice, professional efficacy and clinical leadership. Nurse practitioner practice is dynamic and involves the application of high level clinical knowledge and skills in a wide range of contexts. The nurse practitioner demonstrates professional efficacy, enhanced by an extended range of autonomy that includes legislated privileges. The nurse practitioner is a clinical leader with a readiness and an obligation to advocate for their client base and their profession at the systems level of health care. A clearly articulated and research informed description of the core role of the nurse practitioner provides the basis for development of educational and practice competency standards. 
These research findings provide new perspectives to inform the international debate about this extended level of nursing practice. The findings from this research have the potential to achieve a standardised approach and internationally consistent nomenclature for the nurse practitioner role.
NASA Astrophysics Data System (ADS)
Blot, R.; Nedelec, P.; Petetin, H.; Thouret, V.; Cohen, Y.
2017-12-01
The In-Service Aircraft for a Global Observing System (IAGOS; http://www.iagos.org) is a European Research Infrastructure that provides cost-effective global atmospheric composition measurements at high resolution using commercial passenger aircraft. It is the continuation of the MOZAIC (1994-2014) and CARIBIC (since 1997) programs, which provided a unique scientific database using 6 aircraft operated by European airlines over two decades. Thanks to the growing interest of several international airlines in contributing to academic climate research, the IAGOS aircraft fleet (started in 2011), with the IAGOS-CORE basic instrumentation, has expanded to 9 Airbus A340/A330 aircraft to date. Here, we present the IAGOS-CORE instrumentation, which continuously samples carbon monoxide, ozone, water vapor and cloud droplets. We focus on the carbon monoxide and ozone measurements, which are performed by optimized but well-known methods such as UV absorption and IR correlation. We describe the data processing/validation and the data quality control. With already more than 20 and 15 years of continuous ozone and carbon monoxide measurements, respectively, the IAGOS/MOZAIC data are particularly suitable for climatologies and trends. Also, since the commercial aircraft operate daily, the near-real-time IAGOS-CORE data are used to observe pollution plumes and to validate air-quality models as well as satellite products.
Overview of JSPS Core-to-Core Program: Forming Research and Educational Hubs of Medical Physics.
Koizumi, Masahiko; Takashina, Masaaki
To foster medical physicists, we introduce the achievements we have made since 2011 under the national research project of the Japan Society for the Promotion of Science (JSPS) Core-to-Core program, 'Forming Research and Educational Hubs of Medical Physics.' On this basis and under the JSPS program, we promoted research and educational exchange with Indiana University (IU) in the USA, the University of Groningen (UG) in the Netherlands, and other cooperating institutions such as the University of Minnesota (UM). A total of 23 students and researchers were sent abroad; UG accepted the most among the three institutions. In turn, 12 foreign researchers, including postdoctoral fellows, came to Japan for academic seminars or educational lectures. Fifteen international seminars were held: 8 in Japan, 4 in the USA, and 3 in the Netherlands. Many achievements resulted from these activities over 5 years: a total of 23 research topics were presented at international conferences, and a total of 12 articles were published in international journals. This program clearly promoted the establishment of international collaboration, and many young researchers and graduate students were exchanged and collaborated with foreign researchers.
Characterizing Facesheet/Core Disbonding in Honeycomb Core Sandwich Structure
NASA Technical Reports Server (NTRS)
Rinker, Martin; Ratcliffe, James G.; Adams, Daniel O.; Krueger, Ronald
2013-01-01
Results are presented from an experimental investigation into facesheet/core disbonding in carbon fiber reinforced plastic/Nomex honeycomb sandwich structures using a Single Cantilever Beam test. Specimens with three-, six- and twelve-ply facesheets were tested, as were specimens with honeycomb cores of four different cell sizes and specimens of three different widths. Three different data reduction methods were employed for computing apparent fracture toughness values from the test data: an area method, a compliance calibration technique and a modified beam theory method. The compliance calibration and modified beam theory approaches yielded comparable apparent fracture toughness values, which were generally lower than those computed using the area method. Disbonding in the three-ply facesheet specimens took place at the facesheet/core interface and yielded the lowest apparent fracture toughness values. Disbonding in the six- and twelve-ply facesheet specimens took place within the core, near the facesheet/core interface. Specimen width was not found to have a significant effect on apparent fracture toughness. The amount of scatter in the apparent fracture toughness data was found to increase with honeycomb core cell size.
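The compliance calibration approach named above typically rests on the Irwin-Kies relation, a standard fracture-mechanics result (the abstract itself does not state the formulas, so this is contextual background):

```latex
G_c = \frac{P_c^2}{2b}\,\frac{dC}{da}
```

where $P_c$ is the critical load at disbond advance, $b$ the specimen width, $C = \delta / P$ the measured specimen compliance, and $a$ the disbond length. The area method instead divides the energy dissipated between two load-unload cycles by the newly created disbond area, $G = \Delta U / (b\,\Delta a)$, which helps explain why the two approaches can yield systematically different apparent toughness values.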
Shaping Core Health Messages: Rural, Low-Income Mothers Speak Through Participatory Action Research.
Mammen, Sheila; Sano, Yoshie; Braun, Bonnie; Maring, Elisabeth Fost
2018-04-23
Rural, low-income families are disproportionately impacted by health problems owing to structural barriers (e.g., transportation, health insurance coverage) and personal barriers (e.g., health literacy). This paper presents a Participatory Action Research (PAR) model of co-created Core Health Messages (CHMs) in the areas of dental health, food security, health insurance, and physical activity. The research project engaged a multi-disciplinary team of experts to design initial health messages; rural, low-income mothers to respond to, and co-create, health messages; and stakeholders who work with families to share their insights. Findings reveal the perceptions of mothers and community stakeholders regarding messages and channels of message dissemination. By using PAR, a learner engagement approach, the researchers intend to increase the likelihood that the CHMs are culturally appropriate and relevant to specific populations. The CHM-PAR model visually illustrates an interactive, iterative process of health message generation and testing. The paper concludes with implications for future research and outreach in a technological landscape where dissemination channels are dynamic. This paper provides a model for researchers and health educators to co-create messages in a desired format (e.g., length, voice, level of empathy, tone) preferred by their audiences and to examine dissemination methods that will best reach those audiences.
Scheduler for multiprocessor system switch with selective pairing
Gara, Alan; Gschwind, Michael Karl; Salapura, Valentina
2015-01-06
System, method and computer program product for scheduling threads in a multiprocessing system with selective pairing of processor cores for increased processing reliability. A selective pairing facility is provided that selectively connects, i.e., pairs, multiple microprocessor or processor cores to provide one highly reliable thread (or thread group). The method configures the selective pairing facility to use checking to provide one highly reliable thread for high reliability, allocating such threads to the processor cores that indicate a need for hardware checking. The method likewise configures the selective pairing facility to provide multiple independent cores, allocating to them the threads that indicate inherent resilience.
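A toy allocation in the spirit of the abstract (not the patented logic, whose details are in the claims) might look like this; the thread names and core numbering are illustrative only:

```python
def schedule(threads, free_cores):
    """Toy selective-pairing allocator (a sketch, assuming a flat core list).

    A thread flagged as needing hardware checking is bound to a *pair* of
    cores that run it redundantly and cross-check results; other threads
    each get a single independent core.
    """
    cores = list(free_cores)
    allocation = {}
    for name, needs_checking in threads:
        n = 2 if needs_checking else 1
        if len(cores) < n:
            raise RuntimeError("not enough cores for thread %r" % name)
        allocation[name] = tuple(cores[:n])  # pair for checked, single otherwise
        del cores[:n]
    return allocation
```

The trade-off the abstract describes is visible here: every checked thread halves the effective core count, so pairing is applied selectively rather than globally.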
Sonochemical Synthesis of Zinc Oxide Nanostructures for Sensing and Energy Harvesting
NASA Astrophysics Data System (ADS)
Vabbina, Phani Kiran
Semiconductor nanostructures have attracted considerable research interest due to their unique physical and chemical properties at the nanoscale, which open new frontiers for applications in electronics and sensing. Zinc oxide nanostructures, with a wide range of applications, especially in optoelectronic devices and biosensing, have been the focus of research over the past few decades. However, ZnO nanostructures have failed to penetrate the market as they were expected to a few years ago. The two bottlenecks widely recognized for ZnO nanostructures are (1) the lack of a synthesis technique that is fast, economical, and environmentally benign and that allows growth on arbitrary substrates, and (2) the difficulty of producing stable p-type doping. The main objective of this research is to address these two bottlenecks and find a solution that is inexpensive, environmentally benign and CMOS compatible. To achieve this, we developed a sonochemical method to synthesize 1D ZnO nanorods, core-shell nanorods, 2D nanowalls and nanoflakes on arbitrary substrates; the method is rapid, inexpensive, CMOS compatible and environmentally benign, and allows us to grow ZnO nanostructures on any substrate at ambient conditions, whereas most other popular methods are either very slow or involve extreme conditions such as high temperatures and low pressures. Stable, reproducible p-type doping in ZnO is one of the most sought-after goals in the field of optoelectronics. In this project, we doped ZnO nanostructures with the sonochemical method to achieve stable and reproducible doping, and we fabricated a homogeneous ZnO radial p-n junction for UV detection by growing a p-type shell around an n-type core in a controlled way. ZnO has a wide range of applications, from sensing to energy harvesting.
In this work, we demonstrate the successful fabrication of an electrochemical immunosensor using ZnO nanoflakes to detect Cortisol and compare their performance with that of ZnO nanorods. We have explored the use of ZnO nanorods in energy harvesting in the form of Dye Sensitized Solar Cells (DSSC) and Perovskite Solar Cells.
Core-to-core uniformity improvement in multi-core fiber Bragg gratings
NASA Astrophysics Data System (ADS)
Lindley, Emma; Min, Seong-Sik; Leon-Saval, Sergio; Cvetojevic, Nick; Jovanovic, Nemanja; Bland-Hawthorn, Joss; Lawrence, Jon; Gris-Sanchez, Itandehui; Birks, Tim; Haynes, Roger; Haynes, Dionne
2014-07-01
Multi-core fiber Bragg gratings (MCFBGs) will be a valuable tool not only in communications but also various astronomical, sensing and industry applications. In this paper we address some of the technical challenges of fabricating effective multi-core gratings by simulating improvements to the writing method. These methods allow a system designed for inscribing single-core fibers to cope with MCFBG fabrication with only minor, passive changes to the writing process. Using a capillary tube that was polished on one side, the field entering the fiber was flattened which improved the coverage and uniformity of all cores.
Challenges for proteomics core facilities.
Lilley, Kathryn S; Deery, Michael J; Gatto, Laurent
2011-03-01
Many analytical techniques have been executed by core facilities established within academic, pharmaceutical and other industrial institutions. The centralization of such facilities ensures a level of expertise and hardware which often cannot be supported by individual laboratories. The establishment of a core facility thus makes the technology available for multiple researchers in the same institution. Often, the services within the core facility are also opened out to researchers from other institutions, frequently with a fee being levied for the service provided. In the 1990s, with the onset of the age of genomics, there was an abundance of DNA analysis facilities, many of which have since disappeared from institutions and are now available through commercial sources. Ten years on, as proteomics was beginning to be utilized by many researchers, this technology found itself an ideal candidate for being placed within a core facility. We discuss what in our view are the daily challenges of proteomics core facilities. We also examine the potential unmet needs of the proteomics core facility that may also be applicable to proteomics laboratories which do not function as core facilities. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Stenius, Kerstin; Ramstedt, Mats; Olsson, Börje
2010-03-01
The Centre for Social Research on Alcohol and Drugs (SoRAD) was established as a national research centre and department within the Faculty of Social Science at Stockholm University in 1997, following a Government Report, with the aim of strengthening social alcohol and drug research. Initially, core funding came from the Swedish Council for Working Life and Social Research and from the Ministry of Health and Social Affairs for several long-term projects. Today, SoRAD, with 25 senior and junior researchers, has core funding from the university, but most of its funding comes from external national and international grants. Research is organized under three themes: consumption, problems and norms; alcohol and drug policy and societal reactions; and treatment and recovery processes. SoRAD's scientific approach (multi-disciplinarity, a mix of qualitative and quantitative methods, and international comparisons) was established by the centre's first leader, Robin Room. Regular internal seminars are held, and young researchers are encouraged to attend scientific meetings and take part in collaborative projects. SoRAD researchers produce government-funded monthly statistics on alcohol consumption and purchase, and take part in various national government committees, but SoRAD's research has no clear political or bureaucratic constraints. One of the future challenges for SoRAD will be the proposed system for university grant allocation, in which applied social science will have difficulty competing with basic biomedical research if decisions are based on publication and citation measures.
Core dynamics and the nutations of the Earth.
NASA Astrophysics Data System (ADS)
Dehant, V. M. A.; Laguerre, R.; Rekier, J.; Rivoldini, A.; Trinh, A.; Triana, A. S.; Van Hoolst, T.; Zhu, P.
2016-12-01
We here present an overview of recent activities within the project RotaNut - Rotation and Nutation of a Wobbly Earth, funded by an ERC Advanced Grant from the European Research Council. We have recomputed the Basic Earth Parameters from recent VLBI series and interpret them in terms of the physics of the Earth's deep interior. This includes updates of the nutational constraints on the Earth's internal magnetic field and inner core viscosity, as well as of the coupling constants at the core-mantle boundary (CMB) and the inner core boundary (ICB). We have explored, on simplified Earth models, the interactions between rotational and gravito-inertial modes. With the help of numerical simulations, we have also addressed the coupling between the global rotation and the inertial waves in the fluid core through parametric instabilities. Special interest has been given to the influence of the inner core on the stability properties of the liquid core and to large-scale structure formation in the turbulent flow through an inverse cascade of energy. The role of precession and nutation forcing for the liquid core is characterized, as well as the interaction between the Free Core Nutation (called the tilt-over mode in the fluid core community) and the inertial waves. This research represents the first steps in the project RotaNut, financed by the European Research Council under ERC Advanced Grant 670874 for 2015-2020.
2015-06-18
Approved for public release; distribution is unlimited. Micro-Photoluminescence (micro-PL) Study of Core-Shell GaAs/GaAsSb Nanowires Grown by Self-Assisted Molecular... U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Keywords: GaAsSb, core-shell nanowires, micro-photoluminescence. University, 1601 East Market Street, Greensboro, NC 27411-0001.
Core Cutting Test with Vertical Rock Cutting Rig (VRCR)
NASA Astrophysics Data System (ADS)
Yasar, Serdar; Osman Yilmaz, Ali
2017-12-01
Roadheaders are frequently used machines in mining and tunnelling, and performance prediction of roadheaders is important for project economics and stability. Several methods have been proposed for this purpose, and rock cutting tests are the best choice. Rock cutting tests are generally divided into two groups, namely full-scale rock cutting tests and small-scale rock cutting tests, each with its own strengths and deficiencies. However, in many cases where rock sampling becomes problematic, the small-scale rock cutting test (core cutting test) is preferred for performance prediction, since small block samples and core samples can be subjected to rock cutting testing. A common problem with rock cutting tests is that they are available in only a very limited number of research centres. In this study, a new mobile rock cutting testing device, the vertical rock cutting rig (VRCR), is introduced. The standard testing procedure was conducted on seven rock samples that were part of a former study on cutting rocks with another small-scale rock cutting test. Results showed that the core cutting test can be realized successfully with the VRCR, as validated with a paired-samples t-test.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luther, Erik; Rooyen, Isabella van; Leckie, Rafael
2015-03-01
In an effort to explore fuel systems that are more robust under accident scenarios, the DOE-NE has identified the need to resume transient testing. The Transient Reactor Test (TREAT) facility has been identified as the preferred option for the resumption of transient testing of nuclear fuel in the United States. In parallel, NNSA's Global Threat Reduction Initiative (GTRI) Convert program is exploring what is needed to replace the existing highly enriched uranium (HEU) core with a low enriched uranium (LEU) core. In order to construct a new LEU core, materials and fabrication processes similar to those used in the initial core fabrication must be identified, developed and characterized. In this research, graphite matrix fuel blocks were extruded and their materials properties were measured. Initially the extrusion process followed the historic route; however, the project was expanded to explore methods to increase the graphite content of the fuel blocks and to explore modern resins. Materials properties relevant to fuel performance, including density, heat capacity and thermal diffusivity, were measured. The relationship between process defects and materials properties will be discussed.
McFARLIN, Brian K; Breslin, Whitney L; Carpenter, Katie C; Strohacker, Kelley; Weintraub, Randi J
2010-01-01
Today's students have unique learning needs and lack knowledge of core research skills. In this program report, we describe an online approach that we developed to teach core research skills to freshman and sophomore undergraduates. Specifically, we used two undergraduate kinesiology (KIN) courses, one designed to target students throughout campus (KIN1304: Public Health Issues in Physical Activity and Obesity) and one targeting kinesiology majors specifically (KIN1252: Foundations of Kinesiology). Our program was developed and validated at the 2nd-largest ethnically diverse research university in the United States; thus, we believe that it would be effective in a variety of student populations.
Consensus classification of posterior cortical atrophy
Crutch, Sebastian J.; Schott, Jonathan M.; Rabinovici, Gil D.; Murray, Melissa; Snowden, Julie S.; van der Flier, Wiesje M.; Dickerson, Bradford C.; Vandenberghe, Rik; Ahmed, Samrah; Bak, Thomas H.; Boeve, Bradley F.; Butler, Christopher; Cappa, Stefano F.; Ceccaldi, Mathieu; de Souza, Leonardo Cruz; Dubois, Bruno; Felician, Olivier; Galasko, Douglas; Graff-Radford, Jonathan; Graff-Radford, Neill R.; Hof, Patrick R.; Krolak-Salmon, Pierre; Lehmann, Manja; Magnin, Eloi; Mendez, Mario F.; Nestor, Peter J.; Onyike, Chiadi U.; Pelak, Victoria S.; Pijnenburg, Yolande; Primativo, Silvia; Rossor, Martin N.; Ryan, Natalie S.; Scheltens, Philip; Shakespeare, Timothy J.; González, Aida Suárez; Tang-Wai, David F.; Yong, Keir X. X.; Carrillo, Maria; Fox, Nick C.
2017-01-01
Introduction A classification framework for posterior cortical atrophy (PCA) is proposed to improve the uniformity of definition of the syndrome in a variety of research settings. Methods Consensus statements about PCA were developed through a detailed literature review, the formation of an international multidisciplinary working party which convened on four occasions, and a Web-based quantitative survey regarding symptom frequency and the conceptualization of PCA. Results A three-level classification framework for PCA is described comprising both syndrome- and disease-level descriptions. Classification level 1 (PCA) defines the core clinical, cognitive, and neuroimaging features and exclusion criteria of the clinico-radiological syndrome. Classification level 2 (PCA-pure, PCA-plus) establishes whether, in addition to the core PCA syndrome, the core features of any other neurodegenerative syndromes are present. Classification level 3 (PCA attributable to AD [PCA-AD], Lewy body disease [PCA-LBD], corticobasal degeneration [PCA-CBD], prion disease [PCA-prion]) provides a more formal determination of the underlying cause of the PCA syndrome, based on available pathophysiological biomarker evidence. The issue of additional syndrome-level descriptors is discussed in relation to the challenges of defining stages of syndrome severity and characterizing phenotypic heterogeneity within the PCA spectrum. Discussion There was strong agreement regarding the definition of the core clinico-radiological syndrome, meaning that the current consensus statement should be regarded as a refinement, development, and extension of previous single-center PCA criteria rather than any wholesale alteration or redescription of the syndrome. 
The framework and terminology may facilitate the interpretation of research data across studies, be applicable across a broad range of research scenarios (e.g., behavioral interventions, pharmacological trials), and provide a foundation for future collaborative work. PMID:28259709
Dumbaugh, Mari; Bapolisi, Wyvine; van de Weerd, Jennie; Zabiti, Michel; Mommers, Paula; Balaluka, Ghislain Bisimwa; Merten, Sonja
2017-07-03
In this protocol we describe a mixed methods study in the province of South Kivu, Democratic Republic of Congo evaluating the effectiveness of different demand side strategies to increase maternal health service utilization and the practice of birth spacing. Conditional service subsidization, conditional cash transfers and non-monetary incentives aim to encourage women to use maternal health services and practice birth spacing in two different health districts. Our methodology will comparatively evaluate the effectiveness of different approaches against each other and no intervention. This study comprises four main research activities: 1) Formative qualitative research to determine feasibility of planned activities and inform development of the quantitative survey; 2) A community-based, longitudinal survey; 3) A retrospective review of health facility records; 4) Qualitative exploration of intervention acceptability and emergent themes through in-depth interviews with program participants, non-participants, their partners and health providers. Female community health workers are engaged as core members of the research team, working in tandem with female survey teams to identify women in the community who meet eligibility criteria. Female community health workers also act as key informants and community entry points during methods design and qualitative exploration. Main study outcomes are completion of antenatal care, institutional delivery, practice of birth spacing, family planning uptake and intervention acceptability in the communities. Qualitative methods also explore decision making around maternal health service use, fertility preference and perceptions of family planning. The innovative mixed methods design allows quantitative data to inform the relationships and phenomena to be explored in qualitative collection. In turn, qualitative findings will be triangulated with quantitative findings. 
Inspired by the principles of grounded theory, qualitative analysis will begin while data collection is ongoing. This "conversation" between quantitative and qualitative data will result in a more holistic, context-specific exploration and understanding of research topics, including the mechanisms through which the interventions are or are not effective. In addition, engagement of female community health workers as core members of the research team roots research methods in the realities of the community and provides teams with key informants who are simultaneously implicated in the health system, community and target population.
Experimental investigations of a uranium plasma pertinent to a self-sustaining plasma source
NASA Technical Reports Server (NTRS)
Schneider, R. T.
1971-01-01
The research is pertinent to the realization of a self-sustained fissioning plasma for applications such as nuclear propulsion, closed cycle MHD power generation using a plasma core reactor, and heat engines such as the nuclear piston engine, as well as the direct conversion of fission energy into optical radiation (nuclear pumped lasers). Diagnostic measurement methods and experimental devices simulating plasma core reactor conditions are discussed. Studies on the following topics are considered: (1) ballistic piston compressor (U-235); (2) high pressure uranium plasma (natural uranium); (3) sliding spark discharge (natural uranium); (4) fission fragment interaction (He-3 and U-235); and (5) nuclear pumped lasers (He-3 and U-235).
Incompleteness of Bluetooth protocol conformance test cases
NASA Astrophysics Data System (ADS)
Wu, Peng; Gao, Qiang
2001-10-01
This paper describes a formal method for verifying the completeness of conformance testing, in which not only the Implementation Under Test (IUT) but also the conformance tester is formalized in SDL, so that conformance testing can be performed in the simulator provided with a CASE tool. The protocol set considered is Bluetooth, an open wireless communication technology. Our research results show that the Bluetooth conformance test specification is not complete, in that it has only limited coverage and many important capabilities defined in the Bluetooth core specification are not tested. We also give a detailed report on the missing test cases against the Bluetooth core specification, and provide a guide for further test case generation in the future.
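The coverage argument can be illustrated with a trivial gap check; the capability names below are placeholders, not entries from the actual Bluetooth test specification:

```python
def coverage_gap(spec_capabilities, tested_capabilities):
    """Report which capabilities defined in the core specification are
    never exercised by the conformance test suite, plus the covered ratio.
    (Illustrative sketch; real coverage analysis works on formalized
    SDL models rather than flat name sets.)
    """
    spec = set(spec_capabilities)
    missing = sorted(spec - set(tested_capabilities))
    ratio = 1 - len(missing) / len(spec)
    return missing, ratio
```

Formalizing both the IUT and the tester makes such a comparison mechanical: every capability reachable in the specification model can be checked against the behaviors the tester actually drives.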
ERIC Educational Resources Information Center
Bain, Lisa Z.
2012-01-01
There are many different delivery methods used by institutions of higher education. These include traditional, hybrid, and online course offerings. The comparisons of these typically use final grade as the measure of student performance. This research study looks behind the final grade and compares student performance by assessment type, core…
Space-frame connection for small-diameter round timber
Ronald W. Wolfe; Agron E. Gjinolli; John R. King
2000-01-01
To promote more efficient use of small-diameter timber, research efforts are being focused on the development and evaluation of connection methods that can be easily applied to non-standard round wood profiles. This report summarizes an evaluation of a "dowel-nut" connection as an option for the use of Douglas-fir peeler cores in three-dimensional truss or "space-...
ERIC Educational Resources Information Center
Marttila, Katie L.
2017-01-01
For the realm of this study, the researcher reviewed two separate mathematics programs that have been implemented within the school district to address both the needs of the students with learning disabilities and the requirements of the local and state assessments. The mathematics programs are designed with two different methods to meet the…
ERIC Educational Resources Information Center
Dósa, Katalin; Russ, Rosemary
2016-01-01
Learning in higher education today is measured overwhelmingly on the basis of "correctness," that is, whether students sufficiently approached the preset "expert" answer to a test question. We posit that although conceptual correctness is at the core of good learning, there is much information instructors miss out on by relying…
ERIC Educational Resources Information Center
Kenedi, Gustave; Mountford-Zimdars, Anna
2018-01-01
Pro-Vice-Chancellors (PVCs) form the second-tier leadership of UK higher education institutions. However, their role and position remain under-theorised and under-researched. The present article explores the extent to which a PVC Education role requires core expertise in education or generic managerial skills. Using a mixed-methods approach, we…
Defining and quantifying the social phenotype in autism.
Klin, Ami; Jones, Warren; Schultz, Robert; Volkmar, Fred; Cohen, Donald
2002-06-01
Genetic and neurofunctional research in autism has highlighted the need for improved characterization of the core social disorder defining the broad spectrum of syndrome manifestations. This article reviews the advantages and limitations of current methods for the refinement and quantification of this highly heterogeneous social phenotype. The study of social visual pursuit by use of eye-tracking technology is offered as a paradigm for novel tools incorporating these requirements and as a research effort that builds on the emerging synergy of different branches of social neuroscience. Advances in the area will require increased consideration of processes underlying experimental results and a closer approximation of experimental methods to the naturalistic demands inherent in real-life social situations.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-14
..., or Partially-Exclusive Licensing of an Invention Concerning Method for Estimating Core Body... Serial No. 61/572,677, entitled "Method for Estimating Core Body Temperature from Heart Rate," filed on... core temperature from heart rate. The invention further relates to a method of determining impending...
Fonkwe, Merline L D; Trapp, Stefan
2016-08-01
This research examines the feasibility of analyzing tree cores to detect benzene, toluene, ethylbenzene, and m-, p-, o-xylene (BTEX) compounds and methyl tertiary-butyl ether (MTBE) in groundwater in eastern Canadian subarctic environments, using a former landfill site in the remote community of Happy Valley-Goose Bay, Labrador. Petroleum hydrocarbon contamination at the landfill site is the result of environmentally unsound pre-1990s disposal of household and industrial solid wastes. Tree cores were taken from trembling aspen, black spruce, and white birch and analyzed by headspace gas chromatography-mass spectrometry. BTEX compounds were detected in the tree cores, corroborating known groundwater contamination. A zone of anomalously high concentrations of total BTEX constituents was identified and recommended for monitoring by groundwater wells. Tree cores collected outside the landfill site, at a local control area, suggest the migration of contaminants off-site. Tree species exhibit different concentrations of BTEX constituents, indicating selective uptake and accumulation. Toluene exhibited the highest concentrations in wood, which may also be due in part to endogenous production. Meanwhile, MTBE was not found in the tree cores and is considered to be absent from the groundwater. The results demonstrate that tree-core analysis can be useful for detecting anomalous concentrations of petroleum hydrocarbons, such as BTEX compounds, at subarctic sites with shallow unconfined aquifers and permeable soils. This method can therefore aid in the proper management of contamination during landfill operations and after site closures.
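The identification of a zone of "anomalously high" total BTEX invites a simple screening rule. As a hedged sketch only (the paper does not state its anomaly criterion; the mean-plus-k-standard-deviations cutoff and all sample values below are assumptions):

```python
import statistics

# Hedged sketch: flag tree cores whose total-BTEX concentration exceeds
# the sample mean by more than n_sd standard deviations.  The threshold
# choice and the synthetic values are illustrative assumptions, not data
# or methodology from the study.

def anomalous_cores(total_btex_by_core, n_sd=2.0):
    """Return the cores whose value exceeds mean + n_sd * stdev."""
    values = list(total_btex_by_core.values())
    cutoff = statistics.mean(values) + n_sd * statistics.stdev(values)
    return {core: v for core, v in total_btex_by_core.items() if v > cutoff}

# Nine unremarkable cores and one clear outlier:
totals = {f"tree{i:02d}": 1.0 + 0.1 * i for i in range(9)}
totals["tree09"] = 12.0
print(anomalous_cores(totals))  # flags the outlier "tree09"
```

A real survey would map flagged cores spatially to delineate the anomalous zone before siting monitoring wells.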
Comparative effectiveness research.
Hirsch, J A; Schaefer, P W; Romero, J M; Rabinov, J D; Sanelli, P C; Manchikanti, L
2014-09-01
The goal of comparative effectiveness research is to improve health care while dealing with seemingly ever-rising costs. An understanding of comparative effectiveness research as a core topic is important for neuroradiologists, and it can be used in a variety of ways. Its aim is to compare alternative methods of managing a clinical condition while, ideally, improving the delivery of care. While the Patient-Centered Outcomes Research Institute is the most mature US-based foray into comparative effectiveness research, the approach has been used more robustly in decision-making in other countries for quite some time. The National Institute for Health and Clinical Excellence of the United Kingdom is a noteworthy example of comparative effectiveness research in action. © 2014 by American Journal of Neuroradiology.
Development of an open metadata schema for prospective clinical research (openPCR) in China.
Xu, W; Guan, Z; Sun, J; Wang, Z; Geng, Y
2014-01-01
In China, deployment of electronic data capture (EDC) and clinical data management systems (CDMS) for clinical research (CR) is at a very early stage, and about 90% of clinical studies collect and submit clinical data manually. This work aims to build an open metadata schema for Prospective Clinical Research (openPCR) in China based on openEHR archetypes, to help Chinese researchers easily create specific data entry templates for registration, study design, and clinical data collection. The Singapore Framework for Dublin Core Application Profiles (DCAP) is used to develop openPCR, and four steps are followed: defining the core functional requirements and deducing the core metadata items; developing archetype models; defining metadata terms and creating archetype records; and developing the implementation syntax. The core functional requirements are divided into three categories: requirements for research registration, requirements for trial design, and requirements for case report forms (CRFs). 74 metadata items are identified, and their Chinese authority names are created. The minimum metadata set of openPCR includes 3 documents, 6 sections, 26 top-level data groups, 32 lower-level data groups, and 74 data elements. The top-level container in openPCR is composed of public document, internal document, and clinical document archetypes. A hierarchical structure of openPCR is established according to the Data Structure of Electronic Health Record Architecture and Data Standard of China (Chinese EHR Standard). Metadata attributes are grouped into six parts: identification, definition, representation, relation, usage guides, and administration. OpenPCR is an open metadata schema based on research registration standards, standards of the Clinical Data Interchange Standards Consortium (CDISC), and Chinese healthcare-related standards, and is to be made publicly available throughout China.
It considers future integration of EHR and CR by adopting the data structure and data terms of the Chinese EHR Standard. Archetypes in openPCR are modular models and can be separated, recombined, and reused. The authors recommend that the method used to develop openPCR be referenced by other countries when designing metadata schemas for clinical research. In the next steps, openPCR should be used in a number of CR projects to test its applicability and to continuously improve its coverage. In addition, a metadata schema for research protocols can be developed to structure and standardize protocols, and syntactic interoperability of openPCR with other related standards can be considered.
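The documents, sections, data groups, and data elements nesting described above can be pictured with plain dictionaries. Only the three top-level document archetypes are taken from the abstract; every other name here is an invented placeholder:

```python
# Hedged sketch of the openPCR nesting: documents contain sections,
# sections contain data groups, and groups contain data elements.
# Only the three document names come from the abstract; all section,
# group, and element names are invented placeholders.

openpcr = {
    "public_document":   {"registration_section": {"study_id_group": ["registry_number"]}},
    "internal_document": {"design_section":       {"arm_group": ["arm_label", "sample_size"]}},
    "clinical_document": {"crf_section":          {"vital_signs_group": ["heart_rate"]}},
}

def count_elements(schema):
    """Total data elements across all documents, sections, and groups."""
    return sum(len(elems)
               for doc in schema.values()
               for sec in doc.values()
               for elems in sec.values())

print(count_elements(openpcr))  # 4
```

A full openPCR instance would carry 74 elements under 3 documents, as the abstract states; the same traversal would verify those counts.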
Status of the core and the mini core collections for the U.S. germplasm collection of peanut
USDA-ARS?s Scientific Manuscript database
To maximize their usefulness, core and mini core collections should be dynamic. The peanut core collection was developed in the early 1990s, and the mini core was developed in the late 1990s. Research has shown that these collections can be used to improve the efficiency and effectiveness of ide...
Flow Cytometry Technician | Center for Cancer Research
PROGRAM DESCRIPTION The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). KEY ROLES/RESPONSIBILITIES The Flow Cytometry Core (Flow Core) of the Cancer and Inflammation Program (CIP) is a service core which supports the research efforts of the CCR by providing expertise in the field of flow cytometry (using analyzers and sorters) with the goal of gaining a more thorough understanding of the biology of cancer and cancer cells. The Flow Core provides service to 12-15 CIP laboratories and more than 22 non-CIP laboratories. Flow Core staff provide technical advice on the experimental design of applications, which include immunological phenotyping, cell function assays, and cell cycle analysis. Work is performed per customer requirements, and no independent research is involved. The Flow Cytometry Technician will be responsible for:
- Monitoring performance of, and maintaining, high-dimensional flow cytometer analyzers and cell sorters
- Operating high-dimensional flow cytometer analyzers and cell sorters
- Monitoring lab supply levels, ordering lab supplies, and performing various record-keeping responsibilities
- Assisting in the training of scientific end users on the use of flow cytometry in their research, as well as on how to operate and troubleshoot the bench-top analyzer instruments
Experience with sterile technique and tissue culture is required.
How Do We Know This Works? An Overview of Research on Core Knowledge
ERIC Educational Resources Information Center
Core Knowledge Foundation, 2004
2004-01-01
Teachers, principals and parents often ask about the effectiveness of Core Knowledge. This article is meant to answer that question by providing a brief overview of some of the most recent and most relevant research on Core Knowledge. This document has been divided into two sections. The first section treats direct evidence; the second looks at…
User Policies | Center for Cancer Research
User Policies 1. Authorship and Acknowledgement: The SAXS Core facility is a CCR resource dedicated to CCR researchers, but we also make this resource accessible to non-CCR users free of charge. There are three ways to make use of the SAXS Core resource. Asking the SAXS Core staff to collect, process and analyze data, and jointly interpret data with your teams. Asking the
The Core Six: Essential Strategies for Achieving Excellence with the Common Core
ERIC Educational Resources Information Center
Silver, Harvey F.; Perini, Matthew J.; Dewing, R. Thomas
2012-01-01
If you already have a strong grasp on the Common Core and are eager to do something about it, this book's research-based strategies will help you respond to the demands of the new standards, particularly the English language arts standards that affect every subject area and grade level. Drawing from the research on which classroom strategies are…
Improved Thermoplastic/Iron-Particle Transformer Cores
NASA Technical Reports Server (NTRS)
Wincheski, Russell A.; Bryant, Robert G.; Namkung, Min
2004-01-01
A method of fabricating improved transformer cores from composites of thermoplastic matrices and iron particles has been invented. Relative to commercially available laminated-iron-alloy transformer cores, the cores fabricated by this method weigh less and are less expensive. Relative to prior polymer-matrix/iron-particle composite transformer cores, the cores fabricated by this method can be made mechanically stronger and more magnetically permeable. In addition, whereas some prior cores have exhibited significant eddy-current losses, the cores fabricated by this method exhibit very small eddy-current losses. The cores made by this method can be expected to be attractive for use in diverse applications, including high-signal-to-noise transformers, stepping motors, and high-frequency ignition coils. The present method is a product of an experimental study of the relationships among fabrication conditions, final densities of iron particles, and the mechanical and electromagnetic properties of fabricated cores. Among the fabrication conditions investigated were molding pressures (83, 104, and 131 MPa) and molding temperatures (250, 300, and 350 °C). Each block of core material was made by uniaxial-compression molding, at the applicable pressure/temperature combination, of a mixture of 2 weight percent of LaRC (or an equivalent high-temperature soluble thermoplastic adhesive) with 98 weight percent of approximately spherical iron particles having diameters in the micron range. Each molded block was cut into square-cross-section rods that were used as core specimens in mechanical and electromagnetic tests. Some of the core specimens were annealed at 900 °C and cooled slowly before testing. For comparison, a low-carbon-steel core was also tested. The results of the tests showed that density, hardness, and rupture strength generally increased with molding pressure and temperature, though the correlation was rather weak.
The weakness of the correlation was attributed to pores in the specimens. The maximum relative permeabilities of cores made without annealing ranged from 30 to 110, while those of cores made with annealing ranged from 900 to 1,400. However, the greater permeabilities of the annealed specimens were not associated with noticeably greater densities. The major practical result of the investigation was the discovery of an optimum distribution of iron-particle sizes: it was found that eddy-current losses in the molded cores were minimized by using 100-mesh iron particles (corresponding to particles with diameters less than or equal to 100 μm). The effect of optimizing particle sizes on eddy-current losses is depicted in the figure.
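The benefit of small particles is consistent with the classical scaling of eddy-current loss with the square of particle diameter. The sketch below uses the textbook per-unit-volume estimate for a conducting sphere in a sinusoidal field, P = π²B²f²d²/(20ρ); this formula and the numbers plugged in are illustrative assumptions, not the model or data from the study.

```python
import math

# Hedged illustration: classical per-unit-volume eddy-current loss in a
# conducting sphere of diameter d, resistivity rho, in a sinusoidal
# field of peak flux density B at frequency f:
#     P = (pi^2 * B^2 * f^2 * d^2) / (20 * rho)
# This is a textbook scaling argument, not the NASA study's model.

def eddy_loss_per_volume(b_peak_t, freq_hz, diameter_m, resistivity_ohm_m):
    """Eddy-current loss density (W/m^3) for a conducting sphere."""
    return (math.pi**2 * b_peak_t**2 * freq_hz**2 * diameter_m**2) / (20 * resistivity_ohm_m)

# Halving the particle diameter cuts the loss density fourfold
# (B = 1 T, f = 10 kHz, rho ~ 1e-7 ohm*m for iron, all assumed values):
loss_100um = eddy_loss_per_volume(1.0, 1e4, 100e-6, 1e-7)
loss_50um = eddy_loss_per_volume(1.0, 1e4, 50e-6, 1e-7)
print(loss_100um / loss_50um)  # 4.0
```

The d² dependence is why subdividing the core into insulated micron-scale particles suppresses eddy currents relative to bulk laminations.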
NASA Astrophysics Data System (ADS)
Campbell, S. W.; Williams, K.; Marston, L.; Kreutz, K. J.; Osterberg, E. C.; Wake, C. P.
2013-12-01
For the past six years, a multi-institution effort has undertaken a broad glaciological and climate research project in Denali National Park. Most recently, two ~208-m-long surface-to-bedrock ice cores were recovered from the Mt. Hunter plateau, with supporting geophysical and weather data collected. Twenty-two individuals have participated in the field program, providing thousands of person-hours toward completing our research goals. Technical and scientific results have been disseminated to the broader scientific community through dozens of professional presentations and six peer-reviewed publications. In addition, we have pursued the development of interactive computer applications that use our results for educational purposes, publicly available fact sheets through Denali National Park, and most recently, with assistance from PolarTREC and other affiliations, the development of a children's book and roll-out of a K-8 science curriculum based on this project. The K-8 curriculum will provide students with an opportunity to use real scientific data to meet their educational requirements through alternative, interactive, and exciting methods relative to more standard educational programs. Herein, we present examples of this diverse approach toward incorporating polar research into K-12 STEM classrooms.
Glauß, Benjamin; Steinmann, Wilhelm; Walter, Stephan; Beckers, Markus; Seide, Gunnar; Gries, Thomas; Roth, Georg
2013-01-01
This research describes the melt spinning of bicomponent fibers, consisting of a conductive polypropylene (PP) core and a piezoelectric sheath (polyvinylidene fluoride). The previously analyzed piezoelectric capabilities of polyvinylidene fluoride (PVDF) are to be exploited in sensor filaments. The PP compound contains 10 wt % carbon nanotubes (CNTs) and 2 wt % sodium stearate (NaSt). The sodium stearate is added to lower the viscosity of the melt. The compound constitutes the fiber core, which is conductive due to a percolating CNT network. The PVDF sheath's piezoelectric effect is based on the formation of an all-trans-conformation β phase, caused by draw-winding of the fibers. The core and sheath materials, as well as the bicomponent fibers, are characterized by different analytical methods. These include wide-angle X-ray diffraction (WAXD) to analyze parameters crucial for the development of a crystalline β phase. The distribution of CNTs in the polymer matrix, which affects the conductivity of the core, was investigated by transmission electron microscopy (TEM). Thermal characterization is carried out by conventional differential scanning calorimetry (DSC). Optical microscopy is used to determine the regularity of the fibers' diameters (core and sheath). The materials' viscosity is determined by rheometry. Eventually, an LCR tester is used to determine the core's specific resistance. PMID:28811400
The CompHP core competencies framework for health promotion in Europe.
Barry, Margaret M; Battel-Kirk, Barbara; Dempsey, Colette
2012-12-01
The CompHP Project on Developing Competencies and Professional Standards for Health Promotion in Europe was developed in response to the need for new and changing health promotion competencies to address health challenges. This article presents the process of developing the CompHP Core Competencies Framework for Health Promotion across the European Union Member States and Candidate Countries. A phased, multiple-method approach was employed to facilitate a consensus-building process on the development of the core competencies. Key stakeholders in European health promotion were engaged in a layered consultation process using the Delphi technique, online consultations, workshops, and focus groups. Based on an extensive literature review, a mapping process was used to identify the core domains, which informed the first draft of the Framework. A consultation process involving two rounds of a Delphi survey with national experts in health promotion from 30 countries was carried out. In addition, feedback was received from 25 health promotion leaders who participated in two focus groups at a pan-European level and 116 health promotion practitioners who engaged in four country-specific consultations. A further 54 respondents replied to online consultations, and there were a number of followers on various social media platforms. Based on four rounds of redrafting, the final Framework document was produced, consisting of 11 core domains and 68 core competency statements. The CompHP Core Competencies Framework for Health Promotion provides a resource for workforce development in Europe, by articulating the necessary knowledge, skills, and abilities that are required for effective practice. The core domains are based on the multidisciplinary concepts, theories, and research that make health promotion distinctive. 
It is the combined application of all the domains, the knowledge base, and the ethical values that constitute the CompHP Core Competencies Framework for Health Promotion.
Core-core and core-valence correlation
NASA Technical Reports Server (NTRS)
Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.
1988-01-01
The effect of (1s) core correlation on properties and energy separations was analyzed using full configuration-interaction (FCI) calculations. The Be ¹S-¹P, C ³P-⁵S, and CH⁺ ¹Σ⁺-¹Π separations, as well as the CH⁺ spectroscopic constants, dipole moment, and ¹Σ⁺-¹Π transition dipole moment, were studied. The results of the FCI calculations are compared to those obtained using approximate methods. In addition, the generation of atomic natural orbital (ANO) basis sets, as a method for contracting a primitive basis set for both valence and core correlation, is discussed. When both core-core and core-valence correlation are included in the calculation, no suitable truncated CI approach consistently reproduces the FCI, and contraction of the basis set is very difficult. If the (nearly constant) core-core correlation is eliminated, and only the core-valence correlation is included, CASSCF/MRCI approaches reproduce the FCI results and basis set contraction is significantly easier.
Core Certification of Data Repositories: Trustworthiness and Long-Term Stewardship
NASA Astrophysics Data System (ADS)
de Sherbinin, A. M.; Mokrane, M.; Hugo, W.; Sorvari, S.; Harrison, S.
2017-12-01
Scientific integrity and norms dictate that data created and used by scientists should be managed, curated, and archived in trustworthy data repositories, thus ensuring that science is verifiable and reproducible while preserving the initial investment in collecting data. Research stakeholders, including researchers, science funders, librarians, and publishers, must also be able to establish the trustworthiness of the data repositories they use, to confirm that the data they submit and use remain useful and meaningful in the long term. Data repositories are increasingly recognized as a key element of the global research infrastructure, and establishing their trustworthiness is a prerequisite for efficient scientific research and data sharing. The Core Trustworthy Data Repository Requirements are a set of universal requirements for certification of data repositories at the core level (see: https://goo.gl/PYsygW). They were developed by the ICSU World Data System (WDS: www.icsu-wds.org) and the Data Seal of Approval (DSA: www.datasealofapproval.org), the two authoritative organizations responsible for the development and implementation of this standard, which will be further developed under the CoreTrustSeal brand. CoreTrustSeal certification of data repositories involves a minimally intensive process whereby repositories supply evidence that they are sustainable and trustworthy. Repositories conduct a self-assessment, which is then reviewed by community peers. Based on this review, CoreTrustSeal certification is granted by the CoreTrustSeal Standards and Certification Board. Certification helps data communities (producers, repositories, and consumers) to improve the quality and transparency of their processes, and to increase awareness of and compliance with established standards.
This presentation will introduce the CoreTrustSeal certification requirements for repositories and offer an opportunity to discuss ways to improve the contribution of certified data repositories to sustain open data for open scientific research.
Flow Cytometry Scientist | Center for Cancer Research
PROGRAM DESCRIPTION The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). KEY ROLES/RESPONSIBILITIES The Flow Cytometry Core (Flow Core) in the Cancer and Inflammation Program (CIP) is a service core which supports the research efforts of the CCR by providing expertise in the field of flow cytometry (using analyzers and sorters) with the goal of gaining a more thorough understanding of the biology of the immune system, cancer, and inflammation processes. The Flow Core provides service to 12-15 CIP laboratories and more than 22 non-CIP laboratories. Flow core staff provide technical advice on the experimental design of applications, which include immunological phenotyping, cell function assays, and cell cycle analysis. Work is performed per customer requirements, and no independent research is involved. 
The Flow Cytometry Scientist will be responsible for:
- Daily management of the Flow Cytometry Core, including the supervision and guidance of technical staff members
- Monitoring performance of, and maintaining, high-dimensional flow cytometer analyzers and cell sorters
- Operating high-dimensional flow cytometer analyzers and cell sorters
- Providing scientific expertise to the user community and facilitating the development of cutting-edge technologies
- Interacting with Flow Core users and customers, and providing technical and scientific advice and guidance regarding their experiments, including possible collaborations
- Training staff and scientific end users on the use of flow cytometry in their research, and teaching them how to operate and troubleshoot the bench-top analyzer instruments
- Preparing and delivering lectures, as well as one-on-one training sessions, with customers/users
- Ensuring that protocols are up to date and appropriately adhered to
Experience with sterile technique and tissue culture is required.
The extinction law from photometric data: linear regression methods
NASA Astrophysics Data System (ADS)
Ascenso, J.; Lombardi, M.; Lada, C. J.; Alves, J.
2012-04-01
Context. The properties of dust grains, in particular their size distribution, are expected to differ between the interstellar medium and the high-density regions within molecular clouds. Since the extinction at near-infrared wavelengths is caused by dust, the extinction law in cores should depart from that found in low-density environments if the dust grains have different properties. Aims: We explore methods to measure the near-infrared extinction law produced by dense material in molecular cloud cores from photometric data. Methods: Using controlled sets of synthetic and semi-synthetic data, we test several methods of linear regression applied to the specific problem of deriving the extinction law from photometric data. We cover the parameter space appropriate to this type of observation. Results: We find that many of the common linear-regression methods produce biased results when used to derive the extinction law from photometric colors. We propose and validate a new method, LinES, as the most reliable for this purpose. We explore the use of this method to detect whether or not the extinction law of a given reddened population has a break at some value of extinction. Based on observations collected at the European Organisation for Astronomical Research in the Southern Hemisphere, Chile (ESO programmes 069.C-0426 and 074.C-0728).
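The bias the authors find in common linear-regression methods can be reproduced on synthetic data: when the x-axis (a photometric color) carries measurement noise, ordinary least squares attenuates the slope toward zero by the classical errors-in-variables factor var(x)/(var(x)+var(noise)). The sketch below demonstrates the effect only; it does not reproduce LinES, and all numbers are synthetic.

```python
import random

# Hedged sketch: attenuation bias of ordinary least squares (OLS) when
# the independent variable is itself noisy, a standard errors-in-
# variables result.  The slope, noise levels, and data are synthetic
# illustrations, not the paper's simulations.

def ols_slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

rng = random.Random(42)
true_slope = 1.8                                   # e.g. a color-excess ratio
x_true = [i / 100 for i in range(500)]             # noiseless colors
y = [true_slope * x + rng.gauss(0, 0.2) for x in x_true]
x_obs = [x + rng.gauss(0, 1.0) for x in x_true]    # noisy "measured" colors

print(ols_slope(x_true, y))   # close to 1.8
print(ols_slope(x_obs, y))    # attenuated well below 1.8
```

An unbiased estimator for this setting must model the noise in both axes, which is the class of methods the paper evaluates.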
Viral Evolution Core | FNLCR Staging
Brandon F. Keele, Ph.D. PI/Senior Principal Investigator, Retroviral Evolution Section Head, Viral Evolution Core Leidos Biomedical Research, Inc. Frederick National Laboratory for Cancer Research Frederick, MD 21702-1201 Tel: 301-846-173
Building qualitative study design using nursing's disciplinary epistemology.
Thorne, Sally; Stephens, Jennifer; Truant, Tracy
2016-02-01
To discuss the implications of drawing on core nursing knowledge as theoretical scaffolding for qualitative nursing enquiry. Although nurse scholars have been using qualitative methods for decades, much of their methodological direction derives from conventional approaches developed for answering questions in the social sciences. The quality of available knowledge to inform practice can be enhanced through the selection of study design options informed by an appreciation for the nature of nursing knowledge. Discussion paper. Drawing on the body of extant literature dealing with nursing's theoretical and qualitative research traditions, we consider contextual factors that have shaped the application of qualitative research approaches in nursing, including prior attempts to align method with the structure and form of disciplinary knowledge. On this basis, we critically reflect on design considerations that would follow logically from core features associated with a nursing epistemology. The substantive knowledge used by nurses to inform their practice includes both aspects developed at the level of the general and also that which pertains to application in the unique context of the particular. It must be contextually relevant to a fluid and dynamic healthcare environment and adaptable to distinctive patient conditions. Finally, it must align with nursing's moral mandate and action imperative. Qualitative research design components informed by nursing's disciplinary epistemology will help ensure a logical line of reasoning in our enquiries that remains true to the nature and structure of practice knowledge. © 2015 John Wiley & Sons Ltd.
Wieringa, Nicolien F; Peschar, Jules L; Denig, Petra; de Graeff, Pieter A; Vos, Rein
2003-01-01
To identify core issues that contribute to the gap between pre-marketing clinical research and practice as seen from the perspective of medical practice, as well as possible changes and potential barriers to closing this gap. Interviews with 47 physicians and pharmacists who were linked to drug regulation through their roles in the pre- and post-marketing shaping of new cardiovascular drugs. Data were analyzed using methods of grounded theory and analytical evaluation. Six core issues were identified that referred to the standards in drug regulation, the organization of the regulatory system, and conflicting interests. Pre-marketing trials should focus more on populations and research questions relevant to medical practice. In particular, variability in drug responses between subgroups of patients and demonstration of effectiveness should become major principles in drug regulation. An interactive post-marketing process in which public interests are represented was considered necessary to further guide research and development according to the needs of daily practice. Strategies for change could be applied within the present system of drug regulation, or could affect its basic principles. Regulatory authorities were primarily identified as the party to initiate changes, but many other parties should be involved. Barriers to change were identified regarding differences in interests between parties, organizational matters, and broader healthcare policies. Based on the respondents' opinions, there is a need to focus regulatory standards more on the needs of medical practice. Therefore, regulatory authorities should further develop their influence on the pre- and post-marketing drug development process, together with the other parties involved, in order to bridge the gap between clinical research and medical practice.
The Earth's Core: How Does It Work? Perspectives in Science. Number 1.
ERIC Educational Resources Information Center
Carnegie Institution of Washington, Washington, DC.
Various research studies designed to enhance knowledge about the earth's core are discussed. Areas addressed include: (1) the discovery of the earth's core; (2) experimental approaches used in studying the earth's core (including shock-wave experiments and experiments at high static pressures), the search for the core's light elements, the…
Geochemistry records from laminated sediments of Shira Lake (Russian Asia)
NASA Astrophysics Data System (ADS)
Phedorin, M.; Vologina, E.; Drebuschak, M.; Tolomeev, A.; Kirichenko, I.; Toyabin, A.
2009-04-01
We measured downcore element distributions in five cores collected across Shira Lake, situated in the central part of Asia (E90°12', N54°30'). The lake is small (32 km2), saline (ca. 20 g/l SO42-, Cl-, Na+, Mg2+, K+), is filled by regional precipitation of about 300 mm/year (mainly through one major tributary, the river Son), and has no surface outflow. The aim of our study was to reconstruct the history of changes in the regime of the lake, both before and during the period of instrumental meteorological observations. In particular, we were interested in lake-level changes due to evaporation and to water supply from surface and underground sources, as well as in changes of bioproduction in the lake. To construct a depth-age model for the cores, we measured Cs-137 and unsupported Pb-210 in the top layers of the cores. The sedimentation rates thus identified varied in the range of 1-2 mm/year among the cores. We visually observed fine sedimentation 'rhythms' with thicknesses of about 0.x-2.x mm; these layers may now be reliably identified as annual laminations. We also determined concentrations of elements in the sediments by recording x-ray fluorescence (XRF) spectra while continuously scanning the halves of the cores under a sharp synchrotron radiation (SR) beam, using an instrument described in Zolotarev et al. (2001). The resolution of the scanning was 0.1 mm. After processing the measured XRF-SR data as in Phedorin and Goldberg (2005), we obtained downcore records of 20 elements. We correlated all five cores using their element patterns. We qualitatively identified variations in surface-water supply by treating markers of 'clastic' material (Ti, Rb, Zr). We identified downcore variations in authigenic mineralization, which appeared in different forms: Ca-related, Sr-related, Ba-related, and Fe-related.
We attempted to assess changes in biogenic production from the Br distribution, assuming an analogy between Br in Shira sediments and Br in Lake Baikal sediments (Phedorin et al., 2000) and Lake Khubsugul sediments (Phedorin et al., 2008). The cores we studied provide high-resolution geochemical records of the last century for further meteorological correlations and regressions back into the past. We plan to reconstruct regional trends by proceeding with investigations of this kind and studying the sediments of other Khakas lakes. The investigation was supported by a grant from RFBR (09-05-98027) and a grant from the Siberian Branch of the Russian Academy of Science.
Phedorin M.A., Goldberg E.L., Grachev M.A., Levina O.L., Khlystov O.M., Dolbnya I.P. The comparison of biogenic silica, Br and Nd distributions in the sediments of Lake Baikal as proxies of changing paleoclimates of the last 480 ky. Nuclear Instruments and Methods in Physics Research A, 2000, V. 448, N 1-2, pp. 400-406.
Phedorin M.A., Goldberg E.L. Prediction of absolute concentrations of elements from SR XRF scan measurements of natural wet sediments. Nuclear Instruments and Methods in Physics Research A, 2005, V. 543, pp. 274-279.
Phedorin M.A., Fedotov A.P., Vorobieva S.S., Ziborova G.A. Signature of long supercycles in the Pleistocene history of Asian limnic systems. J. Paleolimnol., 2008, V. 40, N 1, pp. 445-452.
Zolotarev K.V., Goldberg E.L., Kondratyev V.I., Kulipanov G.N., Miginsky E.G., Tsukanov V.M., Phedorin M.A., Kolmogorov Y.P. Scanning SR-XRF beamline for analysis of bottom sediments. Nuclear Instruments and Methods in Physics Research A, 2001, V. 470, N 1-2, pp. 376-379.
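The constant-rate reading of an unsupported Pb-210 profile used for the depth-age model above can be sketched in a few lines. This is a minimal illustration on a synthetic, noise-free profile, not the authors' actual procedure; the decay constant is Pb-210's, the accumulation rate is invented:

```python
import math

LAMBDA_PB210 = math.log(2) / 22.3  # Pb-210 decay constant, 1/yr (half-life 22.3 yr)

def sedimentation_rate(depths_cm, activities):
    """Constant-sedimentation-rate model: ln A(z) = ln A0 - (lambda/s) * z.
    The least-squares slope of ln(activity) versus depth gives s = -lambda/slope."""
    xs = depths_cm
    ys = [math.log(a) for a in activities]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    return -LAMBDA_PB210 / slope  # cm/yr

# Synthetic profile for a core accumulating at 0.15 cm/yr (1.5 mm/yr, within
# the 1-2 mm/yr range reported above)
true_s = 0.15
depths = [0.5 * i for i in range(1, 11)]  # 0.5 .. 5.0 cm
acts = [50.0 * math.exp(-LAMBDA_PB210 * z / true_s) for z in depths]
print(round(sedimentation_rate(depths, acts), 3))  # recovers 0.15 cm/yr
```

Real profiles require a choice between constant-rate and constant-flux (CRS) models and a correction for supported Pb-210, both omitted here.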
Rustagi, Alison S.; Robinson, Julia; Kouyate, Seydou; Coutinho, Joana; Nduati, Ruth; Pfeiffer, James; Gloyd, Stephen; Sherr, Kenneth; Granato, S. Adam; Kone, Ahoua; Cruz, Emilia; Manuel, Joao Luis; Zucule, Justina; Napua, Manuel; Mbatia, Grace; Wariua, Grace; Maina, Martin
2016-01-01
Background: Despite large investments to prevent mother-to-child transmission (PMTCT), pediatric HIV elimination goals are not on track in many countries. The Systems Analysis and Improvement Approach (SAIA) study was a cluster randomized trial to test whether a package of systems engineering tools could strengthen PMTCT programs. We sought to (1) define core and adaptable components of the SAIA intervention, and (2) explain the heterogeneity in SAIA's success between facilities. Methods: The Consolidated Framework for Implementation Research (CFIR) guided all data collection efforts. CFIR constructs were assessed in focus group discussions and interviews with study and facility staff in 6 health facilities (1 high-performing and 1 low-performing site per country, identified by study staff) in December 2014 at the end of the intervention period. SAIA staff identified the intervention's core and adaptable components at an end-of-study meeting in August 2015. Two independent analysts used CFIR constructs to code transcripts before reaching consensus. Results: Flow mapping and continuous quality improvement were core to the SAIA in all settings, whereas the PMTCT cascade analysis tool was core in high-HIV-prevalence settings. Five CFIR constructs distinguished strongly between high and low performers: 2 in inner setting (networks and communication, available resources) and 3 in process (external change agents, executing, reflecting and evaluating). Discussion: The CFIR is a valuable tool for categorizing elements of an intervention as core versus adaptable, and for understanding heterogeneity in study implementation. Future intervention studies should apply evidence-based implementation science frameworks, like the CFIR, to provide salient data to expand implementation to other settings. PMID:27355497
Dynamical Core in Atmospheric Model Does Matter in the Simulation of Arctic Climate
NASA Astrophysics Data System (ADS)
Jun, Sang-Yoon; Choi, Suk-Jin; Kim, Baek-Min
2018-03-01
Climate models using different dynamical cores can simulate significantly different winter Arctic climates even if equipped with virtually the same physics schemes. The current climate simulated by a global climate model using a cubed-sphere grid with the spectral element method (SE core) exhibited significantly warmer Arctic surface air temperatures than that using a latitude-longitude grid with a finite-volume method (FV) core. Compared to the FV core, the SE core simulated additional adiabatic warming in the Arctic lower atmosphere, consistent with an eddy-forced secondary circulation. Downward longwave radiation further enhanced Arctic near-surface warming, yielding a surface air temperature about 1.9 K higher. Furthermore, in the atmospheric response to reduced sea-ice conditions with the same physical settings, only the SE core showed a robust cooling response over North America. We emphasize that special attention is needed in selecting the dynamical core of climate models when simulating the Arctic climate and associated teleconnection patterns.
Graded core/shell semiconductor nanorods and nanorod barcodes
Alivisatos, A. Paul; Scher, Erik C.; Manna, Liberato
2010-12-14
Graded core/shell semiconductor nanorods and shaped nanorods are disclosed comprising Group II-VI, Group III-V and Group IV semiconductors and methods of making the same. Also disclosed are nanorod barcodes using core/shell nanorods where the core is a semiconductor or metal material, and with or without a shell. Methods of labeling analytes using the nanorod barcodes are also disclosed.
Graded core/shell semiconductor nanorods and nanorod barcodes
Alivisatos, A. Paul; Scher, Erik C.; Manna, Liberato
2013-03-26
Graded core/shell semiconductor nanorods and shaped nanorods are disclosed comprising Group II-VI, Group III-V and Group IV semiconductors and methods of making the same. Also disclosed are nanorod barcodes using core/shell nanorods where the core is a semiconductor or metal material, and with or without a shell. Methods of labeling analytes using the nanorod barcodes are also disclosed.
Kinematic fingerprint of core-collapsed globular clusters
NASA Astrophysics Data System (ADS)
Bianchini, P.; Webb, J. J.; Sills, A.; Vesperini, E.
2018-03-01
Dynamical evolution drives globular clusters towards core collapse, which strongly shapes their internal properties. Diagnostics of core collapse have so far been based on photometry only, namely on the study of the concentration of the density profiles. Here, we present a new method to robustly identify core-collapsed clusters based on the study of their stellar kinematics. We introduce the kinematic concentration parameter, c_k, the ratio between the global and local degree of energy equipartition reached by a cluster, and show through extensive direct N-body simulations that clusters approaching core collapse and in the post-core-collapse phase are strictly characterized by c_k > 1. The kinematic concentration provides a suitable diagnostic to identify core-collapsed clusters, independent of any previous method based on photometry. We also explore the effects of incomplete radial and stellar mass coverage on the calculation of c_k and find that our method can be applied to state-of-the-art kinematic data sets.
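The degree of energy equipartition is commonly quantified by the slope eta in sigma(m) proportional to m^(-eta). As a rough illustration of a ratio-of-equipartition-degrees diagnostic (the masses and dispersions below are synthetic, and the paper's actual kinematic concentration is defined from global versus local fits to N-body data, not this toy form):

```python
import math

def equipartition_slope(masses, dispersions):
    """Least-squares estimate of eta in sigma(m) = sigma0 * m**(-eta),
    fitted in log-log space."""
    xs = [math.log(m) for m in masses]
    ys = [math.log(s) for s in dispersions]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    return -slope  # eta

# Toy cluster: equipartition is stronger locally (eta = 0.4) than
# globally (eta = 0.1), so the global-to-local ratio falls below 1
masses = [0.2, 0.4, 0.6, 0.8, 1.0]        # solar masses (synthetic)
sigma_global = [10.0 * m ** -0.1 for m in masses]
sigma_local = [10.0 * m ** -0.4 for m in masses]
ratio = equipartition_slope(masses, sigma_global) / equipartition_slope(masses, sigma_local)
print(round(ratio, 2))  # 0.25
```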
A transmission line model for propagation in elliptical core optical fibers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Georgantzos, E.; Boucouvalas, A. C.; Papageorgiou, C.
The calculation of mode propagation constants of elliptical core fibers has been the purpose of extended research leading to many notable methods, with the classic step-index solution based on Mathieu functions. This paper seeks to derive a new method for the determination of mode propagation constants in single-mode fibers with an elliptic core by modeling the elliptical fiber as a series of connected coupled transmission line elements. We develop a matrix formulation of the transmission line, and the resonance of the circuits is used to calculate the mode propagation constants. The technique, used with success in the case of cylindrical fibers, is now extended to fibers with an elliptical cross section. The advantage of this approach is that it is well suited to calculating the mode dispersion of elliptical waveguides with arbitrary refractive index profiles. The analysis begins with the deployment of Maxwell's equations adjusted for elliptical coordinates. Further algebraic analysis leads to a set of equations in which harmonics appear. Taking into consideration a predefined fixed number of harmonics simplifies the problem and enables the use of the resonant circuits approach. For each case, programs have been created in Matlab, providing a series of results (mode propagation constants) that are further compared with corresponding results from the already known Mathieu functions method.
Saldanha, Ian J; Li, Tianjing; Yang, Cui; Ugarte-Gil, Cesar; Rutherford, George W; Dickersin, Kay
2016-02-01
Methods to develop core outcome sets, the minimum outcomes that should be measured in research in a topic area, vary. We applied social network analysis methods to understand outcome co-occurrence patterns in human immunodeficiency virus (HIV)/AIDS systematic reviews and identify outcomes central to the network of outcomes in HIV/AIDS. We examined all Cochrane reviews of HIV/AIDS as of June 2013. We defined a tie as two outcomes (nodes) co-occurring in ≥2 reviews. To identify central outcomes, we used normalized node betweenness centrality (nNBC) (the extent to which connections between other outcomes in a network rely on that outcome as an intermediary). We conducted a subgroup analysis by HIV/AIDS intervention type (i.e., clinical management, biomedical prevention, behavioral prevention, and health services). The 140 included reviews examined 1,140 outcomes, 294 of which were unique. The most central outcome overall was all-cause mortality (nNBC = 23.9). The most central and most frequent outcomes differed overall and within subgroups. For example, "adverse events (specified)" was among the most central but not among the most frequent outcomes, overall. Social network analysis methods are a novel application to identify central outcomes, which provides additional information potentially useful for developing core outcome sets. Copyright © 2016 Elsevier Inc. All rights reserved.
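The normalized node betweenness centrality (nNBC) used in this analysis can be computed with Brandes' algorithm. Below is a self-contained sketch on a small hypothetical outcome co-occurrence network; the outcome names and ties are invented for illustration, not taken from the reviews:

```python
from collections import deque

def normalized_betweenness(adj):
    """Brandes' algorithm for betweenness centrality on an undirected,
    unweighted graph given as {node: set(neighbors)}. Normalized by
    (n-1)(n-2)/2, the number of node pairs excluding the node itself."""
    nodes = list(adj)
    bc = {v: 0.0 for v in nodes}
    for s in nodes:
        stack, pred = [], {v: [] for v in nodes}
        sigma = {v: 0 for v in nodes}   # number of shortest s-v paths
        dist = {v: -1 for v in nodes}
        sigma[s], dist[s] = 1, 0
        queue = deque([s])
        while queue:                    # BFS from s
            v = queue.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        delta = {v: 0.0 for v in nodes}
        while stack:                    # back-propagate pair dependencies
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    n = len(nodes)
    norm = (n - 1) * (n - 2)  # each undirected pair is counted twice above
    return {v: bc[v] / norm for v in nodes}

# Toy network: 'mortality' bridges two clusters of co-occurring outcomes
adj = {
    'mortality': {'CD4 count', 'viral load', 'adverse events', 'adherence'},
    'CD4 count': {'mortality', 'viral load'},
    'viral load': {'mortality', 'CD4 count'},
    'adverse events': {'mortality', 'adherence'},
    'adherence': {'mortality', 'adverse events'},
}
bc = normalized_betweenness(adj)
print(max(bc, key=bc.get))  # 'mortality' is the most central outcome
```

This mirrors the finding above that all-cause mortality, sitting on the paths between otherwise weakly connected outcomes, scores highest on centrality without necessarily being the most frequent outcome.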
caGrid 1.0: An Enterprise Grid Infrastructure for Biomedical Research
Oster, Scott; Langella, Stephen; Hastings, Shannon; Ervin, David; Madduri, Ravi; Phillips, Joshua; Kurc, Tahsin; Siebenlist, Frank; Covitz, Peter; Shanbhag, Krishnakant; Foster, Ian; Saltz, Joel
2008-01-01
Objective To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. Design An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG™) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including 1) discovery, 2) integrated and large-scale data analysis, and 3) coordinated study. Measurements The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. Results The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL: https://cabig.nci.nih.gov/workspaces/Architecture/caGrid. Conclusions While caGrid 1.0 is designed to address use cases in cancer research, the requirements associated with discovery, analysis and integration of large scale data, and coordinated studies are common in other biomedical fields. In this respect, caGrid 1.0 is the realization of a framework that can benefit the entire biomedical community. PMID:18096909
Turnbull, Alison E; Sepulveda, Kristin A; Dinglas, Victor D; Chessare, Caroline M; Bingham, Clifton O; Needham, Dale M
2017-06-01
To identify the "core domains" (i.e., patient outcomes, health-related conditions, or aspects of health) that relevant stakeholders agree are essential to assess in all clinical research studies evaluating the outcomes of acute respiratory failure survivors after hospital discharge. A two-round consensus process, using a modified Delphi methodology, with participants from 16 countries, including patient and caregiver representatives. Prior to voting, participants were asked to review 1) results from surveys of clinical researchers, acute respiratory failure survivors, and caregivers that rated the importance of 19 preliminary outcome domains and 2) results from a qualitative study of acute respiratory failure survivors' outcomes after hospital discharge, as related to the 19 preliminary outcome domains. Participants also were asked to suggest any additional potential domains for evaluation in the first Delphi survey. Web-based surveys of participants representing four stakeholder groups relevant to clinical research evaluating postdischarge outcomes of acute respiratory failure survivors: clinical researchers, clinicians, patients and caregivers, and U.S. federal research funding organizations. None. None. Survey response rates were 97% and 99% in round 1 and round 2, respectively. There were seven domains that met the a priori consensus criteria to be designated as core domains: physical function, cognition, mental health, survival, pulmonary function, pain, and muscle and/or nerve function. This study generated a consensus-based list of core domains that should be assessed in all clinical research studies evaluating acute respiratory failure survivors after hospital discharge. Identifying appropriate measurement instruments to assess these core domains is an important next step toward developing a set of core outcome measures for this field of research.
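A Delphi consensus rule of the kind applied above can be expressed compactly. The 70%/15% thresholds and the votes below are illustrative assumptions (the study's a priori criteria are not detailed in this abstract), and the domain names are hypothetical:

```python
def consensus_domains(ratings, agree_cut=0.7, disagree_cut=0.15):
    """Flag domains meeting illustrative Delphi consensus criteria:
    at least 70% of votes rate the domain critical (7-9 on a 1-9 scale)
    and fewer than 15% rate it unimportant (1-3)."""
    core = []
    for domain, votes in ratings.items():
        critical = sum(1 for v in votes if v >= 7) / len(votes)
        unimportant = sum(1 for v in votes if v <= 3) / len(votes)
        if critical >= agree_cut and unimportant < disagree_cut:
            core.append(domain)
    return core

# Hypothetical round-2 votes from ten participants
ratings = {
    'physical function': [9, 8, 9, 7, 8, 9, 7, 8, 9, 8],
    'survival':          [9, 9, 9, 8, 9, 9, 9, 8, 9, 9],
    'return to work':    [5, 6, 9, 3, 4, 7, 2, 6, 5, 4],
}
print(consensus_domains(ratings))  # ['physical function', 'survival']
```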
Bibliometric study of grey literature in core veterinary medical journals
Pelzer, Nancy L.; Wiese, William H.
2003-01-01
Objectives: Grey literature has been perceived by many as belonging to the primary sources of information and has become an accepted method of nonconventional communication in the sciences and medicine. Since little is known about the use and nature of grey literature in veterinary medicine, a systematic study was done to analyze and characterize the bibliographic citations appearing in twelve core veterinary journals. Methods: Citations from 2,159 articles published in twelve core veterinary journals in 2000 were analyzed to determine the portion of citations from grey literature. Those citations were further analyzed and categorized according to the type of publication. Results: Citation analysis yielded 55,823 citations, of which 3,564 (6.38%) were considered to be grey literature. Four veterinary specialties, internal medicine, pathology, theriogenology, and microbiology, accounted for 70% of the total number of articles. Three small-animal clinical practice journals cited about 2.5–3% grey literature, less than half that of journals with basic research orientations, where results ranged from almost 6% to approximately 10% grey literature. Nearly 90% of the grey literature appeared as conferences, government publications, and corporate organization literature. Conclusions: The results corroborate other reported research that the incidence of grey literature is lower in medicine and biology than in some other fields, such as aeronautics and agriculture. As in other fields, use of the Internet and the Web has greatly expanded the communication process among veterinary professionals. The appearance of closed community email forums and specialized discussion groups within the veterinary profession is an example of what could become a new kind of grey literature. PMID:14566374
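The headline grey-literature share can be reproduced directly from the counts reported above, as a simple arithmetic check:

```python
# Counts as reported in the citation analysis
total_citations = 55823
grey_citations = 3564

share = 100 * grey_citations / total_citations
print(f"{share:.2f}% grey literature")  # 6.38%, matching the reported figure
```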
Effective delayed neutron fraction and prompt neutron lifetime of Tehran research reactor mixed-core
Lashkari, A.; Khalafi, H.; Kazeminejad, H.
2013-01-01
In this work, kinetic parameters of Tehran research reactor (TRR) mixed cores have been calculated. The mixed core configurations are made by replacement of the low enriched uranium control fuel elements with highly enriched uranium control fuel elements in the reference core. The MTR_PC package, a nuclear reactor analysis tool, is used to perform the analysis. Simulations were carried out to compute effective delayed neutron fraction and prompt neutron lifetime. Calculation of kinetic parameters is necessary for reactivity and power excursion transient analysis. The results of this research show that effective delayed neutron fraction decreases and prompt neutron lifetime increases with the fuels burn-up. Also, by increasing the number of highly enriched uranium control fuel elements in the reference core, the prompt neutron lifetime increases, but effective delayed neutron fraction does not show any considerable change. PMID:24976672
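The two kinetic parameters at issue, the effective delayed neutron fraction (beta) and the prompt neutron lifetime (Lambda), enter the standard one-delayed-group point-kinetics equations used in reactivity transient analysis. A minimal sketch with illustrative parameter values, not TRR data:

```python
def point_kinetics(rho, beta=0.007, Lam=5e-5, lam=0.08, t_end=1.0, dt=1e-5):
    """Integrate one-delayed-group point kinetics with explicit Euler:
        dn/dt = ((rho - beta)/Lam) * n + lam * c
        dc/dt = (beta/Lam) * n - lam * c
    beta (effective delayed neutron fraction), Lam (prompt neutron
    lifetime, s) and lam (precursor decay constant, 1/s) are illustrative
    values. Starts from the critical equilibrium with power n = 1."""
    n = 1.0
    c = beta * n / (Lam * lam)        # precursor equilibrium for rho = 0
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / Lam * n + lam * c) * dt
        dc = (beta / Lam * n - lam * c) * dt
        n += dn
        c += dc
    return n

print(round(point_kinetics(0.0), 4))      # critical reactor: power stays at 1.0
print(point_kinetics(0.1 * 0.007) > 1.0)  # small positive reactivity: power rises
```

A larger beta slows the transient for a given reactivity insertion, which is why its decrease with burn-up (reported above) matters for safety analysis.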
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uyttenhove, W.; Baeten, P.; Ban, G.
The GUINEVERE (Generation of Uninterrupted Intense Neutron pulses at the lead Venus Reactor) project was launched in 2006 within the framework of FP6 EUROTRANS in order to validate on-line reactivity monitoring and subcriticality level determination in Accelerator Driven Systems. To this end, the VENUS reactor at SCK.CEN in Mol (Belgium) was modified towards a fast core (VENUS-F) and coupled to the GENEPI-3C accelerator built by CNRS. The accelerator can operate in both continuous and pulsed mode. The VENUS-F core is loaded with enriched uranium and reflected with solid lead. A well-chosen critical reference state is indispensable for the validation of the on-line subcriticality monitoring methodology. Moreover, a benchmarking tool is required for nuclear data research and code validation. In this paper the design and the importance of the critical reference state for the GUINEVERE project are motivated. The results of the first experimental phase on the critical core are presented. The control rod worths are determined by the rod drop technique, and the application of the Modified Source Multiplication (MSM) method allows the determination of the worth of the safety rods. The results are implemented in the VENUS-F core certificate for full exploitation of the critical core. (authors)
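In its simplest, uncorrected form, the MSM method scales a known reference reactivity by the inverse ratio of detector count rates. The sketch below omits the spatial/spectral MSM correction factors that a real analysis obtains from transport calculations, and the numbers are hypothetical:

```python
def msm_reactivity(rho_ref, count_ref, count_meas, f_msm=1.0):
    """Modified Source Multiplication estimate of an unknown subcritical
    reactivity from detector count rates, in its simplest form:
        rho_meas = f_msm * rho_ref * (count_ref / count_meas)
    f_msm lumps the spatial/spectral correction factors that a real
    analysis computes with transport codes (set to 1 here)."""
    return f_msm * rho_ref * count_ref / count_meas

# Hypothetical data: the reference state is worth -500 pcm; inserting a
# safety rod halves the count rate, so the state is twice as subcritical.
print(msm_reactivity(-500.0, count_ref=1200.0, count_meas=600.0))  # -1000.0 pcm
```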
An improved method for field extraction and laboratory analysis of large, intact soil cores
Tindall, J.A.; Hemmen, K.; Dowd, J.F.
1992-01-01
Various methods have been proposed for the extraction of large, undisturbed soil cores and for subsequent analysis of fluid movement within the cores. The major problems associated with these methods are expense, cumbersome field extraction, and inadequate simulation of unsaturated flow conditions. A field and laboratory procedure is presented that is economical, convenient, simulates unsaturated and saturated flow without interface flow problems, and can be used on a variety of soil types. In the field, a stainless steel core barrel is hydraulically pressed into the soil (30-cm diam. and 38 cm high), the barrel and core are extracted from the soil, and after the barrel is removed from the core, the core is wrapped securely with flexible sheet metal and a stainless mesh screen is attached to the bottom of the core for support. In the laboratory the soil core is set atop a porous ceramic plate over which a soil-diatomaceous earth slurry has been poured to assure good contact between plate and core. A cardboard cylinder (mold) is fastened around the core and the empty space filled with paraffin wax. Soil cores were tested under saturated and unsaturated conditions using a hanging water column for potentials ≤0. Breakthrough curves indicated that no interface flow occurred along the edge of the core. This procedure proved to be reliable for field extraction of large, intact soil cores and for laboratory analysis of solute transport.
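Breakthrough curves like those used above to rule out interface flow are often compared against the one-dimensional advection-dispersion solution for a step input; early arrival of solute (well before one pore volume) is the classic signature of preferential flow along the core wall. A sketch under that assumed model, not the authors' actual analysis:

```python
import math

def breakthrough(pore_volumes, peclet):
    """Approximate 1-D advection-dispersion breakthrough curve for a step
    input: relative concentration C/C0 after a given number of pore
    volumes has eluted, for a column Peclet number Pe (illustrative
    model, not the study's data):
        C/C0 = 0.5 * erfc((1 - T) / (2 * sqrt(T / Pe)))"""
    T = pore_volumes
    return 0.5 * math.erfc((1.0 - T) / (2.0 * math.sqrt(T / peclet)))

# A well-behaved core (high Pe) passes C/C0 = 0.5 at one pore volume;
# interface flow would shift meaningful breakthrough to T << 1.
for T in (0.5, 1.0, 1.5):
    print(T, round(breakthrough(T, peclet=60), 3))
```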
NCI Core Open House Shines Spotlight on Supportive Science and Basic Research | Poster
The lobby of Building 549 at NCI at Frederick bustled with activity for two hours on Tuesday, May 1, as several dozen scientists and staff gathered for the NCI Core Open House. The event aimed to encourage discussion and educate visitors about the capabilities of the cores, laboratories, and facilities that offer support to NCI’s Center for Cancer Research.
Eloy, Jean Anderson; Svider, Peter F; Setzen, Michael; Baredes, Soly; Folbe, Adam J
2014-01-01
To determine whether American Academy of Otolaryngology-Head and Neck Surgery Foundation (AAO-HNSF) Centralized Otolaryngology Research Efforts (CORE) grants influence career paths and scholarly impact of fellowship-trained rhinologists, and whether funding from the National Institutes of Health (NIH) and CORE programs is associated with increased scholarly impact among rhinologists. Another aim was to explore whether obtaining CORE grant funding is associated with NIH award acquisition. Practice setting, academic rank, and fellowship-training status were determined for individuals in the CORE grant database. The h-index and publication experience of practitioners were calculated using the Scopus database. Faculty listings were used to determine these data for a non-CORE-grant-funded "control" group of academic rhinologists. Active and past NIH funding was obtained using the NIH RePORTER database. Fifteen of 26 (57.7%) fellowship-trained rhinologists receiving CORE grants were funded for rhinologic projects. Five of 6 rhinologists receiving NIH funding had a CORE-grant-funding history. Twenty-two of 26 (84.6%) rhinologists receiving CORE funding are currently in academic practice. Academic rhinologists receiving CORE or NIH funding had higher h-indices, a result reaching significance among promoted faculty and those with more than 10 years of publication experience. Encouraging the pursuit of CORE grants among junior faculty, as well as trainees interested in rhinology, may be a strategy for developing highly effective research habits that pay dividends after the first few years of one's career. Fellowship-trained rhinologists with a CORE funding history predominantly pursue careers in academic medicine, although their CORE projects are not necessarily related to rhinologic topics. © 2013 ARS-AAOA, LLC.
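The h-index used above has a compact definition: the largest h such that the author has h papers with at least h citations each. A minimal sketch with made-up citation counts:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical publication record of an early-career rhinologist
print(h_index([10, 8, 5, 4, 3]))  # 4: four papers with >= 4 citations each
print(h_index([]))                # 0: no publications yet
```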
Manwell, Laurie A; Barbic, Skye P; Roberts, Karen; Durisko, Zachary; Lee, Cheolsoon; Ware, Emma; McKenzie, Kwame
2015-06-02
Lack of consensus on the definition of mental health has implications for research, policy and practice. This study aims to start an international, interdisciplinary and inclusive dialogue to answer the question: What are the core concepts of mental health? 50 people with expertise in the field of mental health from 8 countries completed an online survey. They identified the extent to which 4 current definitions were adequate and what the core concepts of mental health were. A qualitative thematic analysis was conducted of their responses. The results were validated at a consensus meeting of 58 clinicians, researchers and people with lived experience. 46% of respondents rated the Public Health Agency of Canada (PHAC, 2006) definition as the most preferred, 30% stated that none of the 4 definitions were satisfactory and only 20% said the WHO (2001) definition was their preferred choice. The least preferred definition of mental health was the general definition of health adapted from Huber et al (2011). The core concepts of mental health were highly varied and reflected different processes people used to answer the question. These processes included the overarching perspective or point of reference of respondents (positionality), the frameworks used to describe the core concepts (paradigms, theories and models), and the way social and environmental factors were considered to act. The core concepts of mental health identified were mainly individual and functional, in that they related to the ability or capacity of a person to effectively deal with or change his/her environment. A preliminary model for the processes used to conceptualise mental health is presented. Answers to the question, 'What are the core concepts of mental health?' are highly dependent on the empirical frame used. Understanding these empirical frames is key to developing a useful consensus definition for diverse populations. Published by the BMJ Publishing Group Limited. 
Methods for the behavioral, educational, and social sciences: an R package.
Kelley, Ken
2007-11-01
Methods for the Behavioral, Educational, and Social Sciences (MBESS; Kelley, 2007b) is an open source package for R (R Development Core Team, 2007b), an open source statistical programming language and environment. MBESS implements methods that are not widely available elsewhere, yet are especially helpful for the idiosyncratic techniques used within the behavioral, educational, and social sciences. The major categories of functions are those that relate to confidence interval formation for noncentral t, F, and χ² parameters, confidence intervals for standardized effect sizes (which require noncentral distributions), and sample size planning issues from the power analytic and accuracy in parameter estimation perspectives. In addition, MBESS contains collections of other functions that should be helpful to substantive researchers and methodologists. MBESS is a long-term project that will continue to be updated and expanded so that important methods can continue to be made available to researchers in the behavioral, educational, and social sciences.
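The confidence intervals for noncentral t parameters mentioned above rest on inverting the noncentral-t CDF. A minimal Python sketch of that inversion (using SciPy rather than R's MBESS; the bracketing interval and the one-sample rescaling d = ncp/√n are illustrative assumptions, not MBESS internals):

```python
from math import sqrt

from scipy.optimize import brentq
from scipy.stats import nct

def noncentrality_ci(t_obs, df, conf=0.95):
    """Confidence interval for the t noncentrality parameter, found by
    inverting the noncentral-t CDF (the approach such CIs rely on)."""
    alpha = 1.0 - conf
    # nct.cdf(t_obs, df, ncp) decreases monotonically in ncp, so each
    # bound is the root of a monotone function; [-50, 50] is an ad hoc bracket.
    lower = brentq(lambda ncp: nct.cdf(t_obs, df, ncp) - (1.0 - alpha / 2.0), -50.0, 50.0)
    upper = brentq(lambda ncp: nct.cdf(t_obs, df, ncp) - alpha / 2.0, -50.0, 50.0)
    return lower, upper

# One-sample case: standardized effect size d = ncp / sqrt(n),
# so a CI for d is the ncp interval rescaled by 1/sqrt(n).
n, t_obs = 25, 2.5
lo, hi = noncentrality_ci(t_obs, df=n - 1)
d_lo, d_hi = lo / sqrt(n), hi / sqrt(n)
```

The same inversion pattern covers the noncentral F and χ² cases by swapping in the corresponding SciPy distribution.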
An Optical Dye Method for Continuous Determination of Acidity in Ice Cores.
Kjær, Helle Astrid; Vallelonga, Paul; Svensson, Anders; Elleskov L Kristensen, Magnus; Tibuleac, Catalin; Winstrup, Mai; Kipfstuhl, Sepp
2016-10-04
The pH of polar ice is important for the stability and mobility of impurities in ice cores and can be strongly influenced by volcanic eruptions or anthropogenic emissions. We present a simple optical method for continuous determination of acidity in ice cores based on spectroscopically determined color changes of two common pH-indicator dyes, bromophenol blue and chlorophenol red. The sealed-system method described here is not equilibrated with CO2, making it simpler than existing methods for pH determination in ice cores and offering a 10-90% peak response time of 45 s and a combined uncertainty of 9%. The method is applied to Holocene ice core sections from Greenland and Antarctica and compared to standard techniques such as electrical conductivity measurement (ECM) conducted on the solid ice, and electrolytic meltwater conductivity (EMWC). Acidity measured in the Greenland NGRIP ice core shows good agreement with acidity calculated from ion chromatography. Conductivity and dye-based acidity (H+dye) are found to be highly correlated in the Greenland NEGIS firn core (75.38° N, 35.56° W), with all signals greater than 3σ variability coinciding with either volcanic eruptions or possible wildfire activity. In contrast, the Antarctic Roosevelt Island ice core (79.36° S, 161.71° W) features an anticorrelation between conductivity and H+dye, likely due to the strong influence of marine salts.
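The dye-based acidity retrieval above hinges on the two-form equilibrium of a pH indicator. The paper's own calibration is spectroscopic, but the underlying textbook relation can be sketched as follows (the pKa and absorbance values are illustrative placeholders, not the authors' calibration):

```python
from math import log10

def ph_from_indicator(a_base, a_acid, pka):
    """Henderson-Hasselbalch for a two-color pH indicator:
    pH = pKa + log10([In-]/[HIn]), with the concentration ratio
    approximated by the absorbance ratio of the two dye forms."""
    return pka + log10(a_base / a_acid)

# Equal absorbance of the two forms puts the pH at the dye's pKa.
ph = ph_from_indicator(a_base=1.0, a_acid=1.0, pka=4.0)
```

Using two dyes with different pKa values, as the authors do, extends the usable pH range of such a measurement.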
Code of Federal Regulations, 2011 CFR
2011-04-01
... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false How are core services and intensive services... § 652.208 How are core services and intensive services related to the methods of service delivery described in § 652.207(b)(2)? Core services and intensive services may be delivered through any of the...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false How are core services and intensive services... § 652.208 How are core services and intensive services related to the methods of service delivery described in § 652.207(b)(2)? Core services and intensive services may be delivered through any of the...
PS2-10: The CRN Cancer Communication Research Center
Madrid, Sarah D; Dearing, James W; Glasgow, Russell E; Rabin, Borsika A; Mazor, Kathleen; Wagner, Edward H
2010-01-01
We propose an integrated set of three large posters that will describe the main components of a new research center that bridges HMORN institutions. Background: The CRN Cancer Communication Research Center (CCRC) was established in September 2008 at the Kaiser Colorado Institute for Health Research. Objectives: The CCRC’s objectives are to discover the most promising practice-based approaches to cancer communication and care coordination, and to disseminate those approaches. Integrated care delivery systems represent promising opportunities to study these approaches, and the CRN CCRC, with its embedded organizational focus, will take advantage of the CRN as a virtual laboratory. Specific Aims: The CRN CCRC 1) leverages the existing infrastructure of the CRN to support both the discovery and dissemination of practice-based communication strategies and organizational resources; 2) supports four investigator-initiated research projects to advance communication theory and to evaluate strategies informed by theory; and 3) provides administrative, financial, and scientific support to new investigators, including clinicians, in the development of pilot projects, and assists in submission of broader, investigator-initiated proposals for extramural funding. Methods: Two R01-scale investigator-initiated research projects will advance and test communication theory. The first will: characterize patients’ and providers’ experiences communicating about errors in cancer care; investigate the health system factors that promote or inhibit effective communication; and develop, disseminate, and evaluate provider training materials and patient informational materials. The second will develop and test an intervention to decrease patient uncertainty and improve psychosocial and communicative outcomes during the period from suspicion of cancer through diagnosis and plan of care. The Center’s research projects will be augmented and supported by Shared Resource Cores.
The Discovery Core will identify the most promising practice-based innovations and approaches; the Dissemination Core will focus on data harmonization and applying dissemination science to effective interventions.
ERIC Educational Resources Information Center
Sheehan, Kathleen M.; Kostin, Irene; Futagi, Yoko; Flor, Michael
2010-01-01
The Common Core Standards call for students to be exposed to a much greater level of text complexity than has been the norm in schools for the past 40 years. Textbook publishers, teachers, and assessment developers are being asked to refocus materials and methods to ensure that students are challenged to read texts at steadily increasing…
Flow characteristics of Korea multi-purpose research reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heonil Kim; Hee Taek Chae; Byung Jin Jun
1995-09-01
The construction of the Korea Multi-purpose Research Reactor (KMRR), a 30 MWth open-tank-in-pool type, has been completed. Various thermal-hydraulic experiments have been conducted to verify the design characteristics of the KMRR. This paper describes the commissioning experiments to determine the flow distribution of the KMRR core and the flow characteristics inside the chimney which stands on top of the core. The core flow is distributed to within ±6% of the average values, which is sufficiently flat in the sense that the design velocity in the fueled region is satisfied. The role of core bypass flow in confining the activated core coolant in the chimney structure is confirmed.
ERIC Educational Resources Information Center
Atibuni, Dennis Zami; Olema, David Kani; Ssenyonga, Joseph; Karl, Steffens; Kibanja, Grace Milly
2017-01-01
This study investigated the mediation effect of research skills proficiency on the relationship between core self-evaluations and research engagement among Master of Education students in Uganda. Questionnaire surveys including closed ended questions were administered to two cohorts of the students, 2011/2012 and 2012/2013, (N = 102). Results…
Core Hunter 3: flexible core subset selection.
De Beukelaer, Herman; Davenport, Guy F; Fack, Veerle
2018-05-31
Core collections provide genebank curators and plant breeders a way to reduce the size of their collections and populations, while minimizing impact on genetic diversity and allele frequency. Many methods have been proposed to generate core collections, often using distance metrics to quantify the similarity of two accessions, based on genetic marker data or phenotypic traits. Core Hunter is a multi-purpose core subset selection tool that uses local search algorithms to generate subsets relying on one or more metrics, including several distance metrics and allelic richness. In version 3 of Core Hunter (CH3) we have incorporated two new, improved methods for summarizing distances to quantify the diversity or representativeness of the core collection. A comparison of CH3 and Core Hunter 2 (CH2) showed that these new metrics can be effectively optimized with less complex algorithms than those used in CH2. CH3 is more effective at maximizing the improved diversity metric than CH2, still ensures a high average and minimum distance, and is faster for large datasets. Using CH3, a simple stochastic hill-climber is able to find highly diverse core collections, and the more advanced parallel tempering algorithm further increases the quality of the core and further reduces variability across independent samples. We also evaluate the ability of CH3 to simultaneously maximize diversity and either representativeness or allelic richness, and compare the results with those of the GDOpt and SimEli methods. CH3 can sample cores as representative as those of GDOpt, which was specifically designed for this purpose, and is able to construct cores that are simultaneously more diverse and either more representative or higher in allelic richness than those obtained by SimEli. In version 3, Core Hunter has been updated to include two new core subset selection metrics that construct cores for representativeness or diversity, with improved performance. It combines the strengths of other methods and outperforms them, as it can (simultaneously) optimize a variety of metrics. In addition, CH3 is an improvement over CH2, with the option to use genetic marker data or phenotypic traits, or both, and improved speed. Core Hunter 3 is freely available at http://www.corehunter.org.
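The "simple stochastic hill-climber" mentioned above can be sketched in a few lines. This is a generic swap-based climber over a precomputed distance matrix, maximizing an entry-to-nearest-entry diversity score; it illustrates the idea only and is not Core Hunter's actual implementation or its exact metric:

```python
import random

def avg_entry_to_nearest(core, dist):
    """Mean distance from each selected accession to its nearest other
    selected accession (one common flavor of core-diversity score)."""
    core = list(core)
    return sum(min(dist[i][j] for j in core if j != i) for i in core) / len(core)

def hill_climb_core(dist, k, steps=2000, seed=0):
    """Stochastic hill-climber: repeatedly propose swapping one selected
    accession for an unselected one, keeping swaps that raise the score."""
    rng = random.Random(seed)
    n = len(dist)
    core = set(rng.sample(range(n), k))
    best = avg_entry_to_nearest(core, dist)
    for _ in range(steps):
        out_acc = rng.choice(sorted(core))
        in_acc = rng.choice([j for j in range(n) if j not in core])
        candidate = (core - {out_acc}) | {in_acc}
        score = avg_entry_to_nearest(candidate, dist)
        if score > best:
            core, best = candidate, score
    return sorted(core), best
```

Parallel tempering, as used in CH3, replaces the single greedy chain with several chains at different "temperatures" that occasionally exchange states, reducing the chance of stalling in a local optimum.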
Geophysical Properties of Hard Rock for Investigation of Stress Fields in Deep Mines
NASA Astrophysics Data System (ADS)
Tibbo, M.; Young, R. P.; Schmitt, D. R.; Milkereit, B.
2014-12-01
A complication in geophysical monitoring of deep mines is the high stress dependency of the physical properties of hard rocks. In-mine observations show anisotropic variability of the in situ P- and S-wave velocities and resistivity of the hard rocks that is likely related to stress field changes. As part of a comprehensive study in a deep, highly stressed mine located in Sudbury, Ontario, Canada, data from in situ monitoring of the seismicity, conductivity, stress, and stress-dependent physical properties have been obtained. In-laboratory experiments are also being performed on borehole cores from the Sudbury mines. These experiments will measure the Norite borehole cores' properties, including elastic modulus, bulk modulus, P- and S-wave velocities, and density. Hydraulic fracturing has been successfully implemented in industries such as oil and gas and enhanced geothermal systems, and is currently being investigated as a potential method for preconditioning in mining. However, further research is required to quantify how hydraulic fractures propagate through hard, unfractured rock as well as the naturally fractured rock typically found in mines. These laboratory experiments will contribute to a hydraulic fracturing project evaluating the feasibility and effectiveness of hydraulic fracturing as a method of de-stressing hard rock mines. A tri-axial deformation cell equipped with 18 Acoustic Emission (AE) sensors will be used to bring the borehole cores to a tri-axial state of stress. The cores will then be injected with fluid until the hydraulic fracture has propagated to the edge of the core, while AE waveforms are digitized continuously at 10 MHz and 12-bit resolution for the duration of each experiment. These laboratory hydraulic fracture experiments will contribute to understanding how parameters including stress ratio, fluid injection rate, and viscosity affect the fracturing process.
Femtosecond laser processing of optical fibres for novel sensor development
NASA Astrophysics Data System (ADS)
Kalli, Kyriacos; Theodosiou, Antreas; Ioannou, Andreas; Lacraz, Amedee
2017-04-01
We present results of recent research where we have utilized a femtosecond laser to micro-structure silica and polymer optical fibres in order to realize versatile optical components such as diffractive optical elements on the fibre end face, the inscription of integrated waveguide circuits in the fibre cladding and novel optical fibre sensors designs based on Bragg gratings in the core. A major hurdle in tailoring or modifying the properties of optical fibres is the development of an inscription method that can prove to be a flexible and reliable process that is generally applicable to all optical fibre types; this requires careful matching of the laser parameters and optics in order to examine the spatial limits of direct laser writing, whether the application is structuring at the surface of the optical fibre or inscription in the core and cladding of the fibre. We demonstrate a variety of optical components such as two-dimensional grating structures, Bessel, Airy and vortex beam generators; moreover, optical bridging waveguides inscribed in the cladding of single-mode fibre as a means to selectively couple light from single-core to multi-core optical fibres, and demonstrate a grating based sensor; finally, we have developed a novel femtosecond laser inscription method for the precise inscription of tailored Bragg grating sensors in silica and polymer optical fibres. We also show that this novel fibre Bragg grating inscription technique can be used to modify and add versatility to an existing, encapsulated optical fibre pressure sensor.
Blau, Ashley; Brown, Alison; Mahanta, Lisa; Amr, Sami S.
2016-01-01
The Translational Genomics Core (TGC) at Partners Personalized Medicine (PPM) serves as a fee-for-service core laboratory for Partners Healthcare researchers, providing access to technology platforms and analysis pipelines for genomic, transcriptomic, and epigenomic research projects. The interaction of the TGC with various components of PPM provides it with a unique infrastructure that allows for greater IT and bioinformatics opportunities, such as sample tracking and data analysis. The following article describes some of the unique opportunities available to an academic research core operating within PPM, such as the ability to develop analysis pipelines with a dedicated bioinformatics team and maintain a flexible Laboratory Information Management System (LIMS) with the support of an internal IT team, as well as the operational challenges encountered in responding to emerging technologies, diverse investigator needs, and high staff turnover. In addition, the implementation and operational role of the TGC in the Partners Biobank genotyping project of over 25,000 samples is presented as an example of core activities working with other components of PPM. PMID:26927185
Analysis and Design of ITER 1 MV Core Snubber
NASA Astrophysics Data System (ADS)
Wang, Haitian; Li, Ge
2012-11-01
The core snubber, as a passive protection device, can suppress the arc current and absorb the energy stored in stray capacitance during electrical breakdown in the accelerating electrodes of the ITER NBI. In order to design the core snubber of ITER, the control parameters of the arc peak current were first analyzed by the Fink-Baker-Owren (FBO) method, which was used for designing the DIII-D 100 kV snubber. The B-H curve can be derived from the measured voltage and current waveforms, and the hysteresis loss of the core snubber can be derived using the revised parallelogram method. The core snubber can be represented in simplified form as an equivalent parallel resistance and inductance, which the FBO method neglects. A simulation code including the parallel equivalent resistance and inductance has been set up. The simulation and experiments show dramatically larger arc shorting currents due to the parallel inductance effect. The case shows that a core snubber designed with the FBO method gives a more compact design.
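The B-H reconstruction from winding waveforms described above follows from Faraday's and Ampère's laws. A minimal NumPy sketch under single-winding, uniform-core assumptions (the textbook loop-area loss relation, not the paper's revised parallelogram method):

```python
import numpy as np

def bh_from_waveforms(t, v, i, n_turns, area, path_len):
    """Recover the core's B(t) and H(t) from winding waveforms:
    B = (1/(N*A)) * integral of v dt (Faraday), H = N*i / l (Ampere),
    assuming a single winding and a uniform magnetic path."""
    db = 0.5 * (v[1:] + v[:-1]) * np.diff(t)          # trapezoidal v dt
    b = np.concatenate(([0.0], np.cumsum(db))) / (n_turns * area)
    h = n_turns * np.asarray(i) / path_len
    return b, h

def hysteresis_loss_per_cycle(b, h):
    """Loss per unit core volume per cycle = enclosed B-H loop area,
    computed with the shoelace formula on the closed (H, B) trajectory."""
    return 0.5 * abs(np.dot(h, np.roll(b, -1)) - np.dot(b, np.roll(h, -1)))
```

Multiplying the loop area by core volume and repetition frequency gives an average hysteresis power, one ingredient in sizing such a snubber.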
The tissue micro-array data exchange specification: a web based experience browsing imported data
Nohle, David G; Hackman, Barbara A; Ayers, Leona W
2005-01-01
Background: The AIDS and Cancer Specimen Resource (ACSR) is an HIV/AIDS tissue bank consortium sponsored by the National Cancer Institute (NCI) Division of Cancer Treatment and Diagnosis (DCTD). The ACSR offers approved researchers HIV-infected biologic samples and uninfected control tissues, including tissue cores in micro-arrays (TMA), accompanied by de-identified clinical data. Researchers interested in the type and quality of TMA tissue cores and the associated clinical data need an efficient method for viewing available TMA materials. Because each of the tissue samples within a TMA has separate data, including a core tissue digital image and clinical data, an organized, standard approach to producing, navigating and publishing such data is necessary. The Association for Pathology Informatics (API) extensible mark-up language (XML) TMA data exchange specification (TMA DES) proposed in April 2003 provides a common format for TMA data. Exporting TMA data into the proposed format offers an opportunity to implement the API TMA DES. Using our public BrowseTMA tool, we created a web site that organizes and cross-references TMA lists, digital "virtual slide" images, TMA DES export data, linked legends and clinical details for researchers. Microsoft Excel® and Microsoft Word® are used to convert tabular clinical data and produce an XML file in the TMA DES format. The BrowseTMA tool contains Extensible Stylesheet Language Transformation (XSLT) scripts that convert XML data into Hyper-Text Mark-up Language (HTML) web pages, with hyperlinks automatically added to allow rapid navigation. Results: Block lists, virtual slide images, legends, clinical details and exports have been placed on the ACSR web site for 14 blocks with 1623 cores of 2.0, 1.0 and 0.6 mm sizes. Our virtual microscope can be used to view and annotate these TMA images. Researchers can readily navigate from TMA block lists to TMA legends and to clinical details for a selected tissue core.
Exports for 11 blocks with 3812 cores from three other institutions were processed with the BrowseTMA tool. Fifty common data elements (CDE) from the TMA DES were used and 42 more were created for site-specific data. Researchers can download TMA clinical data in the TMA DES format. Conclusion: Virtual TMAs with clinical data can be viewed on the Internet by interested researchers using the BrowseTMA tool. We have organized our approach to producing, sorting, navigating and publishing TMA information to facilitate such review. We have converted Excel TMA data into TMA DES XML, and imported it and TMA DES XML from another institution into BrowseTMA to produce web pages that allow us to browse through the merged data. We proposed enhancements to the TMA DES as a result of this experience, and implemented improvements to the API TMA DES after using exported data from several institutions. A document type definition (DTD) was written for the API TMA DES (that optionally includes the proposed enhancements). Independent validators can be used to check exports against the DTD (with or without the proposed enhancements). Linking tissue core images to readily navigable clinical data greatly improves the value of the TMA. PMID:16086837
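Converting tabular clinical data into TMA DES-style XML, done with Excel in the workflow above, can equally be scripted. The sketch below uses Python's standard library; the element and attribute names (`tma_block`, `core`) are illustrative placeholders, not the actual TMA DES schema:

```python
import xml.etree.ElementTree as ET

def rows_to_tma_xml(block_id, rows):
    """Serialize one block's tabular core data as XML. Each row is a dict
    with a 'core_id' key plus arbitrary clinical data columns; every other
    column becomes a child element of that core."""
    root = ET.Element("tma_block", id=block_id)
    for row in rows:
        core = ET.SubElement(root, "core", id=row["core_id"])
        for key, value in row.items():
            if key != "core_id":
                ET.SubElement(core, key).text = str(value)
    return ET.tostring(root, encoding="unicode")
```

An export produced this way can then be checked against a DTD or schema by an independent validator, as the abstract describes.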
NASA Astrophysics Data System (ADS)
Ryang, Woo Hun; Kim, Seong Pil; Hahn, Jooyoung
2016-04-01
A geoacoustic model provides a model of the real seafloor with measured, extrapolated, and predicted values of geoacoustic environmental parameters, and it controls acoustic propagation modelling in underwater acoustics. In the Korean continental margin of the East Sea, this study reconstructed geoacoustic models using geoacoustic and marine geologic data of the Donghae-to-Gangneung region (37.4° to 37.8° in latitude). The models were based on high-resolution subbottom and air-gun seismic profiles with sediment cores. The Donghae region comprised measured P-wave velocities and attenuations of the cores, whereas the Gangneung region comprised regression values using measured values from the adjacent areas. Geoacoustic data of the cores were extrapolated down to the depth of the geoacoustic models. For actual modeling, the P-wave speed of the models was compensated to in situ depth below the sea floor using the Hamilton method. These geoacoustic models probably contribute to geoacoustic and underwater acoustic modelling reflecting vertical and lateral variability of acoustic properties in the Korean continental margin of the western East Sea. Keywords: geoacoustic model, environmental parameter, East Sea, continental margin. Acknowledgements: This research was supported by research grants from the Agency of Defense Development (UD140003DD and UE140033DD).
NASA Technical Reports Server (NTRS)
Ratcliffe, James G.; Jackson, Wade C.
2008-01-01
A simple analysis method has been developed for predicting the residual compressive strength of impact-damaged sandwich panels. The method is tailored for honeycomb core-based sandwich specimens that exhibit an indentation growth failure mode under axial compressive loading, which is driven largely by the crushing behavior of the core material. The analysis method is in the form of a finite element model, where the impact-damaged facesheet is represented using shell elements and the core material is represented using spring elements, aligned in the thickness direction of the core. The nonlinear crush response of the core material used in the analysis is based on data from flatwise compression tests. A comparison with a previous analysis method and some experimental data shows good agreement with results from this new approach.
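The core-spring idealization described above hinges on sampling a nonlinear crush curve from flatwise compression tests. A minimal sketch of one such through-thickness spring follows; the curve values below are invented placeholders for illustration, not the paper's test data:

```python
import numpy as np

# Illustrative flatwise-compression crush curve for a honeycomb core:
# through-thickness crush displacement (mm) vs. crush stress (MPa),
# with an initial peak, a plateau, and final densification.
CRUSH_DISP = np.array([0.0, 0.05, 0.10, 1.0, 2.0])
CRUSH_STRESS = np.array([0.0, 2.0, 1.0, 1.0, 3.0])

def core_spring_force(delta, tributary_area):
    """Force carried by one through-thickness core spring: the nonlinear
    crush response interpolated from test data, scaled by the area of
    core material that the spring represents."""
    return float(np.interp(delta, CRUSH_DISP, CRUSH_STRESS)) * tributary_area
```

In the finite element model each spring under a shell node would follow such a curve, so indentation growth emerges from progressive crushing of neighboring springs.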
Pistollato, Francesca; Ohayon, Elan L; Lam, Ann; Langley, Gillian R; Novak, Thomas J; Pamies, David; Perry, George; Trushina, Eugenia; Williams, Robin S B; Roher, Alex E; Hartung, Thomas; Harnad, Stevan; Barnard, Neal; Morris, Martha Clare; Lai, Mei-Chun; Merkley, Ryan; Chandrasekera, P Charukeshi
2016-06-28
Much of Alzheimer disease (AD) research has been traditionally based on the use of animals, which have been extensively applied in an effort to both improve our understanding of the pathophysiological mechanisms of the disease and to test novel therapeutic approaches. However, decades of such research have not effectively translated into substantial therapeutic success for human patients. Here we critically discuss these issues in order to determine how existing human-based methods can be applied to study AD pathology and develop novel therapeutics. These methods, which include patient-derived cells, computational analysis and models, together with large-scale epidemiological studies represent novel and exciting tools to enhance and forward AD research. In particular, these methods are helping advance AD research by contributing multifactorial and multidimensional perspectives, especially considering the crucial role played by lifestyle risk factors in the determination of AD risk. In addition to research techniques, we also consider related pitfalls and flaws in the current research funding system. Conversely, we identify encouraging new trends in research and government policy. In light of these new research directions, we provide recommendations regarding prioritization of research funding. The goal of this document is to stimulate scientific and public discussion on the need to explore new avenues in AD research, considering outcome and ethics as core principles to reliably judge traditional research efforts and eventually undertake new research strategies.
Higashimoto, Makiko; Takahashi, Masahiko; Jokyu, Ritsuko; Syundou, Hiromi; Saito, Hidetsugu
2007-11-01
An HCV core antigen (Ag) detection assay system, Lumipulse Ortho HCV Ag, has been developed and is commercially available in Japan with a lower detection limit of 50 fmol/l, which is equivalent to 20 KIU/ml in the PCR quantitative assay. The HCV core Ag assay has the advantage of a broader dynamic range compared with the PCR assay, but its sensitivity is lower. We developed a novel HCV core Ag concentration method using polyethylene glycol (PEG), which improves the sensitivity fivefold over the original assay. The reproducibility was examined by five consecutive measurements of HCV patient serum, in which the results of the original and concentrated HCV core Ag methods were 56.8 +/- 8.1 fmol/l (mean +/- SD), CV 14.2%, and 322.9 +/- 45.5 fmol/l, CV 14.0%, respectively. The assay results of HCV-negative samples in the original HCV core Ag assay were all 0.1 fmol/l, and the results were the same in the concentration method. The results of the concentration method were 5.7 times higher than the original assay, almost equal to the theoretical rate as expected. The assay results of serially diluted samples were also as expected in both the original and concentration assays. We confirmed that the sensitivity of the HCV core Ag concentration method was almost the same as that of the PCR high-range assay in a competitive study using serially monitored samples from five HCV patients during interferon therapy. A novel concentration method using PEG in the HCV core Ag assay system seems to be useful for assessing and monitoring interferon treatment for HCV.
Standardization of Laboratory Methods for the PERCH Study
Karron, Ruth A.; Morpeth, Susan C.; Bhat, Niranjan; Levine, Orin S.; Baggett, Henry C.; Brooks, W. Abdullah; Feikin, Daniel R.; Hammitt, Laura L.; Howie, Stephen R. C.; Knoll, Maria Deloria; Kotloff, Karen L.; Madhi, Shabir A.; Scott, J. Anthony G.; Thea, Donald M.; Adrian, Peter V.; Ahmed, Dilruba; Alam, Muntasir; Anderson, Trevor P.; Antonio, Martin; Baillie, Vicky L.; Dione, Michel; Endtz, Hubert P.; Gitahi, Caroline; Karani, Angela; Kwenda, Geoffrey; Maiga, Abdoul Aziz; McClellan, Jessica; Mitchell, Joanne L.; Morailane, Palesa; Mugo, Daisy; Mwaba, John; Mwansa, James; Mwarumba, Salim; Nyongesa, Sammy; Panchalingam, Sandra; Rahman, Mustafizur; Sawatwong, Pongpun; Tamboura, Boubou; Toure, Aliou; Whistler, Toni; O’Brien, Katherine L.; Murdoch, David R.
2017-01-01
The Pneumonia Etiology Research for Child Health study was conducted across 7 diverse research sites and relied on standardized clinical and laboratory methods for the accurate and meaningful interpretation of pneumonia etiology data. Blood, respiratory specimens, and urine were collected from children aged 1–59 months hospitalized with severe or very severe pneumonia and community controls of the same age without severe pneumonia and were tested with an extensive array of laboratory diagnostic tests. A standardized testing algorithm and standard operating procedures were applied across all study sites. Site laboratories received uniform training, equipment, and reagents for core testing methods. Standardization was further assured by routine teleconferences, in-person meetings, site monitoring visits, and internal and external quality assurance testing. Targeted confirmatory testing and testing by specialized assays were done at a central reference laboratory. PMID:28575358
A new method for teaching physical examination to junior medical students
Sayma, Meelad; Williams, Hywel Rhys
2016-01-01
Introduction: Teaching effective physical examination is a key component in the education of medical students. Preclinical medical students often have insufficient clinical knowledge to apply to physical examination recall, which may hinder their learning when taught through certain understanding-based models. This pilot project aimed to develop a method to teach physical examination to preclinical medical students using “core clinical cases”, overcoming the need for “rote” learning. Methods: This project was developed utilizing three cycles of planning, action, and reflection. Thematic analysis of feedback was used to improve this model and ensure it met student expectations. Results and discussion: A model core clinical case developed in this project is described, with gout as the basis for a “foot and ankle” examination. Key limitations and difficulties encountered on implementation of this pilot are discussed for future users, including the difficulty encountered with “content overload”. Conclusion: This approach aims to teach junior medical students physical examination through understanding, using a simulated patient environment. Robust research is now required to demonstrate efficacy and repeatability in the physical examination of other systems. PMID:26937208
NASA Astrophysics Data System (ADS)
Freudenthal, Tim; Bergenthal, Markus; Bohrmann, Gerhard; Pape, Thomas; Kopf, Achim; Huhn-Frehers, Katrin; Gohl, Karsten; Wefer, Gerold
2017-04-01
The MARUM-MeBo (abbreviation for Meeresboden-Bohrgerät, the German term for seafloor drill rig) is a robotic drilling system that has been under development since 2004 at the MARUM Center for Marine Environmental Sciences at the University of Bremen in close cooperation with Bauer Maschinen GmbH and other industry partners. The MARUM-MeBo drill rigs can be deployed from multipurpose research vessels such as RV MARIA S. MERIAN, RV METEOR, RV SONNE and RV POLARSTERN and are used for obtaining long cores in both soft sediments and hard rocks in the deep sea. The first-generation drill rig, the MARUM-MeBo70, is designed for a drilling depth of more than 70 m (Freudenthal and Wefer, 2013). Between 2005 and 2016 it was deployed on 17 research expeditions and drilled about 3 km into different types of geology, including carbonate and crystalline rocks, gas hydrates, glacial tills, sands and gravel, and hemipelagic mud, with an average recovery rate of about 70%. We used the development and operational experience gained with MARUM-MeBo70 to develop a second-generation drill rig, the MARUM-MeBo200, designed for core drilling down to 200 m below the sea floor. After successful sea trials in the North Sea in October 2014, the MeBo200 was used on a scientific expedition on the research vessel RV SONNE (SO247) in March/April 2016. During 12 deployments we drilled a total of 514 m in hemipelagic sediments with volcanic ashes as well as in muddy and sandy slide deposits off New Zealand. The average core recovery was about 54%. The maximum drilling depth was 105 m below the sea floor. Developments of the MeBo drilling technology include a pressure core barrel that has been successfully deployed on two research expeditions so far. Bore hole logging adds to the coring capacity.
Several autonomous logging probes have been developed in recent years for deployment with MeBo in the logging-while-tripping mode, a sonic probe measuring in situ p-wave velocity being the latest development. Various bore hole monitoring systems were developed and deployed with the MeBo system. They allow for long-term monitoring of pressure variability within the sealed bore holes. References: Freudenthal, T and Wefer, G (2013) Drilling cores on the sea floor with the remote-controlled sea floor drilling rig MeBo. Geoscientific Instrumentation, Methods and Data Systems, 2(2). 329-337. doi:10.5194/gi-2-329-2013
2013-10-01
Role of copy number variants in prostate cancer risk and progression using a novel genome-wide screening method. Keywords: prostate; cancer; risk; deletion; prognosis. Supported in part by DOD grant PC081025, by an Early Detection Research Network grant of the National Cancer Institute, and by CTRC at UTHSCSA grant P30CA054174; data were generated by the Genomics Core Shared Resource, which is supported by NCI P30CA054174 (CTRC at UTHSCSA).
Coherent diffractive imaging methods for semiconductor manufacturing
NASA Astrophysics Data System (ADS)
Helfenstein, Patrick; Mochi, Iacopo; Rajeev, Rajendran; Fernandez, Sara; Ekinci, Yasin
2017-12-01
The paradigm shift of the semiconductor industry moving from deep ultraviolet to extreme ultraviolet lithography (EUVL) brought about new challenges in the fabrication of illumination and projection optics, which constitute one of the core sources of cost of ownership for many of the metrology tools needed in the lithography process. For this reason, lensless imaging techniques based on coherent diffractive imaging have started to raise interest in the EUVL community. This paper presents an overview of ongoing research endeavors that use a number of methods based on lensless imaging with coherent light.
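Coherent diffractive imaging recovers an image computationally from diffraction magnitudes alone. As a hedged illustration of the class of algorithm involved (not the specific methods surveyed in the paper), below is a minimal error-reduction phase-retrieval loop; the object, its support, and the iteration count are invented assumptions.

```python
import numpy as np

# Minimal error-reduction phase retrieval: the "detector" records only the
# Fourier magnitudes of an unknown object; the object is recovered by
# alternating between the measured-magnitude constraint in Fourier space
# and support/positivity constraints in real space.
rng = np.random.default_rng(0)
n = 64
obj = np.zeros((n, n))
obj[24:40, 24:40] = rng.random((16, 16))   # unknown object inside its support
support = obj > 0                           # support assumed known
measured = np.abs(np.fft.fft2(obj))         # magnitudes only; phases are lost

guess = rng.random((n, n)) * support        # random start inside the support
errs = []
for _ in range(200):
    F = np.fft.fft2(guess)
    errs.append(np.linalg.norm(np.abs(F) - measured))      # Fourier-domain error
    F = measured * np.exp(1j * np.angle(F))                # impose known modulus
    g = np.fft.ifft2(F).real
    guess = np.where(support, np.clip(g, 0.0, None), 0.0)  # support + positivity

print(f"Fourier-magnitude error: {errs[0]:.1f} -> {errs[-1]:.1f}")
```

Error reduction is guaranteed not to increase this error metric from one iteration to the next, which is why the final error is no larger than the initial one even when convergence stagnates.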
Gordetsky, Jennifer B; Schultz, Luciana; Porter, Kristin K; Nix, Jeffrey W; Thomas, John V; Del Carmen Rodriguez Pena, Maria; Rais-Bahrami, Soroush
2018-06-01
Magnetic resonance (MR)/ultrasound fusion-targeted biopsy (TB) routinely samples multiple cores from each MR lesion of interest. Pathologists can evaluate the extent of cancer involvement and grade using an individual core (IC) or aggregate (AG) method, which could potentially lead to differences in reporting. We reviewed patients who underwent TB followed by radical prostatectomy (RP). TB cores were evaluated for grade and tumor extent by 2 methods. In the IC method, the grade for each TB lesion was based on the core with the highest Gleason score. Tumor extent for each TB was based on the core with the highest percent of tumor involvement. In the AG method, the tumor from all cores within each TB lesion was aggregated to determine the final composite grade and percentage of tumor involvement. Each method was compared with MR lesional volume, MR lesional density (lesion volume/prostate volume), and RP. Fifty-five patients underwent TB followed by RP. Extent of tumor by the AG method showed a better correlation with target lesion volume (r = 0.27, P = .022) and lesional density (r = 0.32, P = .008) than did the IC method (r = 0.19 [P = .103] and r = 0.22 [P = .062], respectively). Extent of tumor on TB was associated with extraprostatic extension on RP by the AG method (P = .04), but not by the IC method. This association was significantly higher in patients with a grade group (GG) of 3 or higher (P = .03). A change in cancer grade occurred in 3 patients when comparing methods (2 downgraded GG3 to GG2, 1 downgraded GG4 to GG3 by the AG method). For multiple cores obtained via TB, the AG method better correlates with target lesion volume, lesional density, and extraprostatic extension. Copyright © 2018 Elsevier Inc. All rights reserved.
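The IC and AG reporting methods can be sketched in a few lines of code. The core measurements below are invented, and the composite-grade rule (carrying the grade of the dominant tumor focus) is a simplifying assumption for illustration, not the study's exact grading procedure.

```python
# Toy comparison of the individual core (IC) and aggregate (AG) methods
# for reporting multiple targeted-biopsy cores from one MRI lesion.

def individual_core(cores):
    """IC method: lesion grade and extent taken from the single worst core."""
    grade = max(c["grade_group"] for c in cores)
    extent = max(c["tumor_mm"] / c["core_mm"] for c in cores)
    return grade, round(100 * extent, 1)

def aggregate(cores):
    """AG method: tumor from all cores pooled into one composite figure."""
    total_tumor = sum(c["tumor_mm"] for c in cores)
    total_core = sum(c["core_mm"] for c in cores)
    # Simplifying assumption: composite grade carried by the dominant focus.
    dominant = max(cores, key=lambda c: c["tumor_mm"])
    return dominant["grade_group"], round(100 * total_tumor / total_core, 1)

cores = [  # invented cores: grade group, tumor length, core length (mm)
    {"grade_group": 3, "tumor_mm": 1.0, "core_mm": 12.0},
    {"grade_group": 2, "tumor_mm": 6.0, "core_mm": 10.0},
    {"grade_group": 2, "tumor_mm": 4.0, "core_mm": 11.0},
]
print(individual_core(cores))  # worst single core dominates the report
print(aggregate(cores))        # pooled extent reflects the whole lesion
```

The toy data makes the reporting difference visible: the IC method reports GG3 with 60% involvement (the worst core), while the AG method reports GG2 with 33.3% involvement (the pooled lesion).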
Using Powder Cored Tubular Wire Technology to Enhance Electron Beam Freeform Fabricated Structures
NASA Technical Reports Server (NTRS)
Gonzales, Devon; Liu, Stephen; Domack, Marcia; Hafley, Robert
2016-01-01
Electron Beam Freeform Fabrication (EBF3) is an additive manufacturing technique, developed at NASA Langley Research Center, capable of fabricating large-scale aerospace parts. Advantages of using EBF3 as opposed to conventional manufacturing methods include decreased design-to-product time, decreased wasted material, and the ability to adapt controls to produce geometrically complex parts with properties comparable to wrought products. However, to fully exploit the potential of the EBF3 process, materials tailored for the process must be developed. Powder cored tubular wire (PCTW) technology was used to modify Ti-6Al-4V and Al 6061 feedstock to enhance alloy content, refine grain size, and create a metal matrix composite in the as-solidified structures, respectively.
Fabrication of micro-alginate gel tubes utilizing micro-gelatin fibers
NASA Astrophysics Data System (ADS)
Sakaguchi, Katsuhisa; Arai, Takafumi; Shimizu, Tatsuya; Umezu, Shinjiro
2017-05-01
Tissues engineered utilizing biofabrication techniques have recently been the focus of much attention, because these bioengineered tissues have great potential to improve the quality of life of patients with various hard-to-treat diseases. Most tissues contain micro-tubular structures including blood vessels, lymphatic vessels, and bile canaliculus. Therefore, we bioengineered a micro diameter tube using alginate gel to coat the core gelatin gel. Micro-gelatin fibers were fabricated by the coacervation method and then coated with a very thin alginate gel layer by dipping. A micro diameter alginate tube was produced by dissolving the core gelatin gel. Consequently, these procedures led to the formation of micro-alginate gel tubes of various shapes and sizes. This biofabrication technique should contribute to tissue engineering research fields.
[Principles for molecular identification of traditional Chinese materia medica using DNA barcoding].
Chen, Shi-Lin; Yao, Hui; Han, Jian-Ping; Xin, Tian-Yi; Pang, Xiao-Hui; Shi, Lin-Chun; Luo, Kun; Song, Jing-Yuan; Hou, Dian-Yun; Shi, Shang-Mei; Qian, Zhong-Zhi
2013-01-01
As research on the molecular identification of Chinese Materia Medica (CMM) using DNA barcodes has developed and spread rapidly, the principle of this method has been approved for listing in the Supplement of the Pharmacopoeia of the People's Republic of China. Based on studies of comprehensive samples, DNA barcoding systems have been established to identify CMM: ITS2 as a core barcode and psbA-trnH as a complementary locus for the identification of medicinal plants, and COI as a core barcode and ITS2 as a complementary locus for the identification of medicinal animals. This article introduces the principle of molecular identification of CMM using DNA barcoding and its drafting instructions. Furthermore, its application perspective is discussed.
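The two-tier core/complementary barcode principle amounts to a fallback decision rule. The sketch below is a toy illustration of that rule only: the position-match similarity metric, the 0.9 threshold, and the two reference sequences are invented assumptions, not the Pharmacopoeia's actual identification procedure.

```python
# Toy two-locus barcode identification: try the core barcode (ITS2) first,
# fall back to the complementary locus (psbA-trnH) when the core locus
# cannot separate candidate species.

def similarity(a, b):
    """Fraction of matching positions (toy metric; real work uses alignment)."""
    return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))

def identify(query, references, threshold=0.9):
    for locus in ("ITS2", "psbA-trnH"):          # core first, then complement
        hits = [(similarity(query[locus], ref[locus]), ref["species"])
                for ref in references if locus in ref and locus in query]
        if not hits:
            continue
        best = max(hits)
        near_ties = {sp for s, sp in hits if s >= best[0] - 0.02}
        if best[0] >= threshold and len(near_ties) == 1:
            return best[1], locus                # unambiguous hit at this locus
    return None, None

refs = [  # invented 10-bp reference "barcodes" for two related species
    {"species": "Panax ginseng",       "ITS2": "ACGTACGTGG", "psbA-trnH": "TTGACCATGA"},
    {"species": "Panax quinquefolius", "ITS2": "ACGTACGTGA", "psbA-trnH": "TTGTCCGTGA"},
]
query = {"ITS2": "ACGTACGTGC", "psbA-trnH": "TTGTCCGTGA"}
print(identify(query, refs))
```

In this toy case ITS2 scores the two species identically (a near-tie), so the rule falls back to psbA-trnH, which resolves the query unambiguously.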
Dieckmann, Peter; Phero, James C; Issenberg, S Barry; Kardong-Edgren, Suzie; Ostergaard, Doris; Ringsted, Charlotte
2011-08-01
In this article, we describe the preparation and execution of the first Research Consensus Summit (Summit) of the Society for Simulation in Healthcare (SSH), held in January 2011 in New Orleans, Louisiana. The goals of the Summit were to provide guidance for better simulation-related research, to broaden the scope of topics investigated, and to highlight the importance of simulation-related research. An international Core Group (the authors of this article) worked with the SSH Research Committee to identify 10 topic areas relevant for future research that would be examined by the 10 Topic Groups composed of Topic Chairs and Topic Group Members. Each Topic Group prepared a monograph and slide presentation on their topic, which was presented at the 2-day Summit. The audience provided feedback on each presentation. Based on this feedback, the Topic Groups revised their presentations and monographs for publication in this supplement to Simulation in Healthcare. The Core Group has synthesized an overview of the key Summit themes in this article. In some groups, participants agreed that there is currently no consensus on certain aspects of the state of the science. Several key themes emerged from the Topic Groups. The conceptual and theoretical bases of simulation-related research, as well as the methods used and their methodological foundations, need to be more explicitly described in future publications. Although no single method is inherently better, the mix of research methods chosen should match the goal of each study. The impact of simulation, whether direct or indirect, needs to be assessed across different levels of training, and larger, more complex contexts need to be taken into account. When interpreting simulation-related research, the ecological validity of the results needs to be taken into consideration.
The scope of simulation-related research can be widened from having simulation as the focus of research (research about simulation), to using simulation to investigate other research questions (research with simulation). Simulation-related research can benefit from an improved understanding of structural differences and similarities with other domains. The development of simulation equipment and concepts will benefit from applying known and available science-based design frameworks. Overall, the context of simulation-related research needs to be better understood. The progress of research depends on building overarching and sustainable research programs that relate individual studies with each other. The Summit was successful in taking a snapshot of the state of the science. Future summits might explore these topics further, monitor progress, and address new topics.
Hicks, Joshua; Adrian, Betty
2009-01-01
The Core Research Center (CRC) of the U.S. Geological Survey (USGS), located at the Denver Federal Center in Lakewood, Colo., currently houses rock core from more than 8,500 boreholes representing about 1.7 million feet of rock core from 35 States and cuttings from 54,000 boreholes representing 238 million feet of drilling in 28 States. Although most of the boreholes are located in the Rocky Mountain region, the geologic and geographic diversity of samples has helped the CRC become one of the largest and most heavily used public core repositories in the United States. Many of the boreholes represented in the collection were drilled for energy and mineral exploration, and many of the cores and cuttings were donated to the CRC by private companies in these industries. Some cores and cuttings were collected by the USGS along with other government agencies. Approximately one-half of the cores are slabbed and photographed. More than 18,000 thin sections and a large volume of analytical data from the cores and cuttings are also accessible. A growing collection of digital images of the cores is also becoming available on the CRC Web site at http://geology.cr.usgs.gov/crc/.
A study of tensile test on open-cell aluminum foam sandwich
NASA Astrophysics Data System (ADS)
Ibrahim, N. A.; Hazza, M. H. F. Al; Adesta, E. Y. T.; Abdullah Sidek, Atiah Bt.; Endut, N. A.
2018-01-01
Aluminum foam sandwich (AFS) panels are among the growing materials in various industries because of their lightweight behavior. AFS is also known for an excellent stiffness-to-weight ratio and high energy absorption. Because of these advantages, many researchers have shown interest in aluminum foam materials for expanding the use of foam structures. However, a gap still needs to be filled in order to develop reliable data on the mechanical behavior of AFS under different parameters and analysis approaches, and few researchers have focused on open-cell aluminum foam and statistical analysis. Thus, this research was conducted using an open-cell aluminum foam core of grade 6101 with aluminum sheet skins, tested under tension. The data were analyzed using a full factorial design in JMP statistical analysis software (version 11). The ANOVA results show a significant model value of less than 0.500, while the scatter diagrams and 3D surface profiler plots show that skin thickness has a more significant impact on stress/strain values than core thickness.
NASA Technical Reports Server (NTRS)
Stoker, C. R.; Clarke, J. D. A.; Direito, S.; Foing, B.
2011-01-01
The DOMEX program is a NASA-MMAMA funded project featuring simulations of human crews on Mars, focused on science activities that involve collecting samples from the subsurface using both manual and robotic methods and analyzing them in the field and post-mission. A crew simulating a human mission to Mars performed activities focused on subsurface science for 2 weeks in November 2009 at the Mars Desert Research Station near Hanksville, Utah, an important chemical and morphological Mars analog site. Activities performed included 1) surveying the area to identify geologic provinces; 2) obtaining soil and rock samples from each province and characterizing their mineralogy, chemistry, and biology; 3) site selection and reconnaissance for a future drilling mission; 4) deployment and testing of the Mars Underground Mole, a percussive robotic soil sampling device; and 5) recording and analyzing how crew time was used to accomplish these tasks. This paper summarizes results from analysis of soil cores
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; Leland M. Montierth
2013-03-01
In its deployment as a pebble bed reactor (PBR) critical facility from 1992 to 1996, the PROTEUS facility was designated as HTR-PROTEUS. This experimental program was performed as part of an International Atomic Energy Agency (IAEA) Coordinated Research Project (CRP) on the Validation of Safety Related Physics Calculations for Low Enriched HTGRs. Within this project, critical experiments were conducted for graphite moderated LEU systems to determine core reactivity, flux and power profiles, reaction-rate ratios, the worth of control rods, both in-core and reflector based, the worth of burnable poisons, kinetic parameters, and the effects of moisture ingress on these parameters. One benchmark experiment was evaluated in this report: Core 4. Core 4 represents the only configuration with random pebble packing in the HTR-PROTEUS series of experiments, and has a moderator-to-fuel pebble ratio of 1:1. Three random configurations were performed. The initial configuration, Core 4.1, was rejected because the method for pebble loading, separate delivery tubes for the moderator and fuel pebbles, may not have been completely random; this core loading was rejected by the experimenters. Cores 4.2 and 4.3 were loaded using a single delivery tube, eliminating the possibility for systematic ordering effects. The second and third cores differed slightly in the quantity of pebbles loaded (40 each of moderator and fuel pebbles), stacked height of the pebbles in the core cavity (0.02 m), withdrawn distance of the stainless steel control rods (20 mm), and withdrawn distance of the autorod (30 mm). The 34 coolant channels in the upper axial reflector and the 33 coolant channels in the lower axial reflector were open. Additionally, the axial graphite fillers used in all other HTR-PROTEUS configurations to create a 12-sided core cavity were not used in the randomly packed cores.
Instead, graphite fillers were placed on the cavity floor, creating a funnel-like base, to discourage ordering effects during pebble loading. Core 4 was determined to be an acceptable benchmark experiment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bess, John D.; Montierth, Leland M.; Sterbentz, James W.
2014-03-01
In its deployment as a pebble bed reactor (PBR) critical facility from 1992 to 1996, the PROTEUS facility was designated as HTR-PROTEUS. This experimental program was performed as part of an International Atomic Energy Agency (IAEA) Coordinated Research Project (CRP) on the Validation of Safety Related Physics Calculations for Low Enriched HTGRs. Within this project, critical experiments were conducted for graphite moderated LEU systems to determine core reactivity, flux and power profiles, reaction-rate ratios, the worth of control rods, both in-core and reflector based, the worth of burnable poisons, kinetic parameters, and the effects of moisture ingress on these parameters. One benchmark experiment was evaluated in this report: Core 4. Core 4 represents the only configuration with random pebble packing in the HTR-PROTEUS series of experiments, and has a moderator-to-fuel pebble ratio of 1:1. Three random configurations were performed. The initial configuration, Core 4.1, was rejected because the method for pebble loading, separate delivery tubes for the moderator and fuel pebbles, may not have been completely random; this core loading was rejected by the experimenters. Cores 4.2 and 4.3 were loaded using a single delivery tube, eliminating the possibility for systematic ordering effects. The second and third cores differed slightly in the quantity of pebbles loaded (40 each of moderator and fuel pebbles), stacked height of the pebbles in the core cavity (0.02 m), withdrawn distance of the stainless steel control rods (20 mm), and withdrawn distance of the autorod (30 mm). The 34 coolant channels in the upper axial reflector and the 33 coolant channels in the lower axial reflector were open. Additionally, the axial graphite fillers used in all other HTR-PROTEUS configurations to create a 12-sided core cavity were not used in the randomly packed cores.
Instead, graphite fillers were placed on the cavity floor, creating a funnel-like base, to discourage ordering effects during pebble loading. Core 4 was determined to be an acceptable benchmark experiment.
Lower Anogenital Tract Disease Therapy Outcomes, COMET, and CROWN: Call for Research Submissions.
Andrews, Jeffrey
2015-10-01
There is a problem of inconsistent and inappropriate outcome selection for research studies. We can improve the relevance of research results for women and for their physicians and clinicians by encouraging researchers to critically evaluate outcome measures and use valid, appropriate, standardized measures. To this purpose, and to facilitate synthesis of the evidence, outcomes reported by clinical studies should be standardized for different disease conditions through the development of core outcome sets (COS). There is an international effort toward reaching consensus on outcome measures and establishing COS, which represent agreed-upon standardized collections of outcome measures to be reported in all studies within a clinical area. Across clinical specialties, the Core Outcome Measures in Effectiveness Trials (COMET) initiative was launched in 2010. In 2014, the editors of women's health journals answered the challenge of COMET and formed the Core Outcomes in Women's Health initiative. The Journal of Lower Genital Tract Disease is a participating member of the Core Outcomes in Women's Health consortium. There is broad inconsistency in outcome measures and reporting in the field of lower anogenital tract diseases. No core outcome sets currently exist. Suggested target conditions in anogenital disease are vulvar dermatoses, cervical intraepithelial neoplasia, and vulvodynia. Investigators are encouraged to conduct secondary systematic research to determine previously reported primary outcome measures and suggest domains for COS. The Core Outcomes in Women's Health initiative and COMET encourage the formation of consensus panels of stakeholders (researchers, health care providers, patients, and others) to recommend outcome domains and COS and then publish their report.
The Role of Natural Hydrate on the Strength of Sands: Load-bearing or Cementing?
NASA Astrophysics Data System (ADS)
Priest, J. A.; Hayley, J. L.
2017-12-01
The strength of hydrate-bearing sands is a key parameter for simulating the long-term performance of hydrate reservoirs during gas production and assessing reservoir and wellbore stability. Historically this parameter has been determined from testing synthesized hydrate sand samples, which has led to significant differences in measured strength that appear to reflect the different formation methods adopted. At present, formation methods can be grouped into either those that form hydrate at grain contacts, leading to a high-strength `cemented' sand, or those where the hydrate forms a `load-bearing' structure in which the hydrate grains reside in the pore space, resulting in more subtle changes in strength. Recovered natural hydrate-bearing cores typically exhibit this `load-bearing' behavior, although these cores have generally undergone significant changes in temperature and pressure during recovery, which may have altered the structure of the hydrate and sediment. Recent drilling expeditions using pressure coring, such as NGHP2 offshore India, have enabled intact hydrate-bearing sediments to be recovered that have maintained hydrostatic stresses, minimizing any changes in the hydrate structure within the core. Triaxial testing on these samples highlights enhanced strength even at zero effective stress. This suggests that the hydrate forms a connected framework within the pore space, apparently `cementing' the sand grains in place: we differentiate here between true cementation, where hydrate is sintered onto the sand grains, and the typically observed behavior of cemented sands (cohesion, peak strength, post-peak strain softening). This inter-connected hydrate, and its ability to increase the strength of the sands, appears to occur even at hydrate saturations as low as 30%, where typical `load-bearing' hydrates just start to increase strength. 
The results from pressure cores suggest that hydrate formation techniques that lead to `load-bearing' behavior may not capture the true interaction between the hydrate and sand and thus further research is needed to form synthesized hydrate bearing samples that more realistically mimic the observed strength behavior of natural hydrate bearing cores.
Validation of Persian Version of PedsQL™ 4.0 Generic Core Scales in Toddlers and Children
Gheissari, Alaleh; Farajzadegan, Ziba; Heidary, Maryam; Salehi, Fatemeh; Masaeli, Ali; Mazrooei, Amin; Varni, James W; Fallah, Zahra; Zandieh, Fariborz
2012-01-01
Introduction: To evaluate the reliability, validity and feasibility of the Persian version of the Pediatric Quality of Life Inventory (PedsQL™ 4.0) Generic Core Scales in Iranian healthy students ages 7-15 and chronically ill children ages 2-18. Methods: We followed the translation methodology proposed by the developer to validate the Persian version of the PedsQL™ 4.0 Generic Core Scales for children. Six hundred and sixty children and adolescents and their parents were enrolled. A sample of 160 healthy students was chosen by the random cluster method from 4 regions of the Isfahan education offices, and 60 chronically ill children were recruited from St. Alzahra hospital private clinics. The questionnaires were completed by the participants. Results: The Persian version of the PedsQL™ 4.0 Generic Core Scales discriminated between healthy and chronically ill children (the healthy students' mean score was 12.3 points higher than that of the chronically ill children, P<0.001). Cronbach's alpha internal consistency values exceeded 0.7 for children's self-reports and for proxy reports of children 5-7 years old and 13-18 years old. Reliability of proxy reports for 2-4 year olds was much lower than 0.7. Although proxy reports for chronically ill children 8-12 years old exceeded 0.7, the reports for healthy children of the same age group were slightly lower than 0.7. Construct, criterion, face and content validity were acceptable. In addition, the Persian version of the PedsQL™ 4.0 Generic Core Scales was feasible and easy to complete. Conclusion: Results showed that the Persian version of the PedsQL™ 4.0 Generic Core Scales is valid and acceptable for pediatric health research. It is necessary to adapt the scoring of the 2-4 years old questionnaire and to find a way to increase reliability for healthy children aged 8-12 years in particular, according to Iranian culture. PMID:22701775
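The internal-consistency statistic reported above is Cronbach's alpha. A minimal from-scratch sketch of the standard formula follows; the item scores are invented for illustration and are not the study's data.

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances)/variance(totals))

def cronbach_alpha(items):
    """items: list of per-item score lists, one score per respondent."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Toy data: 4 scale items answered by 5 respondents, fairly consistent answers.
items = [
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [4, 4, 1, 5, 3],
    [3, 4, 2, 5, 5],
]
alpha = cronbach_alpha(items)
print(f"Cronbach's alpha = {alpha:.2f}")
```

Values above the conventional 0.7 cut-off, as in the study's self-reports, indicate acceptable internal consistency; this toy data yields about 0.93.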
pH sensitive core-shell magnetic nanoparticles for targeted drug delivery in cancer therapy.
Lungu, Iulia Ioana; Rădulescu, Marius; Mogoşanu, George Dan; Grumezescu, Alexandru Mihai
2016-01-01
In the last decade, nanobiotechnology has evolved rapidly, with an extensive impact on the biomedical area. In order to improve bioavailability and minimize adverse effects, drug delivery systems based on magnetic nanocomposites are under development, mainly for cancer imaging and antitumor therapy. In this regard, pH-sensitive core-shell magnetic nanoparticles (NPs) with accurately controlled size and shape are synthesized by various modern methods, such as homogeneous precipitation, coprecipitation, microemulsion or polyol approaches, high-temperature and hydrothermal reactions, sol-gel reactions, aerosol/vapor processes and sonolysis. Due to their unique combined physico-chemical and biological properties (such as higher dispersibility, chemical and thermal stability, biocompatibility), pH-responsive core-shell magnetic NPs are widely investigated for controlled release of cytostatic drugs into the tumor site by means of pH change: magnetite@silicon dioxide (Fe3O4@SiO2), Fe3O4@titanium dioxide (TiO2), β-thiopropionate-polyethylene glycol (PEG)-modified Fe3O4@mSiO2, Fe3O4 NP cores coated with SiO2 with an imidazole-group-modified PEG-polypeptide (mPEG-poly-L-Asparagine), polyacrylic acid (PAA) and folic acid (FA) coatings of the iron oxide NP core, methoxy polyethylene glycol-block-polymethacrylic acid-block-polyglycerol monomethacrylate (MPEG-b-PMAA-b-PGMA) attached by a PGMA block to a Fe3O4 core, a PEG-modified polyamidoamine (PAMAM) dendrimer shell with an Fe3O4 core, and mesoporous silica coated on Fe3O4, mostly loaded with an anticancer drug. This review paper highlights the modern research directions currently employed to demonstrate the utility of pH-responsive core-shell magnetic NPs in the diagnosis and treatment of oncological diseases.
Initial Continuous Chemistry Results From The Roosevelt Island Ice Core (RICE)
NASA Astrophysics Data System (ADS)
Kjær, H. A.; Vallelonga, P. T.; Simonsen, M. F.; Neff, P. D.; Bertler, N. A. N.; Svensson, A.; Dahl-Jensen, D.
2014-12-01
The Roosevelt Island ice core (79.36° S, 161.71° W) was drilled in 2011-13 at the top of the Roosevelt Island ice dome, a location surrounded by the Ross Ice Shelf. The RICE ice core provides a unique opportunity to look into the past evolution of the West Antarctic Ice Sheet. Further, the site has high accumulation; 0.26 m of ice equivalent is deposited annually, allowing annual layer determination for many chemical parameters. The RICE core was drilled to bedrock and has a total length of 763 metres. Preliminary results derived from water isotopes suggest that the oldest ice reaches back to the Eemian, with the last glacial being compressed into the bottom 60 metres. We present preliminary results from the RICE ice core, including continuous measurements of acidity using an optical dye method, insoluble dust particles, conductivity and calcium. The core was analyzed at the New Zealand National Ice Core Research Facility at GNS Science in Wellington. The analytical setup used to determine climate proxies in the ice core was a modified version of the Copenhagen CFA system (Bigler et al., 2011). Key volcanic layers have been matched to those from the WAIS record (Sigl et al., 2013). A significant anti-correlation between acidity and calcium was seen in the Holocene part of the record. Due to the proximity to the ocean, a large fraction of the calcium originates from sea salt and is in phase with total conductivity and sodium. In combination with the insoluble dust record, calcium has been apportioned into ocean-related and dust-related sources. Variability over the Holocene is presented and attributed to changing inputs of marine and dust aerosols.
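Apportioning calcium into sea-salt and dust-related sources is commonly done in ice-core chemistry with the mean-seawater Ca/Na mass ratio. The sketch below uses that standard literature value (~0.038); the sample concentrations are invented, and the abstract does not state that this exact scheme was used for RICE.

```python
# Split total Ca into a sea-salt fraction (scaled from Na via the seawater
# Ca/Na mass ratio) and a residual non-sea-salt, dust-related fraction.

CA_NA_SEAWATER = 0.038  # approximate Ca/Na mass ratio in mean seawater

def apportion_calcium(ca_ppb, na_ppb):
    """Return (sea-salt Ca, non-sea-salt Ca) in the same units as the inputs."""
    ss_ca = min(CA_NA_SEAWATER * na_ppb, ca_ppb)  # sea-salt contribution
    nss_ca = ca_ppb - ss_ca                        # residual: dust-related
    return ss_ca, nss_ca

ss, nss = apportion_calcium(ca_ppb=5.0, na_ppb=80.0)  # invented sample
print(f"sea-salt Ca: {ss:.2f} ppb, dust-related Ca: {nss:.2f} ppb")
```

The `min` guard keeps the sea-salt term from exceeding the measured total when sodium is unusually high, a common practical safeguard in such budgets.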
Investigating the soil removal characteristics of flexible tube coring method for lunar exploration
NASA Astrophysics Data System (ADS)
Tang, Junyue; Quan, Qiquan; Jiang, Shengyuan; Liang, Jieneng; Lu, Xiangyong; Yuan, Fengpei
2018-02-01
Compared with other technical solutions, sampling planetary soil and returning it to Earth may be the most direct way to seek evidence of extraterrestrial life. To preserve the sample's stratification for further analysis, a novel sampling method called flexible tube coring has been adopted for China's future lunar exploration missions. Given the uncertain physical properties of lunar regolith, drilling parameters must be adjusted promptly during the penetration process; otherwise, only a small amount of core may be sampled and overload drilling faults may occur. Because the removed soil is inevitably connected with the cored soil, soil removal characteristics may have a great influence on both drilling loads and coring results. To understand the soil removal characteristics, a non-contact measurement was proposed and verified to acquire the coring and removal results accurately. Further experiments in a homogeneous lunar regolith simulant were then conducted, revealing that a sudden core failure occurs during the sampling process and that the final coring results are determined by the penetration-per-revolution index. Due to the core failure, both the drilling loads and the soil's removal state are affected as well.
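The penetration-per-revolution index referred to above is, by definition, the drill's advance rate divided by its rotary speed. A minimal sketch with invented drilling parameters (the abstract does not give the actual values used):

```python
# Penetration per revolution (PPR): how far the bit advances per full turn.

def penetration_per_revolution(feed_mm_per_min, spindle_rpm):
    """Advance of the drill bit per rotation, in mm/rev."""
    return feed_mm_per_min / spindle_rpm

# Invented example parameters: 120 mm/min feed at 60 rpm.
ppr = penetration_per_revolution(feed_mm_per_min=120.0, spindle_rpm=60.0)
print(f"PPR = {ppr:.1f} mm/rev")
```

Because the index couples the two adjustable drilling parameters into one number, it is a natural control variable when regolith properties are uncertain.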
Measurement and simulation of thermal neutron flux distribution in the RTP core
NASA Astrophysics Data System (ADS)
Rabir, Mohamad Hairie B.; Jalal Bayar, Abi Muttaqin B.; Hamzah, Na'im Syauqi B.; Mustafa, Muhammad Khairul Ariff B.; Karim, Julia Bt. Abdul; Zin, Muhammad Rawi B. Mohamed; Ismail, Yahya B.; Hussain, Mohd Huzair B.; Mat Husin, Mat Zin B.; Dan, Roslan B. Md; Ismail, Ahmad Razali B.; Husain, Nurfazila Bt.; Jalil Khan, Zareen Khan B. Abdul; Yakin, Shaiful Rizaide B. Mohd; Saad, Mohamad Fauzi B.; Masood, Zarina Bt.
2018-01-01
The in-core thermal neutron flux distribution was determined by measurement and simulation for the Malaysian PUSPATI TRIGA Reactor (RTP). In this work, online thermal neutron flux measurements using Self Powered Neutron Detectors (SPNDs) were performed to verify and validate the computational methods for neutron flux calculation in RTP. The experimental results served as validation for calculations performed with the Monte Carlo code MCNP. Detailed in-core neutron flux distributions were estimated using the MCNP mesh tally method. The resulting neutron flux map revealed the heterogeneous configuration of the core. Based on both measurement and simulation, the thermal flux profile peaks at the centre of the core and gradually decreases towards its outer side. The results show reasonably good agreement between calculation and measurement, with both yielding the same radial thermal flux profile inside the core; the MCNP model overestimates the flux, with a maximum discrepancy of about 20% relative to the SPND measurements. Since the model predicts the in-core neutron flux distribution well, it can be used to characterize the full core: neutron flux and spectrum calculations, dose rate calculations, reaction rate calculations, etc.
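The calculation-versus-measurement comparison described above reduces to a pointwise relative discrepancy along the radial profile. A minimal sketch, with hypothetical flux values standing in for the SPND readings and MCNP mesh-tally results:

```python
# Hedged sketch: comparing a calculated radial thermal-flux profile against
# detector measurements, as done for MCNP vs SPND in the abstract.
# The profile values below are invented for illustration.

measured = [1.9e12, 2.6e12, 3.1e12, 2.5e12, 1.8e12]    # SPND, n/cm^2/s
calculated = [2.2e12, 3.0e12, 3.6e12, 2.9e12, 2.1e12]  # MCNP mesh tally

def relative_discrepancy(calc, meas):
    """Signed relative difference (calc - meas) / meas at each radial point."""
    return [(c - m) / m for c, m in zip(calc, meas)]

disc = relative_discrepancy(calculated, measured)
max_disc = max(disc)  # worst-case overestimation by the model
```

With both profiles peaking at the core centre, a uniformly positive discrepancy of this kind indicates a systematic overestimation rather than a shape mismatch.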
NASA Astrophysics Data System (ADS)
Ross, P.-S.; Bourke, A.
2017-01-01
Physical property measurements are increasingly important in mining exploration. For density determinations on rocks, one method applicable to exploration drill cores relies on gamma ray attenuation. This non-destructive method is ideal because each measurement takes only 10 s, making it suitable for high-resolution logging. However, calibration has been problematic. In this paper we present new empirical, site-specific correction equations for whole NQ and BQ cores. The corrections force the gamma densities back to the "true" values established by the immersion method. For the NQ core caliber, the density range extends to high values (massive pyrite, 5 g/cm3) and the correction is thought to be very robust. We also present additional empirical correction factors for cut cores, which take into account the missing material. These "cut core correction factors", which are not site-specific, were established by making gamma density measurements on truncated aluminum cylinders of various residual thicknesses. Finally, we show two examples of application from the Abitibi Greenstone Belt in Canada. The gamma ray attenuation measurement system is part of a multi-sensor core logger which also determines magnetic susceptibility, geochemistry and mineralogy on rock cores, and performs line-scan imaging.
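The two-stage correction workflow described above can be sketched as follows. The coefficients are hypothetical placeholders, not the published site-specific equations, and the cut-core factor here is a single scalar standing in for the thickness-dependent factors the paper derives from aluminum cylinders.

```python
# Sketch of the two-stage correction: a site-specific linear correction pulling
# gamma densities back to immersion ("true") values, then a factor for cut
# (half) cores compensating for missing material. All coefficients are
# hypothetical, not the published ones.

def correct_whole_core(gamma_density, slope=1.05, intercept=-0.08):
    """Site-specific empirical correction for whole NQ/BQ core (g/cm^3)."""
    return slope * gamma_density + intercept

def correct_cut_core(gamma_density, cut_factor=1.30):
    """Cut-core correction: scale up the raw reading to account for the
    missing material, then apply the whole-core site correction."""
    return correct_whole_core(gamma_density * cut_factor)

rho = correct_whole_core(2.70)  # raw gamma density of 2.70 g/cm^3
```

In practice the slope and intercept would be regressed against immersion-method densities for each site and core caliber.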
NASA Astrophysics Data System (ADS)
Ito, T.; Funato, A.; Tamagawa, T.; Tezuka, K.; Yabe, Y.; Abe, S.; Ishida, A.; Ogasawara, H.
2017-12-01
When rock is cored at depth by drilling, anisotropic expansion occurs with the relief of anisotropic rock stresses, resulting in a sinusoidal variation of core diameter with a period of 180 deg. in the core roll angle. The circumferential variation of core diameter is given theoretically as a function of rock stresses. These findings suggest various ways to estimate rock stress from the circumferential variation of core diameter measured after core retrieval. In the simplest case, when only a single core sample is available, the difference between the maximum and minimum components of rock stress in a plane perpendicular to the drilled hole can be estimated from the maximum and minimum core diameters (see Funato and Ito, IJRMMS, 2017, for details). The advantages of this method include (i) a much simpler measurement operation than other in-situ or laboratory estimation methods, and (ii) applicability in high-stress environments where hydro-fracturing methods would require packer or pumping pressures beyond their tolerance levels. We have successfully tested the method at deep seismogenic zones in South African gold mines, and we are going to apply it to boreholes collared at 3 km depth and intersecting a M5.5 rupture plane several hundred meters below the mine workings in the ICDP project "Drilling into Seismogenic zones of M2.0 - M5.5 earthquakes in deep South African gold mines" (DSeis) (e.g., http://www.icdp-online.org/projects/world/africa/orkney-s-africa/details/). If several core samples with different orientations are available, all three principal components of the 3D rock stress can be estimated. To realize this, we need several boreholes drilled in different directions in a rock mass where the stress field is considered to be uniform. Boreholes are commonly drilled in different directions from a mine gallery.
Even in a deep borehole drilled vertically from the ground surface, the downhole tool of rotary sidewall coring allows us to take core samples with different orientations at depths of interest from the sidewall of the borehole. The theoretical relationship between core expansion and rock stress has been verified through examination of core samples prepared in laboratory experiments and of retrieved field cores.
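The single-core estimate described above amounts to fitting the 180-degree sinusoid d(θ) = d0 + A·cos(2(θ − θ0)) to roll-angle diameter readings and converting the max-minus-min diameter difference into a stress difference. A minimal sketch, assuming the standard plane-elastic relation (σmax − σmin) ≈ E·(dmax − dmin)/((1 + ν)·dmean); the fit shortcut, elastic constants, and numbers are all illustrative, not the published calibration:

```python
# Minimal sketch of a diameter-based stress-difference estimate.
import math

def fit_diameter_sinusoid(angles_deg, diameters_mm):
    """Least-squares fit of d0 + a*cos(2t) + b*sin(2t); returns (d0, amplitude).
    The 2/n projection shortcut is exact for equally spaced angles spanning
    the full 180-degree period."""
    n = len(angles_deg)
    c = [math.cos(2 * math.radians(t)) for t in angles_deg]
    s = [math.sin(2 * math.radians(t)) for t in angles_deg]
    d0 = sum(diameters_mm) / n
    a = 2 * sum(di * ci for di, ci in zip(diameters_mm, c)) / n
    b = 2 * sum(di * si for di, si in zip(diameters_mm, s)) / n
    return d0, math.hypot(a, b)

def stress_difference_mpa(d_mean_mm, amplitude_mm, youngs_mpa=70e3, poisson=0.25):
    """Plane-elastic conversion of the diameter variation to sigma_max - sigma_min.
    Elastic constants here are hypothetical placeholders."""
    d_max_minus_min = 2 * amplitude_mm
    return youngs_mpa * d_max_minus_min / ((1 + poisson) * d_mean_mm)
```

A micrometer-scale amplitude on a ~50 mm core is enough to resolve stress differences of tens of MPa, which is why the measurement demands high-precision diameter logging.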
Atomistic calculations of dislocation core energy in aluminium
Zhou, X. W.; Sills, R. B.; Ward, D. K.; ...
2017-02-16
A robust molecular dynamics simulation method for calculating dislocation core energies has been developed. This method has unique advantages: it does not require artificial boundary conditions, is applicable for mixed dislocations, and can yield highly converged results regardless of the atomistic system size. Utilizing a high-fidelity bond order potential, we have applied this method in aluminium to calculate the dislocation core energy as a function of the angle β between the dislocation line and Burgers vector. These calculations show that, for the face-centred-cubic aluminium explored, the dislocation core energy follows the same functional dependence on β as the dislocation elastic energy: Ec = A·sin²β + B·cos²β, and this dependence is independent of temperature between 100 and 300 K. By further analysing the energetics of an extended dislocation core, we elucidate the relationship between the core energy and radius of a perfect versus extended dislocation. With our methodology, the dislocation core energy can be accurately accounted for in models of plastic deformation.
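Given a set of computed (β, Ec) pairs, the coefficients A and B in the reported functional form Ec(β) = A·sin²β + B·cos²β follow from a small linear least-squares problem. A sketch, with made-up energies in place of the simulation data:

```python
# Hedged sketch: extracting the A and B coefficients of the core-energy form
# Ec(beta) = A*sin^2(beta) + B*cos^2(beta) by linear least squares. The test
# energies are invented, not the paper's simulation results.
import math

def fit_core_energy(betas_deg, energies):
    """Linear least squares in the basis (sin^2 b, cos^2 b); returns (A, B)."""
    s2 = [math.sin(math.radians(b)) ** 2 for b in betas_deg]
    c2 = [math.cos(math.radians(b)) ** 2 for b in betas_deg]
    # Normal equations for E = A*s2 + B*c2 (2x2 system, solved directly).
    m11 = sum(x * x for x in s2)
    m12 = sum(x * y for x, y in zip(s2, c2))
    m22 = sum(y * y for y in c2)
    v1 = sum(x * e for x, e in zip(s2, energies))
    v2 = sum(y * e for y, e in zip(c2, energies))
    det = m11 * m22 - m12 * m12
    return (m22 * v1 - m12 * v2) / det, (m11 * v2 - m12 * v1) / det
```

Note that A and B are simply Ec at β = 90° (pure edge) and β = 0° (pure screw), so the fit mainly serves to average over noise in the intermediate mixed-dislocation points.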
Generating or developing grounded theory: methods to understand health and illness.
Woods, Phillip; Gapp, Rod; King, Michelle A
2016-06-01
Grounded theory is a qualitative research methodology that aims to explain social phenomena, e.g. why particular motivations or patterns of behaviour occur, at a conceptual level. Developed in the 1960s by Glaser and Strauss, the methodology has been reinterpreted by Strauss and Corbin in more recent times, resulting in different schools of thought. Differences arise from different philosophical perspectives concerning knowledge (epistemology) and the nature of reality (ontology), demanding that researchers make clear theoretical choices at the commencement of their research when choosing this methodology. Compared to other qualitative methods, it has the ability to achieve an understanding of, rather than simply describe, a social phenomenon. Achieving understanding, however, requires theoretical sampling to choose interviewees who can contribute most to the research and to understanding of the phenomenon, and constant comparison of interviews to evaluate the same event or process in different settings or situations. Sampling continues until conceptual saturation is reached, i.e. when no new concepts emerge from the data. Data analysis focusses on categorising data (finding the main elements of what is occurring and why), and describing those categories in terms of properties (conceptual characteristics that define the category and give meaning) and dimensions (the variations within properties which produce specificity and range). Ultimately a core category, which theoretically explains how all other categories are linked together, is developed from the data. While achieving theoretical abstraction, the core category should remain logical and capture all of the variation within the data. Theory development requires understanding of the methodology, not just working through a set of procedures. This article provides a basic overview, set in the literature surrounding grounded theory, for those wanting to increase their understanding and the quality of their research output.
Friendship Group Position and Substance Use
Osgood, D. Wayne; Feinberg, Mark E.; Wallace, Lacey N.; Moody, James
2014-01-01
This paper examines how an adolescent's position relative to cohesive friendship groups in the school-wide social network is associated with alcohol, tobacco, and marijuana use. We extend prior research in this area by refining the categories of group positions, using more extensive friendship information, applying newer analytic methods to identify friendship groups, and making strategic use of control variables to clarify the meaning of differences among group positions. We report secondary analyses of 6th through 9th grade data from the PROSPER study, which include approximately 9,500 adolescents each year from 27 school districts and 368 school grade cohort friendship networks. We find that core members of friendship groups were more likely to drink than isolates and liaisons, especially in light of their positive social integration in school, family, and religious contexts. Isolates were more likely to use cigarettes than core members, even controlling for all other factors. Finally, liaisons were more likely to use marijuana than core members. PMID:24389068
NASA Astrophysics Data System (ADS)
Shimono, T.; Matsumoto, R.
2016-12-01
Shallow gas hydrate is known to occur as massive nodular aggregates in subsurface and/or shallow marine sediments (e.g. Matsumoto et al. 2009). We conducted a rock magnetic study of marine core sediments to clarify the relationship between shallow gas hydrate and the surrounding sediments. The core samples were taken from around the Oki area and offshore Joetsu, on the eastern margin of the Japan Sea, during the PS15 cruise in 2015. We mainly report magnetic susceptibility measurements of whole-round core samples. From the onboard measurements, the magnetic susceptibilities of the gas hydrates indicated diamagnetic behaviour similar to water or ice (about -0.9 x 10^-5 vol. SI). Moreover, we introduce a method to assess the amount of gas hydrate present within marine sediments using magnetic susceptibility and rock magnetic analyses. This study was conducted under commission from AIST as a part of the methane hydrate research project of METI (the Ministry of Economy, Trade and Industry, Japan).
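One simple way to turn the susceptibility contrast into a hydrate estimate is a two-component volume mixing model, inverting χ_bulk = f·χ_hydrate + (1 − f)·χ_sediment for the hydrate fraction f. A sketch, where the hydrate end member uses the value quoted in the abstract and the sediment end member is an assumed placeholder (this is our illustration of the idea, not necessarily the authors' method):

```python
# Sketch of a susceptibility mixing-model estimate of gas hydrate content.
CHI_HYDRATE = -0.9e-5   # vol. SI, diamagnetic value reported in the abstract
CHI_SEDIMENT = 20e-5    # vol. SI, hypothetical hydrate-free background

def hydrate_fraction(chi_bulk, chi_sed=CHI_SEDIMENT, chi_hyd=CHI_HYDRATE):
    """Volume fraction f from chi_bulk = f*chi_hyd + (1-f)*chi_sed,
    clamped to the physical range [0, 1]."""
    f = (chi_sed - chi_bulk) / (chi_sed - chi_hyd)
    return min(1.0, max(0.0, f))

f = hydrate_fraction(9.55e-5)  # halfway between the two end members
```

The estimate is only as good as the background value, so in practice χ_sediment would be taken from hydrate-free intervals of the same core.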
ERIC Educational Resources Information Center
Moshier, Kenneth
This module providing an introduction to research procedures is one of a set of five on evaluation and research and is part of a larger series of thirty-four modules constituting a core curriculum for use in the professional preparation of vocational educators in the areas of agricultural, business, home economics, and industrial education.…
D. Jimenez; B. Butler; K. Hiers; R. Ottmar; M. Dickinson; R. Kremens; J. O'Brien; A. Hudak; C. Clements
2009-01-01
The Rx-CADRE project was the combination of local and national fire expertise in the field of core fire research. The project brought together approximately 30 fire scientists from six geographic regions and seven different agencies. The project objectives were to demonstrate the capacity for collaborative research by bringing together individuals and teams with a...
Method of Making a Composite Panel Having Subsonic Transverse Wave Speed Characteristics
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L. (Inventor); Klos, Jacob (Inventor)
2012-01-01
A method of making a composite panel having subsonic transverse wave speed characteristics which has first and second sheets sandwiching a core with at least one of the sheets being attached to the core at first regions thereof and unattached to the core at second regions thereof.
A vortex-filament and core model for wings with edge vortex separation
NASA Technical Reports Server (NTRS)
Pao, J. L.; Lan, C. E.
1981-01-01
A method for predicting the aerodynamic characteristics of slender wings with edge vortex separation was developed. Semiempirical but simple methods were used to determine the initial positions of the free sheet and vortex core. Comparison with available data indicates that: the present method is generally accurate in predicting the lift and induced drag coefficients, but the predicted pitching moment is too positive; the spanwise lifting pressure distributions estimated by the one-vortex-core solution of the present method are significantly better than the results of Mehrotra's method with respect to the pressure peak values for the flat delta; the two-vortex-core system applied to the double delta and strake wing produces overall aerodynamic characteristics in good agreement with data except for the pitching moment; and the computer time for the present method is about two thirds of that of Mehrotra's method.
A Two-Step Approach to Uncertainty Quantification of Core Simulators
Yankov, Artem; Collins, Benjamin; Klein, Markus; ...
2012-01-01
For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
Miller, B Paige; Shrum, Wesley
2012-01-01
Using panel data gathered across two waves (2001 and 2005) from researchers in Ghana, Kenya, and Kerala, India, we examine three questions: (1) To what extent do gender differences exist in the core professional networks of scientists in low-income areas? (2) How do gender differences shift over time? (3) Does use of information and communication technologies (ICTs) mediate the relationship between gender and core network composition? Our results indicate that over a period marked by dramatic increases in access to and use of various ICTs, the composition and size of female researchers' core professional ties have either not changed significantly or have changed in an unexpected direction. Indeed, the size of women's networks is contracting over time rather than expanding.
Research advances in polymer emulsion based on "core-shell" structure particle design.
Ma, Jian-zhong; Liu, Yi-hong; Bao, Yan; Liu, Jun-li; Zhang, Jing
2013-09-01
In recent years, many studies on polymer emulsions with a unique core-shell structure have emerged at the frontier between material chemistry and many other fields because of their singular morphology, properties and wide range of potential applications. An organic substance as a coating material onto either inorganic or organic internal core materials promises an unparalleled opportunity for enhancing final functions through rational design. This contribution provides a brief overview of recent progress in the synthesis, characterization, and applications of both inorganic-organic and organic-organic polymer emulsions with core-shell structure. In addition, future research trends in polymer composites with core-shell structure are also discussed in this review. Copyright © 2013 Elsevier B.V. All rights reserved.
A method for modeling finite-core vortices in wake-flow calculations
NASA Technical Reports Server (NTRS)
Stremel, P. M.
1984-01-01
A numerical method for computing nonplanar vortex wakes represented by finite-core vortices is presented. The approach solves for the velocity on an Eulerian grid, using standard finite-difference techniques; the vortex wake is tracked by Lagrangian methods. In this method, the distribution of continuous vorticity in the wake is replaced by a group of discrete vortices. An axially symmetric distribution of vorticity about the center of each discrete vortex is used to represent the finite-core model. Two distributions of vorticity, or core models, are investigated: a finite distribution of vorticity represented by a third-order polynomial, and a continuous distribution of vorticity throughout the wake. The method provides for a vortex-core model that is insensitive to the mesh spacing. Results for a simplified case are presented. Computed results for the roll-up of a vortex wake generated by wings with different spanwise load distributions are presented; contour plots of the flow-field velocities are included; and comparisons are made of the computed flow-field velocities with experimentally measured velocities.
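The mesh-insensitivity of a finite-core model comes from regularizing the point-vortex singularity inside the core. A minimal sketch of the induced tangential velocity, using a solid-body-like quadratic interior as a stand-in for the paper's third-order polynomial vorticity distribution (this is our illustrative model, not the paper's exact one):

```python
# Illustrative finite-core vortex: potential-vortex far field, regular interior.
import math

def tangential_velocity(r, gamma=1.0, core_radius=0.1):
    """Tangential velocity induced at radius r by a vortex of circulation
    gamma with a finite core of radius core_radius."""
    if r >= core_radius:
        return gamma / (2 * math.pi * r)            # potential vortex far field
    # Solid-body-like interior: v -> 0 at the centre, continuous at r = rc,
    # so grid points falling near the core centre stay well behaved.
    return gamma * r / (2 * math.pi * core_radius ** 2)

v_edge = tangential_velocity(0.1)    # value at the core boundary
v_center = tangential_velocity(0.0)  # regular (no singularity) at r = 0
```

Because the velocity stays bounded everywhere, the Eulerian finite-difference grid never samples a singular field regardless of how mesh points fall relative to the discrete vortices.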
NASA Astrophysics Data System (ADS)
Agustin, R. R.; Liliasari, L.
2017-02-01
The purpose of this study was to attain an insight into pre-service science teachers’ technological pedagogical content knowledge (TPACK) as an integrative competency that is addressed by 21st century skills. The methods used in the study was descriptive. Nineteen pre-service science teachers (PSTs) of an educational university in Indonesia were involved in a semester long school science course. The course mainly develop students’ pedagogical content knowledge (PCK) by utilizing content representation (CoRe) template. Furthermore an infusion of technological knowledge (TK) analysis led to the study of their TPACK by extending the template with a question in line to TK. The extended CoRe and self-reported survey were employed as instruments. The analysis of data used were quantitative and qualitative technique to obtain the insight into PSTs’ PCK and TK. The results shows contrary value of PCK and TK identified by CoRe template to those measured by self-reported survey. However, the PSTs perceive their TPACK much higher, that, is 74.74%. Further investigation regarding PSTs ability to compose lesson plan was recommended for further research to capture more comprehensive insight into PSTs’ TPACK.
Perceived pros and cons of smoking and quitting in hard-core smokers: a focus group study
2014-01-01
Background In the last decade, so-called hard-core smokers have received increasing interest in research literature. For smokers in general, the study of perceived costs and benefits (or ‘pros and cons’) of smoking and quitting is of particular importance in predicting motivation to quit and actual quitting attempts. Therefore, this study aims to gain insight into the perceived pros and cons of smoking and quitting in hard-core smokers. Methods We conducted 11 focus group interviews among current hard-core smokers (n = 32) and former hard-core smokers (n = 31) in the Netherlands. Subsequently, each participant listed his or her main pros and cons in a questionnaire. We used a structural procedure to analyse the data obtained from the group interviews and from the questionnaires. Results Using the qualitative data of both the questionnaires and the transcripts, the perceived pros and cons of smoking and smoking cessation were grouped into 6 main categories: Finance, Health, Intrapersonal Processes, Social Environment, Physical Environment and Food and Weight. Conclusions Although the perceived pros and cons of smoking in hard-core smokers largely mirror the perceived pros and cons of quitting, there are some major differences with respect to weight, social integration, health of children and stress reduction, that should be taken into account in clinical settings and when developing interventions. Based on these findings we propose the ‘Distorted Mirror Hypothesis’. PMID:24548463
NASA Technical Reports Server (NTRS)
Saiyed, Naseem H.; Mikkelsen, Kevin L.; Bridges, James E.
2000-01-01
The NASA Glenn Research Center recently completed an experimental study to reduce the jet noise from modern turbofan engines. The study concentrated on exhaust nozzle designs for high-bypass-ratio engines. These designs modified the core and fan nozzles individually and simultaneously. Several designs provided an ideal jet noise reduction of over 2.5 EPNdB for the effective perceived noise level (EPNL) metric. Noise data, after correcting for takeoff thrust losses, indicated over a 2.0-EPNdB reduction for nine designs. Individually modifying the fan nozzle did not provide attractive EPNL reductions. Designs in which only the core nozzle was modified provided greater EPNL reductions. Designs in which core and fan nozzles were modified simultaneously provided the greatest EPNL reduction. The best nozzle design had a 2.7-EPNdB reduction (corrected for takeoff thrust loss) with a 0.06-point cruise thrust loss. This design simultaneously employed chevrons on the core and fan nozzles. In comparison with chevrons, tabs appeared to be an inefficient method for reducing jet noise. Data trends indicate that the sum of the thrust losses from individually modifying core and fan nozzles did not generally equal the thrust loss from modifying them simultaneously. Flow blockage from tabs did not scale directly with cruise thrust loss and the interaction between fan flow and the core nozzle seemed to strongly affect noise and cruise performance. Finally, the nozzle configuration candidates for full-scale engine demonstrations are identified.
The statistical analysis of circadian phase and amplitude in constant-routine core-temperature data
NASA Technical Reports Server (NTRS)
Brown, E. N.; Czeisler, C. A.
1992-01-01
Accurate estimation of the phases and amplitude of the endogenous circadian pacemaker from constant-routine core-temperature series is crucial for making inferences about the properties of the human biological clock from data collected under this protocol. This paper presents a set of statistical methods based on a harmonic-regression-plus-correlated-noise model for estimating the phases and the amplitude of the endogenous circadian pacemaker from constant-routine core-temperature data. The methods include a Bayesian Monte Carlo procedure for computing the uncertainty in these circadian functions. We illustrate the techniques with a detailed study of a single subject's core-temperature series and describe their relationship to other statistical methods for circadian data analysis. In our laboratory, these methods have been successfully used to analyze more than 300 constant routines and provide a highly reliable means of extracting phase and amplitude information from core-temperature data.
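The harmonic-regression part of the model can be sketched as an ordinary least-squares fit of T(t) = m + a·cos(ωt) + b·sin(ωt), with amplitude and phase read off from (a, b). The full method in the paper adds a correlated-noise term and Bayesian uncertainty computation, which are omitted here; the period and data in the test are synthetic.

```python
# Hedged sketch of harmonic regression for circadian phase and amplitude.
import math

def fit_harmonic(times_h, temps_c, period_h=24.2):
    """Fit T(t) = m + a*cos(w t) + b*sin(w t); returns (mean, amplitude,
    phase in hours of the fitted maximum). The 2/n projection shortcut is
    exact for uniform samples spanning whole periods."""
    w = 2 * math.pi / period_h
    n = len(times_h)
    c = [math.cos(w * t) for t in times_h]
    s = [math.sin(w * t) for t in times_h]
    mean = sum(temps_c) / n
    a = 2 * sum(tc * ci for tc, ci in zip(temps_c, c)) / n
    b = 2 * sum(tc * si for tc, si in zip(temps_c, s)) / n
    amplitude = math.hypot(a, b)
    phase_h = math.atan2(b, a) / w   # time at which the fitted cosine peaks
    return mean, amplitude, phase_h
```

Constant-routine data are approximately evenly sampled, which is why this simple projection is a reasonable first pass; the correlated-noise model in the paper corrects the uncertainty estimates that ordinary least squares would overstate.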
Chaudhry, Anil R; Dzugan, Robert; Harrington, Richard M; Neece, Faurice D; Singh, Nipendra P; Westendorf, Travis
2013-11-26
A method of creating a foam pattern comprises mixing a polyol component and an isocyanate component to form a liquid mixture. The method further comprises placing a temporary core having a shape corresponding to a desired internal feature in a cavity of a mold and inserting the mixture into the cavity of the mold so that the mixture surrounds a portion of the temporary core. The method optionally further comprises using supporting pins made of foam to support the core in the mold cavity, with such pins becoming integral part of the pattern material simplifying subsequent processing. The method further comprises waiting for a predetermined time sufficient for a reaction from the mixture to form a foam pattern structure corresponding to the cavity of the mold, wherein the foam pattern structure encloses a portion of the temporary core and removing the temporary core from the pattern independent of chemical leaching.
Evaluation of the use of nodal methods for MTR neutronic analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reitsma, F.; Mueller, E.Z.
1997-08-01
Although modern nodal methods are used extensively in the nuclear power industry, their use for research reactor analysis has been very limited. The suitability of nodal methods for material testing reactor analysis is investigated with the emphasis on the modelling of the core region (fuel assemblies). The nodal approach's performance is compared with that of the traditional finite-difference fine mesh approach. The advantages of using nodal methods coupled with integrated cross section generation systems are highlighted, especially with respect to data preparation, simplicity of use and the possibility of performing a great variety of reactor calculations subject to strict time limitations such as are required for the RERTR program.
ERIC Educational Resources Information Center
Hough, Heather; Kalogrides, Demetra; Loeb, Susanna
2017-01-01
The research featured in this paper is part of the CORE-PACE Research Partnership, through which Policy Analysis for California Education (PACE) has partnered with the CORE districts to conduct research designed to support them in continuous improvement while simultaneously helping to improve policy and practice in California and nationwide.…
Reconfigurable Hardware Adapts to Changing Mission Demands
NASA Technical Reports Server (NTRS)
2003-01-01
A new class of computing architectures and processing systems, which use reconfigurable hardware, is creating a revolutionary approach to implementing future spacecraft systems. With the increasing complexity of electronic components, engineers must design next-generation spacecraft systems with new technologies in both hardware and software. Derivation Systems, Inc., of Carlsbad, California, has been working through NASA's Small Business Innovation Research (SBIR) program to develop key technologies in reconfigurable computing and Intellectual Property (IP) soft cores. Founded in 1993, Derivation Systems has received several SBIR contracts from NASA's Langley Research Center and the U.S. Department of Defense Air Force Research Laboratories in support of its mission to develop hardware and software for high-assurance systems. Through these contracts, Derivation Systems began developing leading-edge technology in formal verification, embedded Java, and reconfigurable computing for its PF3100, Derivational Reasoning System (DRS), FormalCORE IP, FormalCORE PCI/32, FormalCORE DES, and LavaCORE Configurable Java Processor, which are designed for greater flexibility and security on all space missions.
[Social network analysis of traditional Chinese medicine on treatment of constipation].
Du, Li-Dong; Tian, Jin-Hui; Wu, Guo-Tai; Niu, Ting-Hui; Chen, Zhen-He; Ren, Yuan
2017-01-01
The methods of literature metrology and data mining were used to study the research topics and conduct social network analysis of traditional Chinese medicine for constipation. The major Chinese databases were searched to identify studies of traditional Chinese medicine for constipation. BICOMS analysis software was used to extract and collect the main information and to produce a co-occurrence matrix; gCLUTO software was used for cluster analysis. Data analysis was conducted using SPSS 19.0 software. The results showed that the number of studies on traditional Chinese medicine for constipation has constantly increased, with two peaks in publication volume in 2003 and 2006. Related studies have been published from 31 provinces, autonomous regions and municipalities, but studies from developed areas outnumbered those from developing areas. There was little cooperation between research institutions or between authors, especially across different areas. At present, the research field of Chinese medicine for constipation is divided into five research topics. Among specific traditional Chinese medicines, Angelica sinensis occupies the core position. The results showed a regional imbalance in the number of studies on Chinese medicine treatment for constipation, as well as little cooperation between researchers and research institutions. The research topics mainly focused on the evaluation of clinical efficacy, but research on optimizing prescriptions was still insufficient, so future researchers should pay more attention to studies of constipation prescriptions with Angelica sinensis as the core herb. Copyright© by the Chinese Pharmaceutical Association.
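The co-occurrence step behind the BICOMS/gCLUTO workflow described above can be sketched as follows: count how often pairs of items (keywords or herbs) appear in the same record, and items with the highest co-occurrence degree, such as Angelica sinensis, emerge as "core" nodes. The records below are invented for illustration.

```python
# Sketch of building a keyword co-occurrence matrix and ranking core terms.
from itertools import combinations
from collections import Counter

records = [
    {"angelica sinensis", "rhubarb", "constipation"},
    {"angelica sinensis", "hemp seed", "constipation"},
    {"angelica sinensis", "rhubarb", "qi deficiency"},
]

def cooccurrence(recs):
    """Count each unordered pair of items appearing in the same record."""
    pairs = Counter()
    for rec in recs:
        for a, b in combinations(sorted(rec), 2):
            pairs[(a, b)] += 1
    return pairs

def degree(pairs):
    """Weighted degree of each item in the co-occurrence network."""
    deg = Counter()
    for (a, b), n in pairs.items():
        deg[a] += n
        deg[b] += n
    return deg

core_term = degree(cooccurrence(records)).most_common(1)[0][0]
```

The resulting matrix is exactly what hierarchical or matrix-based clustering tools take as input to separate the research topics.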
Staff Scientist | Center for Cancer Research
The scientist will be tasked with independent research projects that support and/or further the scope of our laboratory goals as determined by the Principal Investigator. The scientist will be responsible for overseeing daily operations and coordination of projects in close conjunction with all laboratory personnel. The scientist will participate in teaching laboratory methods to first-time post-docs, research fellows, and students. The scientist will work closely with a full-time research biologist, both in collaboration of research projects and in the lab-critical administrative tasks of IRB-approval, animal protocols, budget, etc. Our laboratory has two post-doctoral researchers at any given time. This is a great opportunity for candidates who are interested in cancer biology and want to grow their research career by working in our program with outstanding support of other established laboratories and core facilities in the National Cancer Institute.
Monitoring of NMR porosity changes in the full-size core salvage through the drying process
NASA Astrophysics Data System (ADS)
Fattakhov, Artur; Kosarev, Victor; Doroginitskii, Mikhail; Skirda, Vladimir
2015-04-01
Nuclear magnetic resonance (NMR) is currently one of the most widely used technologies in borehole geophysics and core analysis. NMR measurements allow the porosity and permeability of sedimentary rocks to be calculated with sufficient reliability. However, all standard tools for NMR study of recovered core share a significant limitation: they consider only the long relaxation times corresponding to mobile formation fluid. Current trends in energy are pushing producers away from conventional oil toward various alternative sources, among them deposits of bitumen and high-viscosity oil. Kazan (Volga Region) Federal University (Russia), together with specialists from TNG-Group (a company providing maintenance services to oil companies), has developed a mobile unit ("NMR-Core") for studying full-length recovered core by the NMR method. The unit is designed to examine core material directly at the well, immediately after its removal from the core barrel. The maximum diameter of a core sample is 116 mm, and its length (or the length of a set of samples) may be up to 1000 mm. The positioning precision of the core sample relative to the measurement system is 1 mm, and the spatial resolution along the core axis is 10 mm. The acquisition time for 1 m of core varies with the measurement mode and is at least 20 minutes. In addition, the tool implements a special mode for investigating core samples with very short relaxation times (for example, heavy oil). The aim of this work is to track changes in the NMR porosity of full-size recovered core over time. Water-saturated core from a shallow training well was used as the sample set; the diameter of the studied core is 93 mm, and several 1 m sections were selected from a 200 m coring interval. Each core section was measured several times, with intervals between measurements ranging from 1 hour to 48 hours. The measurements show that NMR porosity changes over time as part of the fluid evaporates from the surface layer of the core, and they suggest a core-analysis technique that can be applied directly at the well. This work is supported by a grant of the Ministry of Education and Science of the Russian Federation (project No. 02.G25.31.0029).
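The tool's special mode for short relaxation times rests on separating fast- and slow-relaxing fluid pools in the echo-train decay. A minimal sketch of that idea, a biexponential fit to a synthetic CPMG decay, is shown below; the T2 values, fractions, and noise level are invented for illustration and this is not the NMR-Core processing itself:

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a_short, t2_short, a_long, t2_long):
    """Two-pool T2 decay: fast (bound/heavy) plus slow (mobile) components."""
    return a_short * np.exp(-t / t2_short) + a_long * np.exp(-t / t2_long)

# Synthetic echo-train decay: 40% of the signal with T2 = 3 ms (heavy oil /
# bound fluid), 60% with T2 = 150 ms (mobile fluid), plus measurement noise.
t = np.logspace(np.log10(2e-4), np.log10(0.5), 400)   # echo times (s)
signal = biexp(t, 0.4, 0.003, 0.6, 0.15)
signal += np.random.default_rng(0).normal(0.0, 0.002, t.size)

# Recover the two pools; the fitted amplitudes approximate fluid fractions
# and the T2 values distinguish heavy from mobile fluid.
p, _ = curve_fit(biexp, t, signal, p0=[0.5, 0.01, 0.5, 0.1])
print(p)
```

In practice borehole NMR inversion uses a full multi-exponential T2 distribution rather than two fixed pools, but the two-pool fit captures why short-T2 sensitivity matters for heavy oil.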
Integrated care: a comprehensive bibliometric analysis and literature review
Sun, Xiaowei; Tang, Wenxi; Ye, Ting; Zhang, Yan; Wen, Bo; Zhang, Liang
2014-01-01
Introduction Integrated care can not only remedy fragmented health care but also improve the continuity of care and the quality of life. Despite the volume and variety of publications, little is known about how ‘integrated care’ has developed. There is a need for a systematic bibliometric analysis of the important features of the integrated care literature. Aim To investigate the growth pattern, core journals and jurisdictions and to identify the key research domains of integrated care. Methods We searched Medline/PubMed using the search strategy ‘(delivery of health care, integrated [MeSH Terms]) OR integrated care [Title/Abstract]’ without time or language limits. Second, we extracted the publishing year, journals, jurisdictions and keywords of the retrieved articles. Finally, descriptive statistical analysis with the Bibliographic Item Co-occurrence Matrix Builder and hierarchical clustering with SPSS were used. Results As many as 9090 articles were retrieved. Results included: (1) the cumulative number of publications on integrated care rose steeply after 1993; (2) the articles appeared in 1646 journals, 28 of which were core journals; (3) the USA was the predominant publishing country; and (4) there are six key domains: the definition/models of integrated care, interdisciplinary patient care teams, disease management for chronically ill patients, types of health care organizations and policy, information system integration, and legislation/jurisprudence. Discussion and conclusion Integrated care literature has been most evident in developed countries. International Journal of Integrated Care is highly recommended in this research area. The bibliometric analysis and identification of publication hotspots provide researchers and practitioners with core target journals, as well as an overview of the field for further research in integrated care. PMID:24987322
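The co-occurrence-and-clustering step described in the Methods can be sketched in a few lines. The study used the Bibliographic Item Co-occurrence Matrix Builder and SPSS; the sketch below substitutes NumPy/SciPy, and the keywords and article-keyword matrix are invented for illustration:

```python
# Sketch of keyword co-occurrence analysis plus hierarchical clustering,
# the core of a bibliometric domain analysis. Keywords and the toy
# article-keyword matrix are hypothetical stand-ins for MeSH terms.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

keywords = ["integrated care", "patient care team", "disease management",
            "health policy", "information systems"]
# Each row is an article; 1 = keyword assigned to that article.
articles = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 1, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 0, 1, 1],
    [1, 0, 0, 0, 1],
])

# Co-occurrence matrix: entry (i, j) counts articles carrying both keywords.
cooc = articles.T @ articles

# Cluster keywords on their co-occurrence profiles (average linkage over
# correlation distance), then cut the dendrogram into two candidate domains.
Z = linkage(cooc, method="average", metric="correlation")
domains = fcluster(Z, t=2, criterion="maxclust")
print(dict(zip(keywords, domains)))
```

With real data the number of clusters is chosen by inspecting the dendrogram, which is how the six key domains above would emerge.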
A new method for teaching physical examination to junior medical students.
Sayma, Meelad; Williams, Hywel Rhys
2016-01-01
Teaching effective physical examination is a key component in the education of medical students. Preclinical medical students often have insufficient clinical knowledge to apply to physical examination recall, which may hinder their learning when taught through certain understanding-based models. This pilot project aimed to develop a method to teach physical examination to preclinical medical students using "core clinical cases", overcoming the need for "rote" learning. This project was developed utilizing three cycles of planning, action, and reflection. Thematic analysis of feedback was used to improve this model, and ensure it met student expectations. A model core clinical case developed in this project is described, with gout as the basis for a "foot and ankle" examination. Key limitations and difficulties encountered on implementation of this pilot are discussed for future users, including the difficulty encountered in "content overload". This approach aims to teach junior medical students physical examination through understanding, using a simulated patient environment. Robust research is now required to demonstrate efficacy and repeatability in the physical examination of other systems.
Magneto-plasmonic Au-Coated Co nanoparticles synthesized via hot-injection method
NASA Astrophysics Data System (ADS)
Souza, João B., Jr.; Varanda, Laudemir C.
2018-02-01
A synthetic procedure is described for obtaining superparamagnetic Co nanoparticles (NPs) via the hot-injection method in the presence of sodium borohydride. The Co NPs obtained have an average diameter of 5.3 nm and a saturation magnetization of 115 emu g-1. A modified Langevin equation is fitted to the magnetization curves using a log-normal distribution for the particle diameter and an effective field to account for dipolar interactions. The calculated magnetic diameter of the Co NPs is 0.6 nm smaller than the TEM-derived value, implying a magnetic dead layer of 0.3 nm. The magnetic core is coated with Au to prevent oxidation, resulting in water-stable magneto-plasmonic Co/Au core/shell NPs with a saturation magnetization of 71.6 emu g-1. The coating adds a localized surface plasmon resonance with absorbance in the so-called ‘therapeutic window’ (690-900 nm), suitable for biomedical applications. These multifunctional NPs are proposed as a potential platform for applied and fundamental research.
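The fitting step can be illustrated with a much simplified sketch: a single-moment Langevin fit to a synthetic magnetization curve. The paper's actual fit additionally uses a log-normal diameter distribution and an effective interaction field; the field range, moment, and noise below are invented:

```python
# Simplified Langevin fit for a superparamagnetic magnetization curve.
# Single magnetic moment assumed (no size distribution, no dipolar field).
import numpy as np
from scipy.optimize import curve_fit

MU0 = 4e-7 * np.pi            # vacuum permeability (T m/A)
KB, T = 1.380649e-23, 300.0   # Boltzmann constant (J/K), temperature (K)

def langevin(H, Ms, mu):
    """M(H) = Ms (coth(x) - 1/x) with x = mu * mu0 * H / (kB T)."""
    x = mu * MU0 * H / (KB * T)
    return Ms * (1.0 / np.tanh(x) - 1.0 / x)

# Synthetic "measured" curve: Ms = 115 emu/g and a moment of order
# 1e-19 A m^2, roughly that of a ~5 nm Co particle.
H = np.linspace(1e3, 1e6, 200)          # applied field (A/m)
M = langevin(H, 115.0, 2.0e-19)
M += np.random.default_rng(0).normal(0.0, 0.5, H.size)

(Ms_fit, mu_fit), _ = curve_fit(langevin, H, M, p0=[100.0, 1e-19])
print(Ms_fit, mu_fit)
```

From the fitted moment a magnetic diameter follows via the bulk magnetization and particle volume, which is how the 0.3 nm dead layer above is inferred.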
Common Processes in Evidence-Based Adolescent HIV Prevention Programs
Ingram, Barbara L.; Flannery, Diane; Elkavich, Amy
2014-01-01
Dissemination of evidence-based HIV prevention programs for adolescents will be increased if community interventionists are able to distinguish core, essential program elements from optional, discretionary ones. We selected five successful adolescent HIV prevention programs, used a qualitative coding method to identify common processes described in the procedural manuals, and then compared the programs. Nineteen common processes were categorized as structural features, group management strategies, competence building, and addressing developmental challenges of adolescence. All programs shared the same structural features (goal-setting and session agendas), used an active engagement style of group management, and built cognitive competence. Programs varied in attention to developmental challenges, emphasis on behavioral and emotional competence, and group management methods. This qualitative analysis demonstrated that successful HIV programs contain processes not articulated in their developers’ theoretical models. By moving from the concrete specifics of branded interventions to identification of core, common processes, we are consistent with the progress of “common factors” research in psychotherapy. PMID:18330687
The breastfeeding experiences of Canadian teenage mothers.
Nelson, Alison; Sethi, Sarla
2005-01-01
To discover the phenomenon of breastfeeding as experienced by teenage mothers. Grounded theory method was used to study the first-time breastfeeding experiences of teenage mothers, aged 15 to 19 years. The research occurred between September 2000 and April 2001 in Calgary, Alberta, Canada. A purposive sample of 8 teenage mothers was recruited through self-identification and Calgary Health Region staff referral. DATA GENERATION AND ANALYSIS: The data were generated using informal interviews and demographic questionnaires. The data were transcribed, coded, and analyzed using constant comparative method. The emergent core variable was Teenage Mothers: Continuously Committing to Breastfeeding. Four categories supported the core variable: (a) Deciding to Breastfeed, (b) Learning to Breastfeed, (c) Adjusting to Breastfeeding, and (d) Ending Breastfeeding. The two supporting subcategories were (a) Vacillating Between the Good Things and Hard Things About Breastfeeding and (b) Social Support and Other Social Influences. Teenage mothers' breastfeeding experiences may be similar to adult women's breastfeeding experiences, but teenage mothers may require additional breastfeeding support.
Preserving Social Studies as Core Curricula in an Era of Common Core Reform
ERIC Educational Resources Information Center
Denton, David W.; Sink, Cindy
2015-01-01
Education reform over the last two decades has changed perceptions of core curricula. Although social studies has traditionally been part of the core, emphasis on standards-based teaching and learning, along with elaborate accountability schemes, is causing unbalanced treatment of subjects. While the research literature indicates teachers are…
Examining Core Curricula in Writing for Grades 3-5
ERIC Educational Resources Information Center
Holtz, Jill; McCurdy, Merilee; Roehling, Julia V.
2015-01-01
Within a Response to Intervention (RtI) framework, Tier 1 instruction requires the selection of research-based core curricula. However, many educators and administrators are not aware of high-quality core writing curricula. The authors assembled a rubric to assist schools in evaluating core writing curricula for Grades 3-5. Rubric components…
Corp, Nadia; Watt, Fiona E.; Felson, David T.; O’Neill, Terence W.; Holt, Cathy A.; Jones, Richard K.; Conaghan, Philip G.; Arden, Nigel K.
2016-01-01
Objective. Treatment of OA by stratifying for commonly used and novel therapies will likely improve the range of effective therapy options and their rational deployment in this undertreated, chronic disease. In order to develop appropriate datasets for conducting post hoc analyses to inform approaches to stratification for OA, our aim was to develop recommendations on the minimum data that should be recorded at baseline in all future OA interventional and observational studies. Methods. An Arthritis Research UK study group comprising 32 experts used a Delphi-style approach supported by a literature review of systematic reviews to come to a consensus on core data collection for OA studies. Results. Thirty-five systematic reviews were used as the basis for the consensus group discussion. For studies with a primary structural endpoint, core domains for collection were defined as BMI, age, gender, racial origin, comorbidities, baseline OA pain, pain in other joints and occupation. In addition to the items generalizable to all anatomical sites, joint-specific domains included radiographic measures, surgical history and anatomical factors, including alignment. To demonstrate clinical relevance for symptom studies, the collection of mental health score, self-efficacy and depression scales was advised in addition to the above. Conclusions. Currently it is not possible to stratify patients with OA into therapeutic groups. A list of core and optional data to be collected in all OA interventional and observational studies was developed, providing a basis for future analyses to identify predictors of progression or response to treatment. PMID:27084310
Geochemical studies of backfill aggregates, lake sediment cores and the Hueco Bolson Aquifer
NASA Astrophysics Data System (ADS)
Thapalia, Anita
This dissertation comprises three studies focused on applications of geochemistry to backfill aggregates, lake sediment cores and the Hueco Bolson Aquifer. Each study is independent and presented in publication format; the first chapter is already published and the second is in revision. Together, the three studies measure water-rock interactions at both the large (field) and bench (laboratory) scale, influenced by climatic and anthropogenic factors, spanning environmental geology and civil engineering. The first chapter addresses the chemical evaluation of coarse aggregates from six quarries in Texas. The goal of this work is to identify the best geochemical methods for assessing the corrosion potential of coarse aggregates prior to their use in mechanically stabilized earth walls. Electrochemical parameters help define the corrosion potential of aggregates under two different leaching protocols. Testing of the coarse and fine aggregates demonstrates chemical differences due to size-related kinetic leaching effects. Field fines also show chemistry different from the bulk rock, indicating the impact of weathering on carbonate rocks. The second chapter investigates zinc (Zn) isotopic signatures in eight lake sediment cores collected from both pristine lakes and lakes impacted by urban anthropogenic contamination. Zinc from the natural weathering of rocks and from anthropogenic atmospheric pollutants is transported to these lakes, and its signature is recorded in the sediments. Isotopic analysis of the core samples fingerprints the anthropogenic contamination sources, and dated cores allow Zn inputs to be correlated with land-use and population change in the watersheds. Comparison of isotopic data from pristine and urban lake sediment cores also serves as an analog for lake sediment cores elsewhere in the world.
The third chapter studies the Hueco Bolson Aquifer, an important source of water in the El Paso/Cd. Juárez metroplex. To delineate the boundary between fresh and brackish water in the northern Hueco Bolson Aquifer, we use an integrated geochemical, geophysical, and sedimentological approach. The goal of this study is to use geophysical well-log analysis together with water chemistry to identify changes in groundwater quality. A detailed microgravity survey explores the subsurface geological structures that act as conduits and/or barriers to groundwater flow, and detailed geochemical analysis of aquifer samples provides groundwater salinities that complement the subsurface structures obtained from the geophysical study. This fundamental research in developing methods from an integrated approach to estimating aquifer quality can serve as an analog for similar studies in other arid regions.
Levels of reduction in van Manen's phenomenological hermeneutic method: an empirical example.
Heinonen, Kristiina
2015-05-01
To describe reduction as a method using van Manen's phenomenological hermeneutic research approach. Reduction involves several levels that can be distinguished for their methodological usefulness. Researchers can use reduction in different ways and dimensions for their methodological needs. A study of Finnish multiple-birth families in which open interviews (n=38) were conducted with public health nurses, family care workers and parents of twins. A systematic literature and knowledge review showed there were no articles on multiple-birth families that used van Manen's method. Discussion The phenomena of the 'lifeworlds' of multiple-birth families consist of three core essential themes as told by parents: 'a state of constant vigilance', 'ensuring that they can continue to cope' and 'opportunities to share with other people'. Reduction provides the opportunity to carry out in-depth phenomenological hermeneutic research and understand people's lives. It helps to keep research stages separate but also enables a consolidated view. Social care and healthcare professionals have to hear parents' voices better to comprehensively understand their situation; they need further tools and training to be able to empower parents of twins. This paper adds an empirical example to the discussion of phenomenology, hermeneutic study and reduction as a method. It opens up reduction for researchers to exploit.
ERIC Educational Resources Information Center
Moshier, Kenneth
This module on the development of a research proposal in vocational education is one of a set of five on evaluation and research and is part of a larger series of thirty-four modules constituting a core curriculum for use in the professional preparation of vocational educators in the areas of agricultural, business, home economics, and industrial…
Pervasive healthcare as a scientific discipline.
Bardram, J E
2008-01-01
The OECD countries face a set of core challenges: an increasing elderly population, a growing number of chronic and lifestyle-related diseases, an expanding scope of what medicine can do, and an increasing shortage of medical professionals. Pervasive healthcare asks how pervasive computing technology can be designed to meet these challenges. The objective of this paper is to discuss 'pervasive healthcare' as a research field and to establish how novel and distinct it is compared with related work in biomedical engineering, medical informatics, and ubiquitous computing. The paper presents the research questions, approach, technologies, and methods of pervasive healthcare and compares these with those of other related scientific disciplines. A set of central research themes is presented: monitoring and body sensor networks; pervasive assistive technologies; pervasive computing for hospitals; and preventive and persuasive technologies. Two projects illustrate the kind of research being done in pervasive healthcare: the first targets home-based monitoring of hypertension; the second designs context-aware technologies for hospitals. Both projects approach the healthcare challenges in a new way, apply a new type of research method, and arrive at new kinds of technological solutions. 'Clinical proof-of-concept' is recommended as a new method for pervasive healthcare research; it helps design and test pervasive healthcare technologies and ascertain their clinical potential before large-scale clinical tests are needed. The paper concludes that pervasive healthcare as a research field and agenda is novel: it addresses new emerging research questions, represents a novel approach, designs new types of technologies, and applies a new kind of research method.
Method and apparatus for recovering unstable cores
McGuire, Patrick L.; Barraclough, Bruce L.
1983-01-01
A method and apparatus suitable for stabilizing hydrocarbon cores are given. Such stabilized cores have not previously been obtainable for laboratory study, and such study is believed to be required before the hydrate reserves can become a utilizable resource. The apparatus can be built using commercially available parts and is very simple and safe to operate.
The structure and emerging trends of construction safety management research: a bibliometric review.
Liang, Huakang; Zhang, Shoujian; Su, Yikun
2018-03-29
Recently, construction safety management (CSM) practices and systems have become important topics for stakeholders concerned with protecting human resources. However, few studies have attempted to map the global research on CSM. A comprehensive bibliometric review was conducted in this study based on multiple methods. In total, 1172 CSM-related papers from the Web of Science Core Collection database were examined. The analyses focused on publication year, country and institute, publication source, author and research topics. The results indicated that the USA, China, Australia and the UK hold leading positions in CSM research. Two branches of journals were identified: engineering science, and safety science and social science. Additionally, seven themes together with 28 specific topics were detected, allowing researchers to track the main structure and temporal evolution of CSM research. Finally, the main research trends and potential research directions are discussed to guide future research.
Psychotherapy training: Suggestions for core ingredients and future research.
Boswell, James F; Castonguay, Louis G
2007-12-01
Despite our considerable depth and breadth of empirical knowledge on psychotherapy process and outcome, research on psychotherapy training is somewhat lacking. We would argue, however, that the scientist-practitioner model should not only guide practice, but also the way our field approaches training. In this paper we outline our perspective on the crucial elements of psychotherapy training based on available evidence, theory, and clinical experience, focusing specifically on the structure, key components, and important skills to be learned in a successful training program. In addition, we derive specific research directions based on the crucial elements of our proposed training perspective, and offer general considerations for research on training, including method and measurement issues. (PsycINFO Database Record (c) 2010 APA, all rights reserved).
From research to evidence-informed decision making: a systematic approach
Poot, Charlotte C; van der Kleij, Rianne M; Brakema, Evelyn A; Vermond, Debbie; Williams, Siân; Cragg, Liza; van den Broek, Jos M; Chavannes, Niels H
2018-01-01
Abstract Background Knowledge creation forms an integral part of the knowledge-to-action framework aimed at bridging the gap between research and evidence-informed decision making. Although principles of science communication, data visualisation and user-centred design largely impact the effectiveness of communication, their role in knowledge creation is still limited. Hence, this article aims to provide researchers with a systematic approach to putting knowledge creation into practice. Methods A systematic two-phased approach towards knowledge creation was formulated and executed. First, during a preparation phase the purpose and audience of the knowledge were defined. Subsequently, a developmental phase determined how the content is ‘said’ (language) and communicated (channel). This developmental phase proceeded via two pathways: a translational cycle and a design cycle, during which core translational and design components were incorporated. The entire approach was demonstrated by a case study. Results The case study demonstrated how the phases in this systematic approach can be operationalised. It furthermore illustrated how created knowledge can be delivered. Conclusion The proposed approach offers researchers a systematic, practical and easy-to-implement tool to facilitate effective knowledge creation towards decision-makers in healthcare. Through the integration of core components of knowledge creation, evidence-informed decision making will ultimately be optimized. PMID:29538728
Soil hydrophobicity - relating effects at atomic, molecular, core and national scales
NASA Astrophysics Data System (ADS)
Matthews, Peter; Doerr, Stefan; Van Keulen, Geertje; Dudley, Ed; Francis, Lewis; Whalley, Richard; Gazze, Andrea; Hallin, Ingrid; Quinn, Gerry; Sinclair, Kat; Ashton, Rhys
2016-04-01
The detrimental impacts of soil hydrophobicity include increased runoff, erosion and flooding, reduced biomass production, inefficient use of irrigation water and preferential leaching of pollutants. Its impacts may exacerbate flood risk associated with more extreme drought and precipitation events predicted with UK climate change scenarios. The UK's Natural Environment Research Council (NERC) has therefore funded a major research programme to investigate soil hydrophobicity over length scales ranging from atomic through molecular, core and landscape scale. This presentation gives an overview of the findings to date. The programme is predicated on the hypothesis that changes in soil protein abundance and localization, induced by variations in soil moisture and temperature, are crucial driving forces for transitions between hydrophobic and hydrophilic conditions at soil particle surfaces. Three soils were chosen based on the severity of hydrophobicity that can be achieved in the field: severe to extreme (Cefn Bryn, Gower, Wales), intermediate to severe (National Botanical Garden, Wales), and subcritical (Park Grass, Rothamsted Research near London). The latter is already highly characterised so was also used as a control. Hydrophobic/ hydrophilic transitions were measured from water droplet penetration times. 
Scientific advances in the following five areas will be described: (i) the identification of these soil proteins by proteomic methods, using a novel separation method which reduces interference by humic acids, and allows identification by ESI and MALDI TOF mass spectrometry and database searches, (ii) the examination of such proteins, which form ordered hydrophobic ridges, and measurement of their elasticity, stickiness and hydrophobicity at nano- to microscale using atomic force microscopy adapted for the rough surfaces of soil particles, (iii) the novel use of a picoliter goniometer to show hydrophobic effects at a 1 micron diameter droplet level, which avoids the averaging over soil cores and particles evident in microliter goniometry, with which the results are compared, (iv) measurements at core scale using water retention and wicking experiments, and (v) the interpretation, integration and upscaling of the results using a development of the PoreXpert void network model, a significant advance on the Van Genuchten approach. An explanation will also be given as to how the results will be incorporated into the JULES hydrological model of the UK Meteorological Office, used to predict flooding for different soil types and usage.
ENFIN--A European network for integrative systems biology.
Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan
2009-11-01
Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives for enabling research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases and platforms, enabling both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains close internal collaboration between experimental and computational research, enabling a permanent cycle of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods and a novel platform for protein function analysis, FuncNet.
Quantum Physics Principles and Communication in the Acute Healthcare Setting: A Pilot Study.
Helgeson, Heidi L; Peyerl, Colleen Kraft; Solheim-Witt, Marit
This pilot study explores whether clinician awareness of quantum physics principles could facilitate open communication between patients and providers. In the spirit of action research, this study was conceptualized with a holistic view of human health, using a mixed method design of grounded theory as an emergent method. Instrumentation includes surveys and a focus group discussion with twelve registered nurses working in an acute care hospital setting. Findings document that the preliminary core phenomenon, energy as information, influences communication in the healthcare environment. Key emergent themes include awareness, language, validation, open communication, strategies, coherence, incoherence and power. Research participants indicate that quantum physics principles provide a language and conceptual framework for improving their awareness of communication and interactions in the healthcare environment. Implications of this pilot study support the feasibility of future research and education on awareness of quantum physics principles in other clinical settings. Copyright © 2016 Elsevier Inc. All rights reserved.
Li, C T; Shi, C H; Wu, J G; Xu, H M; Zhang, H Z; Ren, Y L
2004-04-01
The selection of an appropriate sampling strategy and a clustering method is important in the construction of core collections based on predicted genotypic values in order to retain the greatest degree of genetic diversity of the initial collection. In this study, methods of developing rice core collections were evaluated based on the predicted genotypic values for 992 rice varieties with 13 quantitative traits. The genotypic values of the traits were predicted by the adjusted unbiased prediction (AUP) method. Based on the predicted genotypic values, Mahalanobis distances were calculated and employed to measure the genetic similarities among the rice varieties. Six hierarchical clustering methods, including the single linkage, median linkage, centroid, unweighted pair-group average, weighted pair-group average and flexible-beta methods, were combined with random, preferred and deviation sampling to develop 18 core collections of rice germplasm. The results show that the deviation sampling strategy in combination with the unweighted pair-group average method of hierarchical clustering retains the greatest degree of genetic diversities of the initial collection. The core collections sampled using predicted genotypic values had more genetic diversity than those based on phenotypic values.
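One of the evaluated combinations (Mahalanobis distances, unweighted pair-group average clustering, then sampling within clusters) can be sketched as follows. The trait matrix is synthetic, and the keep-the-accession-nearest-the-cluster-mean rule is a simplification standing in for the random/preferred/deviation sampling strategies the study compares:

```python
# Toy core-collection construction: Mahalanobis distances between
# accessions, UPGMA ("average") hierarchical clustering, and one
# representative accession sampled per cluster.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
traits = rng.normal(size=(40, 5))       # 40 accessions x 5 trait values

# Mahalanobis distance needs the inverse covariance of the traits.
VI = np.linalg.inv(np.cov(traits, rowvar=False))
d = pdist(traits, metric="mahalanobis", VI=VI)

# UPGMA clustering; cut the dendrogram into 10 clusters (a 25% core).
Z = linkage(d, method="average")
labels = fcluster(Z, t=10, criterion="maxclust")

# Keep the accession closest to each cluster's trait mean.
core = []
for c in np.unique(labels):
    idx = np.flatnonzero(labels == c)
    centroid = traits[idx].mean(axis=0)
    core.append(int(idx[np.linalg.norm(traits[idx] - centroid, axis=1).argmin()]))
print(sorted(core))
```

In the study the trait values entering the distance are genotypic values predicted by the AUP method rather than raw phenotypes, which is what preserves genetic (not merely phenotypic) diversity in the resulting core.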
Fusion splicing small-core photonic crystal fibers and single-mode fibers by repeated arc discharges
NASA Astrophysics Data System (ADS)
Xiao, Limin; Jin, Wei; Demokan, M. S.
2007-01-01
We demonstrate a novel method for low-loss splicing small-core photonic crystal fibers (PCFs) and single-mode fibers (SMFs) by repeated arc discharges using a conventional fusion splicer. An optimum mode field match at the interface of PCF-SMF and an adiabatic mode field variation in the longitudinal direction of the small-core PCF can be achieved by repeated arc discharges applied over the splicing joint to gradually collapse the air holes of the small-core PCF. This method is simple and offers a practical solution for light coupling between small-core PCFs and SMFs.
MCore: A High-Order Finite-Volume Dynamical Core for Atmospheric General Circulation Models
NASA Astrophysics Data System (ADS)
Ullrich, P.; Jablonowski, C.
2011-12-01
The desire for increasingly accurate predictions of the atmosphere has driven numerical models to smaller and smaller resolutions, while exponentially driving up the cost of existing numerical models. Even with the rapid advancement of modern computational performance, it is estimated that it will take more than twenty years before existing models approach the scales needed to resolve atmospheric convection. However, smarter numerical methods may allow us to glimpse the types of results we would expect from these fine-scale simulations at only a fraction of the computational cost. The next generation of atmospheric models will likely need to rely on both high-order accuracy and adaptive mesh refinement in order to properly capture features of interest. We present our ongoing research on developing a set of "smart" numerical methods for simulating the global non-hydrostatic fluid equations which govern atmospheric motions. We have harnessed a high-order finite-volume based approach in developing an atmospheric dynamical core on the cubed-sphere. This type of method is desirable for applications involving adaptive grids, since it has been shown that spuriously reflected wave modes are intrinsically damped out under this approach. The model further makes use of an implicit-explicit Runge-Kutta-Rosenbrock (IMEX-RKR) time integrator for accurate and efficient coupling of the horizontal and vertical model components. We survey the algorithmic development of the model and present results from idealized dynamical core test cases, as well as a glimpse of future work with our model.
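The implicit-explicit splitting at the heart of such an integrator can be illustrated on a toy stiff ODE. The sketch below is first-order IMEX Euler on an invented equation, not the model's actual Runge-Kutta-Rosenbrock scheme; it shows why treating the stiff term implicitly removes the explicit time-step restriction:

```python
# Minimal IMEX idea: stiff terms (e.g. vertically propagating acoustic
# modes) implicit, slow terms explicit. Toy ODE: y' = -k*y + sin(t).
import math

def imex_euler(y, t, dt, k_stiff, forcing):
    """One IMEX Euler step for y' = -k_stiff*y + forcing(t):
    forcing taken explicitly, stiff decay implicitly, so the
    step is stable for any dt > 0."""
    return (y + dt * forcing(t)) / (1.0 + dt * k_stiff)

k, dt = 1000.0, 0.1          # dt is 50x the explicit stability limit 2/k
y, t = 1.0, 0.0
for _ in range(100):
    y = imex_euler(y, t, dt, k, math.sin)
    t += dt

# The solution stays bounded and tracks the slow quasi-steady state
# y ~ sin(t)/k despite dt*k = 100.
print(y, math.sin(t) / k)
```

A fully explicit Euler step with the same dt would blow up immediately, which is the motivation for coupling the fast vertical dynamics implicitly while keeping the cheaper explicit treatment for the horizontal terms.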
NASA Astrophysics Data System (ADS)
Li, Gong Ping; Chen, Rui; Guo, Dong Lai; Wong, Lai Mun; Wang, Shi Jie; Sun, Han Dong; Wu, Tom
2011-08-01
Controllably constructing hierarchical nanostructures with distinct components and designed architectures is an important theme of research in nanoscience, entailing novel but reliable approaches of bottom-up synthesis. Here, we report a facile method to reproducibly create semiconductor-insulator-metal core/shell nanostructures, which involves first coating uniform MgO shells onto metal oxide nanostructures in solution and then decorating them with Au nanoparticles. The semiconductor nanowire core can be almost any material and, herein, ZnO, SnO2 and In2O3 are used as examples. We also show that linear chains of short ZnO nanorods embedded in MgO nanotubes and porous MgO nanotubes can be obtained by taking advantage of the reduced thermal stability of the ZnO core. Furthermore, after MgO shell-coating and the appropriate annealing treatment, the intensity of the ZnO near-band-edge UV emission becomes much stronger, showing a 25-fold enhancement. The intensity ratio of the UV/visible emission can be increased further by decorating the surface of the ZnO/MgO nanowires with high-density plasmonic Au nanoparticles. These heterostructured semiconductor-insulator-metal nanowires with tailored morphologies and enhanced functionalities have great potential for use as nanoscale building blocks in photonic and electronic applications. Electronic supplementary information (ESI) available: Representative SEM and TEM images of 700 °C annealed ZnO/MgO core/shell NWs, a TEM image of an individual MgO nanocrystal inside the MgO NTs and SEM images of SnO2 NP chains embedded in MgO NTs and comb-shaped MgO hollow nanostructures. See DOI: 10.1039/c1nr10352k
Thermal Imaging to Study Stress Non-invasively in Unrestrained Birds.
Jerem, Paul; Herborn, Katherine; McCafferty, Dominic; McKeegan, Dorothy; Nager, Ruedi
2015-11-06
Stress, a central concept in biology, describes a suite of emergency responses to challenges. Among other responses, stress leads to a change in blood flow that results in a net influx of blood to key organs and an increase in core temperature. This stress-induced hyperthermia is used to assess stress. However, measuring core temperature is invasive. As blood flow is redirected to the core, the periphery of the body can cool. This paper describes a protocol where peripheral body temperature is measured non-invasively in wild blue tits (Cyanistes caeruleus) using infrared thermography. In the field we created a set-up bringing the birds to an ideal position in front of the camera by using a baited box. The camera takes a short thermal video recording of the undisturbed bird before applying a mild stressor (closing the box and therefore capturing the bird), and the bird's response to being trapped is recorded. The bare skin of the eye-region is the warmest area in the image. This allows an automated extraction of the maximum eye-region temperature from each image frame, followed by further steps of manual data filtering removing the most common sources of errors (motion blur, blinking). This protocol provides a time series of eye-region temperature with a fine temporal resolution that allows us to study the dynamics of the stress response non-invasively. Further work needs to demonstrate the usefulness of the method to assess stress, for instance to investigate whether eye-region temperature response is proportional to the strength of the stressor. If this can be confirmed, it will provide a valuable alternative method of stress assessment in animals and will be useful to a wide range of researchers from ecologists, conservation biologists, physiologists to animal welfare researchers.
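The frame-processing step this protocol describes (extract the per-frame maximum eye-region temperature, then filter out frames corrupted by blinking or motion blur) can be sketched in a few lines. The frame values and the jump threshold below are illustrative assumptions, not data or parameters from the paper.

```python
# Per-frame maximum temperature extraction from thermal video frames (each frame
# a 2-D grid of temperatures in °C), followed by a crude artifact filter that
# discards frames whose maximum jumps implausibly relative to the last kept frame
# (as happens when the warm eye-region is hidden by a blink or smeared by motion).
def max_eye_temps(frames):
    return [max(max(row) for row in frame) for frame in frames]

def filter_jumps(temps, max_step=2.0):
    kept = [temps[0]]
    for t in temps[1:]:
        if abs(t - kept[-1]) <= max_step:
            kept.append(t)
    return kept

frames = [
    [[30.1, 34.2], [31.0, 33.8]],   # frame 1: max 34.2
    [[30.0, 34.0], [30.9, 33.5]],   # frame 2: max 34.0
    [[29.5, 28.9], [29.0, 28.4]],   # frame 3: blink, max drops to 29.5
    [[30.2, 34.1], [31.1, 33.9]],   # frame 4: max 34.1
]
print(filter_jumps(max_eye_temps(frames)))  # the blink frame is discarded
```

The filtered series is the kind of fine-resolution eye-region temperature time series the protocol uses to follow the stress response.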
Thermal Imaging to Study Stress Non-invasively in Unrestrained Birds
Jerem, Paul; Herborn, Katherine; McCafferty, Dominic; McKeegan, Dorothy; Nager, Ruedi
2015-01-01
Stress, a central concept in biology, describes a suite of emergency responses to challenges. Among other responses, stress leads to a change in blood flow that results in a net influx of blood to key organs and an increase in core temperature. This stress-induced hyperthermia is used to assess stress. However, measuring core temperature is invasive. As blood flow is redirected to the core, the periphery of the body can cool. This paper describes a protocol where peripheral body temperature is measured non-invasively in wild blue tits (Cyanistes caeruleus) using infrared thermography. In the field we created a set-up bringing the birds to an ideal position in front of the camera by using a baited box. The camera takes a short thermal video recording of the undisturbed bird before applying a mild stressor (closing the box and therefore capturing the bird), and the bird’s response to being trapped is recorded. The bare skin of the eye-region is the warmest area in the image. This allows an automated extraction of the maximum eye-region temperature from each image frame, followed by further steps of manual data filtering removing the most common sources of errors (motion blur, blinking). This protocol provides a time series of eye-region temperature with a fine temporal resolution that allows us to study the dynamics of the stress response non-invasively. Further work needs to demonstrate the usefulness of the method to assess stress, for instance to investigate whether eye-region temperature response is proportional to the strength of the stressor. If this can be confirmed, it will provide a valuable alternative method of stress assessment in animals and will be useful to a wide range of researchers from ecologists, conservation biologists, physiologists to animal welfare researchers. PMID:26575985
Shaped nanocrystal particles and methods for making the same
Alivisatos, A Paul [Oakland, CA; Scher, Erik C [Menlo Park, CA; Manna, Liberato [Berkeley, CA
2011-11-22
Shaped nanocrystal particles and methods for making shaped nanocrystal particles are disclosed. One embodiment includes a method for forming a branched nanocrystal particle. It includes (a) forming a core having a first crystal structure in a solution, (b) forming a first arm extending from the core having a second crystal structure in the solution, and (c) forming a second arm extending from the core having the second crystal structure in the solution.
Shaped nanocrystal particles and methods for making the same
Alivisatos, A. Paul; Scher, Erik C; Manna, Liberato
2013-12-17
Shaped nanocrystal particles and methods for making shaped nanocrystal particles are disclosed. One embodiment includes a method for forming a branched nanocrystal particle. It includes (a) forming a core having a first crystal structure in a solution, (b) forming a first arm extending from the core having a second crystal structure in the solution, and (c) forming a second arm extending from the core having the second crystal structure in the solution.
Shaped nanocrystal particles and methods for making the same
Alivisatos, A. Paul; Scher, Erik C.; Manna, Liberato
2007-12-25
Shaped nanocrystal particles and methods for making shaped nanocrystal particles are disclosed. One embodiment includes a method for forming a branched nanocrystal particle. It includes (a) forming a core having a first crystal structure in a solution, (b) forming a first arm extending from the core having a second crystal structure in the solution, and (c) forming a second arm extending from the core having the second crystal structure in the solution.
Shaped Nanocrystal Particles And Methods For Making The Same
Alivisatos, A. Paul; Scher, Erik C.; Manna, Liberato
2005-02-15
Shaped nanocrystal particles and methods for making shaped nanocrystal particles are disclosed. One embodiment includes a method for forming a branched nanocrystal particle. It includes (a) forming a core having a first crystal structure in a solution, (b) forming a first arm extending from the core having a second crystal structure in the solution, and (c) forming a second arm extending from the core having the second crystal structure in the solution.
Core outcome sets and trial registries.
Clarke, Mike; Williamson, Paula
2015-05-14
Some reasons for registering trials might be considered as self-serving, such as satisfying the requirements of a journal in which the researchers wish to publish their eventual findings or publicising the trial to boost recruitment. Registry entries also help others, including systematic reviewers, to know about ongoing or unpublished studies and contribute to reducing research waste by making it clear what studies are ongoing. Other sources of research waste include inconsistency in outcome measurement across trials in the same area, missing data on important outcomes from some trials, and selective reporting of outcomes. One way to reduce this waste is through the use of core outcome sets: standardised sets of outcomes for research in specific areas of health and social care. These do not restrict the outcomes that will be measured, but provide the minimum to include if a trial is to be of the most use to potential users. We propose that trial registries, such as ISRCTN, encourage researchers to note their use of a core outcome set in their entry. This will help people searching for trials and those worried about selective reporting in closed trials. Trial registries can facilitate these efforts to make new trials as useful as possible and reduce waste. The outcomes section in the entry could prompt the researcher to consider using a core outcome set and facilitate the specification of that core outcome set and its component outcomes through linking to the original core outcome set. In doing this, registries will contribute to the global effort to ensure that trials answer important uncertainties, can be brought together in systematic reviews, and better serve their ultimate aim of improving health and well-being through improving health and social care.
Goldhahn, Jörg; Beaton, Dorcas; Ladd, Amy; Macdermid, Joy; Hoang-Kim, Amy
2014-02-01
Lack of standardization of outcome measurement has hampered an evidence-based approach to clinical practice and research. We adopted a process of reviewing evidence on current use of measures and appropriate theoretical frameworks for health and disability to inform a consensus process that was focused on deriving the minimal set of core domains in distal radius fracture. We agreed on the following seven core recommendations: (1) pain and function were regarded as the primary domains, (2) very brief measures were needed for routine administration in clinical practice, (3) these brief measures could be augmented by additional measures that provide more detail or address additional domains for clinical research, (4) measurement of pain should include measures of both intensity and frequency as core attributes, (5) a numeric pain scale, e.g. visual analogue scale or visual numeric scale or the pain subscale of the patient-reported wrist evaluation (PRWE) questionnaires were identified as reliable, valid and feasible measures to measure these concepts, (6) for function, either the Quick Disability of the arm, shoulder and hand questionnaire or PRWE-function subscale was identified as reliable, valid and feasible measures, and (7) a measure of participation and treatment complications should be considered core outcomes for both clinical practice and research. We used a sound methodological approach to form a comprehensive foundation of content for outcomes in the area of distal radius fractures. We recommend the use of symptom and function as separate domains in the ICF core set in clinical research or practice for patients with wrist fracture. Further research is needed to provide more definitive measurement properties of measures across all domains.
Producing gapped-ferrite transformer cores
NASA Technical Reports Server (NTRS)
Mclyman, W. T.
1980-01-01
Improved manufacturing techniques make reproducible gaps and minimize cracking. Molded, unfired transformer cores are cut with thin saw and then fired. Hardened semicircular core sections are bonded together, placed in aluminum core box, and fluidized-coated. After winding is run over box, core is potted. Economical method significantly reduces number of rejects.
Traditions of research into interruptions in healthcare: A conceptual review.
McCurdie, Tara; Sanderson, Penelope; Aitken, Leanne M
2017-01-01
Researchers from diverse theoretical backgrounds have studied workplace interruptions in healthcare, leading to a complex and conflicting body of literature. Understanding pre-existing viewpoints may advance the field more effectively than attempts to remove bias from investigations. To identify research traditions that have motivated and guided interruptions research, and to note research questions posed, gaps in approach, and possible avenues for future research. A critical review was conducted of research on interruptions in healthcare. Two researchers identified core research communities based on the community's motivations, philosophical outlook, and methods. Among the characteristics used to categorise papers into research communities were the predominant motivation for studying interruptions, the research questions posed, and key contributions to the body of knowledge on interruptions in healthcare. In cases where a paper approached an equal number of characteristics from two traditions, it was placed in a blended research community. A total of 141 papers were identified and categorised; all papers identified were published from 1994 onwards. Four principal research communities emerged: epidemiology, quality improvement, cognitive systems engineering (CSE), and applied cognitive psychology. Blends and areas of mutual influence between the research communities were identified that combine the benefits of individual traditions, but there was a notable lack of blends incorporating quality improvement initiatives. The question most commonly posed by researchers across multiple communities was: what is the impact of interruptions? Impact was measured as a function of task time or risk in the epidemiology tradition, situation awareness in the CSE tradition, or resumption lag (time to resume an interrupted task) in the applied cognitive psychology tradition. No single question about interruptions in healthcare was shared by all four of the core communities. 
Much research on workplace interruptions in healthcare can be described in terms of the fundamental values of four distinct research traditions and the communities that bring the values and methods of those traditions to their investigations. Blends between communities indicate that mutual influence has occurred as interruptions research has progressed. It is clear from this review that there is no single or privileged perspective from which to study interruptions. Instead, these findings suggest that researchers investigating interruptions in healthcare would benefit from being more aware of perspectives different from their own, especially when they consider workplace interventions to reduce interruptions. Copyright © 2016. Published by Elsevier Ltd.
Webster, Lucy; Groskreutz, Derek; Grinbergs-Saull, Anna; Howard, Rob; O'Brien, John T; Mountain, Gail; Banerjee, Sube; Woods, Bob; Perneczky, Robert; Lafortune, Louise; Roberts, Charlotte; McCleery, Jenny; Pickett, James; Bunn, Frances; Challis, David; Charlesworth, Georgina; Featherstone, Katie; Fox, Chris; Goodman, Claire; Jones, Roy; Lamb, Sallie; Moniz-Cook, Esme; Schneider, Justine; Shepperd, Sasha; Surr, Claire; Thompson-Coon, Jo; Ballard, Clive; Brayne, Carol; Burke, Orlaith; Burns, Alistair; Clare, Linda; Garrard, Peter; Kehoe, Patrick; Passmore, Peter; Holmes, Clive; Maidment, Ian; Murtagh, Fliss; Robinson, Louise; Livingston, Gill
2017-01-01
BACKGROUND There is currently no disease-modifying treatment available to halt or delay the progression of the disease pathology in dementia. An agreed core set of the best-available and most appropriate outcomes for disease modification would facilitate the design of trials and ensure consistency across disease modification trials, as well as making results comparable and meta-analysable in future trials. OBJECTIVES To agree a set of core outcomes for disease modification trials for mild to moderate dementia with the UK dementia research community and patient and public involvement (PPI). DATA SOURCES We included disease modification trials with quantitative outcomes of efficacy from (1) references from related systematic reviews in workstream 1; (2) searches of the Cochrane Dementia and Cognitive Improvement Group study register, Cochrane Central Register of Controlled Trials, Cumulative Index to Nursing and Allied Health Literature, EMBASE, Latin American and Caribbean Health Sciences Literature and PsycINFO on 11 December 2015, and clinical trial registries [International Standard Randomised Controlled Trial Number (ISRCTN) and clinicaltrials.gov] on 22 and 29 January 2016; and (3) hand-searches of reference lists of relevant systematic reviews from database searches. REVIEW METHODS The project consisted of four workstreams. (1) We obtained related core outcome sets and work from co-applicants. (2) We systematically reviewed published and ongoing disease modification trials to identify the outcomes used in different domains. We extracted outcomes used in each trial, recording how many used each outcome and with how many participants. We divided outcomes into the domains measured and searched for validation data. (3) We consulted with PPI participants about recommended outcomes. 
(4) We presented all the synthesised information at a conference attended by the wider body of National Institute for Health Research (NIHR) dementia researchers to reach consensus on a core set of outcomes. RESULTS We included 149 papers from the 22,918 papers screened, referring to 125 individual trials. Eighty-one outcomes were used across trials, including 72 scales [31 cognitive, 12 activities of daily living (ADLs), 10 global, 16 neuropsychiatric and three quality of life] and nine biological techniques. We consulted with 18 people for PPI. The conference decided that only cognition and biological markers are core measures of disease modification. Cognition should be measured by the Mini Mental State Examination (MMSE) or the Alzheimer's Disease Assessment Scale - Cognitive subscale (ADAS-Cog), and brain changes through structural magnetic resonance imaging (MRI) in a subset of participants. All other domains are important but not core. We recommend using the Neuropsychiatric Inventory for neuropsychiatric symptoms, the Disability Assessment for Dementia for ADLs, the Dementia Quality of Life Measure for quality of life and the Clinical Dementia Rating scale to measure dementia globally. LIMITATIONS Most of the trials included participants with Alzheimer's disease, so recommendations may not apply to other types of dementia. We did not conduct economic analyses. The PPI consultation was limited to members of the Alzheimer's Society Research Network. CONCLUSIONS Cognitive outcomes and biological markers form the core outcome set for future disease modification trials, measured by the MMSE or ADAS-Cog, and structural MRI in a subset of participants. FUTURE WORK We envisage that the core set may be superseded in the future, particularly for other types of dementia. There is a need to develop an algorithm to compare scores on the MMSE and ADAS-Cog.
STUDY REGISTRATION The project was registered with Core Outcome Measures in Effectiveness Trials [ www.comet-initiative.org/studies/details/819?result=true (accessed 7 April 2016)]. The systematic review protocol is registered as PROSPERO CRD42015027346. FUNDING The National Institute for Health Research Health Technology Assessment programme. PMID:28625273
USGS leads United States effort in Mallik Well
2002-01-01
This winter, in the extremely cold, far reaches of the upper Northwest Territories of Canada, an international consortium of researchers is participating in a program to study methane hydrates. The researchers are currently drilling a 1200 m-deep production research well through the permafrost. It is one of three wells located in the Mackenzie Delta, on the shore of the Beaufort Sea. Two observation wells were drilled adjacent to the main production test well earlier this year. Research objectives for the program focus on two themes: (1) the assessment of the production and properties of gas hydrates, and (2) an assessment of the stability of continental gas hydrates given warming trends predicted by climate change models. Of particular interest is the physical response of the gas hydrate to depressurization and thermal production stimulation. Cores are being taken from the well, and scientists hope to retrieve at least 200 m of core, including all the gas hydrate-rich intervals. Once cored, the samples are transported 200 kilometers over ice roads to Inuvik. Nearly 60 researchers are examining the cores for everything from geophysical parameters to microbiological analyses.
Morris, Christopher; Dunkley, Colin; Gibbon, Frances M; Currier, Janet; Roberts, Deborah; Rogers, Morwenna; Crudgington, Holly; Bray, Lucy; Carter, Bernie; Hughes, Dyfrig; Tudur Smith, Catrin; Williamson, Paula R; Gringras, Paul; Pal, Deb K
2017-11-28
There is increasing recognition that establishing a core set of outcomes to be evaluated and reported in trials of interventions for particular conditions will improve the usefulness of health research. There is no established core outcome set for childhood epilepsy. The aim of this work is to select a core outcome set to be used in evaluative research of interventions for children with rolandic epilepsy, as an exemplar of common childhood epilepsy syndromes. First we will identify what outcomes should be measured; then we will decide how to measure those outcomes. We will engage relevant UK charities and health professional societies as partners, and convene advisory panels for young people with epilepsy and parents of children with epilepsy. We will identify candidate outcomes from a search for trials of interventions for childhood epilepsy, statutory guidance and consultation with our advisory panels. Families, charities and health, education and neuropsychology professionals will be invited to participate in a Delphi survey following recommended practices in the development of core outcome sets. Participants will be able to recommend additional outcome domains. Over three rounds of Delphi survey participants will rate the importance of candidate outcome domains and state the rationale for their decisions. Over the three rounds we will seek consensus across and between families and health professionals on the more important outcomes. A face-to-face meeting will be convened to ratify the core outcome set. We will then review and recommend ways to measure the shortlisted outcomes using clinical assessment and/or patient-reported outcome measures. Our methodology is a proportionate and pragmatic approach to expediently produce a core outcome set for evaluative research of interventions aiming to improve the health of children with epilepsy. 
A number of decisions have to be made when designing a study to develop a core outcome set, including defining the scope, choosing which stakeholders to engage, identifying the most effective ways to elicit their views (especially from children), and considering a potential role for qualitative research.
Data quality assurance and control in cognitive research: Lessons learned from the PREDICT-HD study.
Westervelt, Holly James; Bernier, Rachel A; Faust, Melanie; Gover, Mary; Bockholt, H Jeremy; Zschiegner, Roland; Long, Jeffrey D; Paulsen, Jane S
2017-09-01
We discuss the strategies employed in data quality control and quality assurance for the cognitive core of Neurobiological Predictors of Huntington's Disease (PREDICT-HD), a long-term observational study of over 1,000 participants with prodromal Huntington disease. In particular, we provide details regarding the training and continual evaluation of cognitive examiners, methods for error corrections, and strategies to minimize errors in the data. We present five important lessons learned to help other researchers avoid certain assumptions that could potentially lead to inaccuracies in their cognitive data. Copyright © 2017 John Wiley & Sons, Ltd.
Identifying influential spreaders in complex networks based on kshell hybrid method
NASA Astrophysics Data System (ADS)
Namtirtha, Amrita; Dutta, Animesh; Dutta, Biswanath
2018-06-01
Influential spreaders are the key players in maximizing or controlling the spreading in a complex network. Identifying influential spreaders using the kshell decomposition method has become very popular in recent years. In the literature, the core nodes, i.e. those with the largest kshell index of a network, are considered the most influential spreaders. We have studied the kshell method and the spreading dynamics of nodes using the Susceptible-Infected-Recovered (SIR) epidemic model to understand the behavior of influential spreaders in terms of their topological location in the network. From the study, we have found that not every node in the core area is the most influential spreader. Even a strategically placed lower-shell node can be a most influential spreader. Moreover, the core area can also be situated at the periphery of the network. The existing indexing methods are only designed to identify the most influential spreaders from core nodes and not from lower shells. In this work, we propose a kshell hybrid method to identify highly influential spreaders not only from the core but also from lower shells. The proposed method combines parameters such as kshell power, node's degree, contact distance, and many levels of neighbors' influence potential. The proposed method is evaluated using nine real-world network datasets. In terms of the spreading dynamics, the experimental results show the superiority of the proposed method over the other existing indexing methods such as the kshell method, the neighborhood coreness centrality, the mixed degree decomposition, etc. Furthermore, the proposed method can also be applied to large-scale networks by considering the three levels of neighbors' influence potential.
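The classical peeling procedure that the kshell hybrid method builds on can be sketched compactly. This is plain k-shell decomposition, not the paper's hybrid index, and the toy graph is an illustrative assumption.

```python
# k-shell decomposition by iterative peeling: repeatedly remove the current
# minimum-degree nodes and label each node with the level at which it falls out.
# Nodes with the largest shell index form the network's core.
def k_shell(adj):
    adj = {u: set(vs) for u, vs in adj.items()}  # work on a copy
    shell = {}
    while adj:
        k = min(len(vs) for vs in adj.values())  # current peeling level
        peel = [u for u, vs in adj.items() if len(vs) <= k]
        while peel:
            u = peel.pop()
            if u not in adj:
                continue
            shell[u] = k
            for v in adj.pop(u):
                if v in adj:
                    adj[v].discard(u)
                    if len(adj[v]) <= k:  # degree dropped to this level too
                        peel.append(v)
    return shell

# Toy network: a triangle (the 2-core) with one pendant node hanging off it.
g = {"a": {"b", "c", "d"}, "b": {"a", "c"}, "c": {"a", "b"}, "d": {"a"}}
print(k_shell(g))  # pendant node gets shell 1, triangle nodes shell 2
```

The hybrid index the paper proposes then re-ranks nodes within and across shells using degree, contact distance, and neighbors' influence potential.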
Effect of verification cores on tip capacity of drilled shafts.
DOT National Transportation Integrated Search
2009-02-01
This research addressed two key issues: 1) Will verification core holes fill during concrete backfilling? If so, what are the mechanical properties of the filling material? In dry conditions, verification core holes always completely fill with c...
Characterising and modelling regolith stratigraphy using multiple geophysical techniques
NASA Astrophysics Data System (ADS)
Thomas, M.; Cremasco, D.; Fotheringham, T.; Hatch, M. A.; Triantifillis, J.; Wilford, J.
2013-12-01
Regolith is the weathered, typically mineral-rich layer from fresh bedrock to land surface. It encompasses soil (A, E and B horizons) that has undergone pedogenesis. Below is the weathered C horizon that retains at least some of the original rocky fabric and structure. At the base of this is the lower regolith boundary of continuous hard bedrock (the R horizon). Regolith may be absent, e.g. at rocky outcrops, or may be many 10's of metres deep. Comparatively little is known about regolith, and critical questions remain regarding composition and characteristics - especially deeper where the challenge of collecting reliable data increases with depth. In Australia research is underway to characterise and map regolith using consistent methods at scales ranging from local (e.g. hillslope) to continental scales. These efforts are driven by many research needs, including Critical Zone modelling and simulation. Pilot research in South Australia using digitally-based environmental correlation techniques modelled the depth to bedrock to 9 m for an upland area of 128 000 ha. One finding was the inability to reliably model local scale depth variations over horizontal distances of 2 - 3 m and vertical distances of 1 - 2 m. The need to better characterise variations in regolith to strengthen models at these fine scales was discussed. Addressing this need, we describe high intensity, ground-based multi-sensor geophysical profiling of three hillslope transects in different regolith-landscape settings to characterise fine resolution (i.e. < 1 m) regolith stratigraphy. The geophysics included: ground penetrating radar collected at a number of frequencies; multiple frequency, multiple coil electromagnetic induction; and high resolution resistivity. These were accompanied by georeferenced, closely spaced deep cores to 9 m - or to core refusal. 
The intact cores were sub-sampled to standard depths and analysed for regolith properties to compile core datasets consisting of: water content; texture; electrical conductivity; and weathered state. After preprocessing (filtering, geo-registration, depth correction, etc.) each geophysical profile was evaluated by matching the core data. Applying traditional geophysical techniques, the best profiles were inverted using the core data creating two-dimensional (2-D) stratigraphic regolith models for each transect, and evaluated using independent validation. Next, in a test of an alternative method borrowed from digital soil mapping, the best preprocessed geophysical profiles were co-registered and stratigraphic models for each property created using multivariate environmental correlation. After independent validation, the qualities of the latest models were compared to the traditionally derived 2-D inverted models. Finally, the best overall stratigraphic models were used in conjunction with local environmental data (e.g. geology, geochemistry, terrain, soils) to create conceptual regolith hillslope models for each transect highlighting important features and processes, e.g. morphology, hydropedology and weathering characteristics. Results are presented with recommendations regarding the use of geophysics in modelling regolith stratigraphy at fine scales.
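The "environmental correlation" step described above (fitting core-measured regolith properties to co-located geophysical signals, then predicting the property where only geophysics exists) can be sketched as an ordinary least-squares fit. The property, signal, and all numeric values below are illustrative assumptions, not data from this study.

```python
# One-variable ordinary least squares: fit a core-measured regolith property
# (here, a made-up water content, %) against a co-located geophysical signal
# (a made-up apparent resistivity, ohm-m), then predict at an un-cored site.
def ols_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx  # (slope, intercept)

resistivity = [10.0, 20.0, 30.0, 40.0]   # geophysical signal at cored sites
water = [22.0, 18.0, 14.0, 10.0]         # core-measured property at same sites
a, b = ols_fit(resistivity, water)
print(a, b)          # slope ≈ -0.4, intercept ≈ 26.0
print(a * 25.0 + b)  # predicted water content at an un-cored site
```

In practice the study's multivariate environmental correlation would use many co-registered geophysical layers and more flexible models, but the prediction logic is the same.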
Cosmic ray radiography of the damaged cores of the Fukushima reactors
Borozdin, Konstantin; Greene, Steven; Lukić, Zarija; ...
2012-10-11
The passage of muons through matter is dominated by the Coulomb interaction with electrons and nuclei. The interaction with the electrons leads to continuous energy loss and stopping of the muons. The interaction with nuclei leads to angle “diffusion.” Two muon-imaging methods that use flux attenuation and multiple Coulomb scattering of cosmic-ray muons are being studied as tools for diagnosing the damaged cores of the Fukushima reactors. Here, we compare these two methods. We conclude that the scattering method can provide detailed information about the core. Lastly, attenuation has low contrast and little sensitivity to the core.
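The contrast the scattering method exploits comes from the RMS multiple-Coulomb-scattering angle growing with the thickness (in radiation lengths) of the material traversed; dense, high-Z material such as reactor-core fuel has a short radiation length. That angle is commonly estimated with the Highland (PDG) form; the muon momentum and thickness below are illustrative choices, not values from this study.

```python
import math

def highland_theta0(p_mev, beta, x_over_x0):
    """RMS projected multiple-scattering angle in radians (Highland/PDG form)
    for a unit-charge particle of momentum p_mev (MeV/c) and velocity beta*c
    crossing x_over_x0 radiation lengths of material."""
    return (13.6 / (beta * p_mev)) * math.sqrt(x_over_x0) \
        * (1.0 + 0.038 * math.log(x_over_x0))

# A ~3 GeV/c cosmic-ray muon (beta ≈ 1) crossing one radiation length scatters
# by only a few milliradians; thick uranium fuel, with many radiation lengths,
# produces much larger angles, which is what the scattering method images.
theta = highland_theta0(p_mev=3000.0, beta=1.0, x_over_x0=1.0)
print(theta)  # ≈ 4.5e-3 rad
```

Attenuation imaging, by contrast, depends on muons actually stopping, which for GeV-scale muons requires a great deal of material and hence yields the low contrast noted in the abstract.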
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grant, Sean Campbell; Ao, Tommy; Davis, Jean-Paul
The CHEDS researchers are engaged in a collaborative research project to study the properties of iron and iron alloys under Earth’s core conditions. The Earth’s core, inner and outer, is composed primarily of iron; thus, studying iron and iron alloys at high pressure and temperature conditions will give the best estimate of its properties. Also, comparing studies of iron alloys with known properties of the core can constrain the potential light-element compositions found within the core, such as by fitting sound speeds and densities of iron alloys to established inner-Earth models. One of the less well established properties of the core is the thermal conductivity, for which current estimates vary by a factor of three. Therefore, one of the primary goals of this collaboration is to make relevant measurements to elucidate this conductivity.
ERIC Educational Resources Information Center
Hough, Heather; Kalogrides, Demetra; Loeb, Susanna
2017-01-01
The research featured in this paper is part of the CORE-PACE Research Partnership, through which Policy Analysis for California Education (PACE) has partnered with the CORE districts to conduct research designed to support them in continuous improvement while simultaneously helping to improve policy and practice in California and nationwide.…
Tutoring for Success: Empowering Graduate Nurses After Failure on the NCLEX-RN.
Lutter, Stacy L; Thompson, Cheryl W; Condon, Marian C
2017-12-01
Failure on the National Council Licensure Examination for Registered Nurses (NCLEX-RN) is a devastating experience. Most research related to the NCLEX-RN is focused on predicting and preventing failure. Despite these efforts, more than 20,000 nursing school graduates experience failure on the NCLEX-RN each year, and there is a paucity of literature regarding remediation after failure. The aim of this article is to describe an individualized tutoring approach centered on establishing a trusting relationship and incorporating two core strategies for remediation: the nugget method and a six-step strategy for question analysis. This individualized tutoring method has been used by three nursing faculty members with a 95% success rate on an NCLEX retake attempt. Further research is needed to identify the elements of this tutoring method that influence success. [J Nurs Educ. 2017;56(12):758-761.]. Copyright 2017, SLACK Incorporated.
Liakata, Maria; Saha, Shyamasree; Dobnik, Simon; Batchelor, Colin; Rebholz-Schuhmann, Dietrich
2012-04-01
Scholarly biomedical publications report on the findings of a research investigation. Scientists use a well-established discourse structure to relate their work to the state of the art, express their own motivation and hypotheses and report on their methods, results and conclusions. In previous work, we have proposed ways to explicitly annotate the structure of scientific investigations in scholarly publications. Here we present the means to facilitate automatic access to the scientific discourse of articles by automating the recognition of 11 categories at the sentence level, which we call Core Scientific Concepts (CoreSCs). These include: Hypothesis, Motivation, Goal, Object, Background, Method, Experiment, Model, Observation, Result and Conclusion. CoreSCs provide the structure and context to all statements and relations within an article and their automatic recognition can greatly facilitate biomedical information extraction by characterizing the different types of facts, hypotheses and evidence available in a scientific publication. We have trained and compared machine learning classifiers (support vector machines and conditional random fields) on a corpus of 265 full articles in biochemistry and chemistry to automatically recognize CoreSCs. We have evaluated our automatic classifications against a manually annotated gold standard, and have achieved promising accuracies with 'Experiment', 'Background' and 'Model' being the categories with the highest F1-scores (76%, 62% and 53%, respectively). We have analysed the task of CoreSC annotation both from a sentence classification as well as sequence labelling perspective and we present a detailed feature evaluation. The most discriminative features are local sentence features such as unigrams, bigrams and grammatical dependencies while features encoding the document structure, such as section headings, also play an important role for some of the categories. 
We discuss the usefulness of automatically generated CoreSCs in two biomedical applications, as well as work in progress. A web-based tool for the automatic annotation of articles with CoreSCs and corresponding documentation is available online at http://www.sapientaproject.com/software. The site http://www.sapientaproject.com also contains detailed information pertaining to CoreSC annotation and links to annotation guidelines, as well as a corpus of manually annotated articles, which served as our training data. Contact: liakata@ebi.ac.uk. Supplementary data are available at Bioinformatics online.
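The sentence-classification view of CoreSC recognition, with n-gram features feeding a support vector machine, can be sketched as follows (a toy illustration only: the four-sentence training set is invented, and the real system uses richer features such as grammatical dependencies and section headings):

```python
# Minimal sketch of sentence-level CoreSC classification with an SVM
# over unigram/bigram TF-IDF features.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

sentences = [
    "We hypothesise that the enzyme binds the substrate cooperatively.",
    "Samples were incubated at 37 degrees for two hours.",
    "The measured rate constant was 4.2 per second.",
    "These findings suggest a two-step binding mechanism.",
]
labels = ["Hypothesis", "Experiment", "Result", "Conclusion"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(sentences, labels)
pred = clf.predict(["The reaction mixture was incubated overnight."])[0]
```

A sequence-labelling variant (e.g. a conditional random field over consecutive sentences) would additionally exploit the fact that CoreSC categories tend to occur in characteristic orders within an article.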
NASA Astrophysics Data System (ADS)
Thompson, Nick; Watters, Robert J.; Schiffman, Peter
2008-04-01
Hawaiian Island flank failures are recognized as the largest landslide events on Earth, reaching volumes of several thousand cubic kilometers and lengths of over 200 km and occurring on average once every 100 000 years. The 3.1 km deep Hawaii Scientific Drilling Project (HSDP) enabled an investigation of the rock mass strength variations on the island of Hawaii [Schiffman, P., Watters, R.J., Thompson, N., Walton, A.W., 2006. Hyaloclastites and the slope stability of Hawaiian volcanoes: insights from the Hawaiian Scientific Drilling Project's 3-km drill core. Journal of Volcanology and Geothermal Research, 151 (1-3): 217-228]. This study builds on that of Schiffman et al. [2006] by considering more in-depth rock mass classification and strength testing methods applied to the HSDP core. Geotechnical core logging techniques combined with laboratory strength testing methods show that rock strength differences exist within the edifice. Comparing the rock strength parameters obtained from the various volcano lithologies identified weak zones, suggesting the possible locations of future slip surfaces for large flank failures. Relatively weak rock layers were recognized within poorly consolidated hyaloclastite zones, with increases in strength based on degree of alteration. Subaerial and submarine basalt flows are found to be significantly stronger. With the aid of digital elevation models, cross-sections have been developed of key flank areas on the island of Hawaii. Limit equilibrium slope stability analyses are performed on each cross-section using various failure criteria for the rock mass strength calculations. Based on the stability analyses, the majority of the slopes analyzed are considered stable.
In cases where instability (i.e. failure) is predicted, the decreased rock mass quality (strength) of the highly altered and poorly consolidated lithologies is found to have a significant influence. These lithologies are present throughout the Hawaiian Islands, representing potential failure surfaces for large flank collapses. Failure-criterion input parameters are considered in sensitivity analyses, as are the influences of certain external stability factors such as sea level variation and seismic loading.
Calibration Plans for the Global Precipitation Measurement (GPM)
NASA Technical Reports Server (NTRS)
Bidwell, S. W.; Flaming, G. M.; Adams, W. J.; Everett, D. F.; Mendelsohn, C. R.; Smith, E. A.; Turk, J.
2002-01-01
The Global Precipitation Measurement (GPM) is an international effort led by the National Aeronautics and Space Administration (NASA) of the U.S.A. and the National Space Development Agency of Japan (NASDA) for the purpose of improving research into the global water and energy cycle. GPM will improve climate, weather, and hydrological forecasts through more frequent and more accurate measurement of precipitation world-wide. Comprising U.S. domestic and international partners, GPM will incorporate and assimilate data streams from many spacecraft with varied orbital characteristics and instrument capabilities. Two of the satellites will be provided directly by GPM: the core satellite and a constellation member. The core satellite, at the heart of GPM, is scheduled for launch in November 2007. The core will carry a conical scanning microwave radiometer, the GPM Microwave Imager (GMI), and a two-frequency cross-track-scanning radar, the Dual-frequency Precipitation Radar (DPR). The passive microwave channels and the two radar frequencies of the core are carefully chosen for investigating the varying character of precipitation over ocean and land, and from the tropics to the high latitudes. The DPR will enable microphysical characterization and three-dimensional profiling of precipitation. The GPM-provided constellation spacecraft will carry a GMI radiometer identical to that on the core spacecraft. This paper presents calibration plans for the GPM, including on-board instrument calibration, external calibration methods, and the role of ground validation. Particular emphasis is on plans for inter-satellite calibration of the GPM constellation. With its unique instrument capabilities, the core spacecraft will serve as a calibration transfer standard for the GPM constellation. In particular, the Dual-frequency Precipitation Radar aboard the core will check the accuracy of retrievals from the GMI radiometer and will enable improvement of the radiometer retrievals.
Observational intersections of the core with the constellation spacecraft are essential in applying this technique to the member satellites. Information from core spacecraft retrievals during intersection events will be transferred to the constellation radiometer instruments in the form of improved calibration and, with experience, improved radiometric algorithms. In preparation for the transfer standard technique, comparisons using the Tropical Rainfall Measuring Mission (TRMM) with sun-synchronous radiometers have been conducted. Ongoing research involves study of critical variables in the inter-comparison, such as correlation with spatial-temporal separation of intersection events, frequency of intersection events, variable azimuth look angles, and variable resolution cells for the various sensors.
Lin, Hui-Chen; Lin, Chi-Yi; Chien, Tsui-Wei; Liu, Kuei-Fen; Chen, Miao-Yen; Lin, Wen-Chuan
2013-02-01
A constellation of factors accounts for teaching efficacy in the fundamental nursing practicum. Teachers play a critical role in terms of designing and executing an appropriate teaching plan, choosing effective methods, and holding appropriate teaching attitudes. It is thus extremely important that clinical teachers master the core characteristics of basic nursing practice. This study aimed to illuminate the core characteristics of basic nursing practice for students, for reference by clinical practicum teachers. Qualitative research was used to identify the fundamentals of nursing practice as taught by clinical teachers. Five focus group meetings were convened during the practice period. The researchers presided over group discussions held during the normal weekly teaching schedule and lasting approximately 2-4 hours each. Content analysis was adopted to analyze the data. Three major themes were proposed: (1) student status: "novices were stymied by problems and thus improved slowly"; (2) teacher awareness: "teachers need to be aware of student capabilities, mood, and discomfort"; and (3) teaching style: "a good choice of methods should support and encourage students." To cultivate professional nursing knowledge and self-confidence for future professional commitment, clinical teachers must first understand the characteristics and learning motivations of their students and then select the methods, skills, and attitudes appropriate for providing step-by-step guidance. Communication with staff and the preparation of the atmosphere prior to nursing practice are also essential for students. Results provide insights into the technical college environment with regard to basic-level clinical nursing practice.
A common and optimized age scale for Antarctic ice cores
NASA Astrophysics Data System (ADS)
Parrenin, F.; Veres, D.; Landais, A.; Bazin, L.; Lemieux-Dudon, B.; Toye Mahamadou Kele, H.; Wolff, E.; Martinerie, P.
2012-04-01
Dating ice cores is a complex problem because 1) there is an age shift between the gas bubbles and the surrounding ice, 2) there are many different ice cores, which can be synchronized using various proxies, and 3) there are many methods to date the ice and the gas bubbles, each with advantages and drawbacks. These methods fall into the following categories: 1) ice flow modelling (for the ice) and firn densification modelling (for the gas bubbles); 2) comparison of ice core proxies with insolation variations (so-called orbital tuning methods); 3) comparison of ice core proxies with other well-dated archives; 4) identification of well-dated horizons, such as tephra layers or geomagnetic anomalies. Recently, a new dating tool, DATICE (Lemieux-Dudon et al., 2010), has been developed to take all the different dating information into account and produce a common and optimal chronology for ice cores with estimated confidence intervals. In this talk we will review the different dating information for Antarctic ice cores and show how the DATICE tool can be applied.
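The idea of combining heterogeneous dating information into one optimal chronology can be illustrated in miniature: given several independent age estimates for a single horizon, each with its own uncertainty, the inverse-variance-weighted mean is the optimal combination. (This is a toy sketch of the statistical principle only; DATICE itself solves a much larger inverse problem over whole chronologies. All numbers are invented.)

```python
import math

# (age in kyr BP, 1-sigma uncertainty in kyr) for one horizon from
# three hypothetical methods: flow modelling, orbital tuning, a tephra tie.
estimates = [(41.0, 2.0), (39.5, 1.5), (40.2, 0.8)]

# Inverse-variance weighting: more certain estimates count more.
weights = [1.0 / s**2 for _, s in estimates]
combined_age = sum(w * a for (a, _), w in zip(estimates, weights)) / sum(weights)
# The combined uncertainty is smaller than any single method's.
combined_sigma = math.sqrt(1.0 / sum(weights))
```

Extending this to a full core means treating every depth's age as an unknown and every constraint (layer counts, tie points, model priors) as a weighted observation, which is the least-squares problem such tools solve.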
[Three-dimensional computer aided design for individualized post-and-core restoration].
Gu, Xiao-yu; Wang, Ya-ping; Wang, Yong; Lü, Pei-jun
2009-10-01
To develop a method of three-dimensional computer aided design (CAD) of post-and-core restorations. Two plaster casts with extracted natural teeth were used in this study. The extracted teeth were prepared and scanned using a tomography method to obtain three-dimensional digitalized models. According to the basic rules of post-and-core design, the posts, cores and cavity surfaces of the teeth were designed using the tools for processing point clouds, curves and surfaces in the forward engineering software of the Tanglong prosthodontic system. The three-dimensional figures of the final restorations were then corrected according to the configurations of anterior teeth, premolars and molars, respectively. Computer aided design of 14 post-and-core restorations was completed, and good fit between the restorations and the three-dimensional digital models was obtained. Appropriate retention forms and enough space for the full crown restorations can be obtained through this method. The CAD of three-dimensional figures of post-and-core restorations can fulfill clinical requirements; therefore, it can be used in the computer-aided manufacture (CAM) of post-and-core restorations.
NASA Astrophysics Data System (ADS)
Tan, Jonathan
We describe a research plan to develop and extend the mid-infrared (MIR) extinction mapping technique presented by Butler & Tan (2009), who studied Infrared Dark Clouds (IRDCs) using Spitzer Space Telescope Infrared Array Camera (IRAC) 8 micron images. This method has the ability to probe the detailed spatial structure of very high column density regions, i.e. the gas clouds thought to represent the initial conditions for massive star and star cluster formation. We will analyze the data Spitzer obtained at other wavelengths, i.e. the IRAC bands at 3.6, 4.5 and 5.8 microns, and the Multiband Imaging Photometer (MIPS) bands, especially at 24 microns. This will allow us to measure the dust extinction law across the MIR and search for evidence of dust grain evolution, e.g. grain growth and ice mantle formation, as a function of gas density and column density. We will also study the detailed structure of the extinction features, including individual cores that may form single stars or close binaries, especially focusing on those cores that may form massive stars. By studying independent dark cores in a given IRDC, we will be able to test if they have a common minimum observed intensity, which we will then attribute to the foreground. This is a new method that should allow us to more accurately map distant, high column density IRDCs, probing more extreme regimes of star formation. We will combine MIR extinction mapping, which works best at high column densities, with near-IR mapping based on 2MASS images of star fields, which is most useful at lower columns that probe the extended giant molecular cloud structure. This information is crucial to help understand the formation process of IRDCs, which may be the rate limiting step for global galactic star formation rates.
We will use our new extinction mapping methods to analyze large samples of IRDCs and thus search the Galaxy for the most extreme examples of high column density cores and assess the global star formation efficiency in dense gas. We will estimate the ability of future NASA missions, such as JWST, to carry out MIR extinction mapping science. We will develop the results of this research into an E/PO presentation to be included in the various public outreach events organized and courses taught by the PI.
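The essence of the extinction-mapping technique can be sketched in a few lines: subtract an estimated foreground intensity (e.g. the minimum intensity over the darkest core), invert the extinction relation for optical depth, and scale by a dust opacity to get a mass surface density map. All numbers below are illustrative, not values from Butler & Tan (2009):

```python
import numpy as np

kappa_8um = 7.5      # assumed dust opacity at 8 microns, cm^2 per g of gas
I_bg = 100.0         # assumed smooth background intensity (arbitrary units)

# Toy 8 micron image of a dark cloud; the faintest pixel is the densest core.
image = np.array([[95.0, 60.0, 30.0],
                  [55.0, 12.0, 25.0],
                  [90.0, 40.0, 70.0]])

I_fg = 10.0          # foreground estimate, set below the darkest core pixel
# Invert I_obs = I_bg * exp(-tau) + I_fg for the optical depth tau,
# then convert to mass surface density Sigma = tau / kappa.
tau = -np.log((image - I_fg) / (I_bg - I_fg))
sigma = tau / kappa_8um   # mass surface density map, g cm^-2
```

Because the foreground term saturates the observed intensity from below, the accuracy of the high-column-density map hinges on the foreground estimate, which is why testing for a common minimum intensity across independent cores matters.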